Science.gov

Sample records for acquisition sample processing

  1. Coring Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock core samples. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (Integrated Mars Sample Acquisition and Handling) SHEC (Sample Handling, Encapsulation, and Containerization) without ever touching the pristine core sample in the transfer process.

  2. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory is a highly mechanized, rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the rover. SA/SPaH utilizes 17 actuated degrees of freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so that some functionality is retained even with the loss of a degree of freedom. Intentional dynamic environments are created to move sample material, while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally created dynamic environment, and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low-pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.

  3. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario where multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and it makes the analysis tractable by breaking the process down into small analyzable steps.
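    The Markov-chain bookkeeping described above can be sketched in a few lines of Python. The component set, transport probabilities, and starting VEM counts below are invented for illustration; only the propagation rule (expected counts multiplied by a transport matrix at each discrete step) reflects the abstract.

```python
# Hypothetical sketch of the Markovian decomposition described above.
# Component names and transport probabilities are illustrative
# assumptions, not values from the cited analysis.

def step(expected, transport):
    """Propagate expected VEM counts one sampling step:
    new[j] = sum_i expected[i] * transport[i][j]."""
    n = len(expected)
    return [sum(expected[i] * transport[i][j] for i in range(n))
            for j in range(n)]

# Components: 0 = coring tool, 1 = rover arm, 2 = sample (the "sample chain").
# transport[i][j] = probability a VEM on component i ends up on j this step.
transport = [
    [0.97, 0.02, 0.01],   # tool mostly retains its VEMs
    [0.01, 0.98, 0.01],   # arm
    [0.00, 0.00, 1.00],   # sample is absorbing: a VEM that arrives stays
]

expected = [100.0, 10.0, 0.0]   # expected VEMs on each component at start
for _ in range(5):              # five discrete sampling steps
    expected = step(expected, transport)

# expected[2] is the expected number of VEMs that entered the sample chain
```

    Because each transport row sums to one, the total expected count is conserved across steps, which is a useful sanity check on any real transport matrix.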

  4. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality of the System after being on the Surface for Two Years.

    NASA Astrophysics Data System (ADS)

    Beegle, L. W.; Anderson, R. C.; Abbey, W. J.

    2014-12-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system. SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. The Curiosity rover performed several acquisitions and processings of solid samples during its first year of operation. Materials were processed and delivered to the two analytical instruments, Chemistry and Mineralogy (CheMin) and Sample Analysis at Mars (SAM), both of which require material of a specific particle size in order to determine its mineralogy and geochemistry. In this presentation, the functionality of the system will be explained along with the in-situ targets the system has acquired and the samples that were delivered.

  5. MSL's Widgets: Adding Robustness to Martian Sample Acquisition, Handling, and Processing

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Kennedy, Brett; Lin, Justin; DeGrosse, Patrick; Cady, Ian; Onufer, Nicholas; Sigel, Deborah; Jandura, Louise; Anderson, Robert; Katz, Ira; Slimko, Eric; Limonadi, Daniel

    2013-01-01

    Mars Science Laboratory's (MSL) Sample Acquisition Sample Processing and Handling (SA-SPaH) system is one of the most ambitious terrain-interaction and manipulation systems ever built and successfully used off planet Earth. Mars has a ruthless environment that has surprised many who have tried to explore there. The robustness widget program was implemented by the MSL project to help ensure the SA-SPaH system would be robust to the surprises of this ruthless Martian environment. The program was carried out under extreme schedule pressure, but was accomplished with resounding success. This paper offers a behind-the-scenes look at MSL's robustness widgets: the particle fun zone, the wind guards, and the portioner pokers.

  6. Rockballer Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Giersch, Louis R.; Cook, Brant T.

    2013-01-01

    It would be desirable to acquire rock and/or ice samples that extend below the surface of the parent rock or ice in extraterrestrial environments such as the Moon, Mars, comets, and asteroids. Such samples would allow measurements to be made further back into the geologic history of the rock, providing critical insight into the history of the local environment and the solar system. Such samples could also be necessary for sample return mission architectures that would acquire samples from extraterrestrial environments for return to Earth for more detailed scientific investigation.

  7. Resource Prospector Instrumentation for Lunar Volatiles Prospecting, Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Captain, J.; Elphic, R.; Colaprete, A.; Zacny, Kris; Paz, A.

    2016-01-01

    The NS will map the water-equivalent hydrogen concentration as low as 0.5% by weight to an 80-centimeter depth as the rover traverses the lunar landscape. The NIR spectrometer will measure surficial H2O/OH as well as general mineralogy. When the prospecting instruments identify a potential volatile-rich area during the course of a traverse, the prospect is mapped out and the most promising location identified. An augering drill capable of sampling to a depth of 100 centimeters will excavate regolith for analysis. A quick assay of the drill cuttings will be made using an operations camera and the NIR spectrometer. With the water depth confirmed by this first augering activity, a regolith sample may be extracted for processing. The drill will deliver the regolith sample to a crucible that will be sealed and heated. Evolved volatiles will be measured by a gas chromatograph-mass spectrometer, and the water will be captured and photographed. RP is a solar-powered mission, which, given the polar location, translates to a relatively short mission duration on the order of 4-15 days. This short mission duration drives the concept of operations, instrumentation, and data analysis towards critical real-time analysis and decision support. Previous payload field tests have increased the fidelity of the hardware, software, and mission operations. Current activities include a mission-level field test to optimize interfaces between the payload and rover, as well as to better understand the interaction of the science and rover teams during the mission timeline. This paper will cover the current status of the science instruments on the payload as well as the integrated field test occurring in fall of 2015. The concept of operations will be discussed, including the real-time science and engineering decision-making process based on critical data from the instrumentation. The path to flight will also be discussed, along with the approach to this ambitious low-cost mission.

  8. Sample Acquisition Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Carle, Glenn C.; Stratton, David M.; Valentin, Jose R.; DeVincenzi, Donald (Technical Monitor)

    1999-01-01

    Exobiology flight experiments involve complex analyses conducted in environments far different from those encountered in terrestrial applications. A major part of the analytical challenge is often the selection, acquisition, delivery, and, in some cases, processing of a sample suitable for the analytical requirements of the mission. The added complications of severely limited resources and sometimes rigid time constraints combine to make sample acquisition potentially a major obstacle to successful analyses. Potential samples span a wide range, including planetary atmospheric gas and aerosols (over a wide variety of pressures), planetary soil or rocks, dust and ice particles streaming off a comet, and cometary surface ice and rocks. Methods to collect and process samples are often mission specific, requiring continual development of innovative concepts and mechanisms. These methods must also maintain the integrity of the sample for the experimental results to be meaningful. We present here sample acquisition systems employed on past missions and proposed for future missions.

  9. Rotary Percussive Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Klein, K.; Badescu, M.; Haddad, N.; Shiraishi, L.; Walkemeyer, P.

    2012-01-01

    As part of a potential Mars Sample Return campaign, NASA is studying a sample caching mission to Mars, with a possible 2018 launch opportunity. As such, a Sample Acquisition Tool (SAT) has been developed in support of the Integrated Mars Sample Acquisition and Handling (IMSAH) architecture as it relates to the proposed Mars Sample Return (MSR) campaign. The tool allows for core generation and capture directly into a sample tube. In doing so, the sample tube becomes the fundamental handling element within the IMSAH sample chain, reducing the risk associated with sample contamination as well as the need to handle a sample of unknown geometry. The tool's functionality was verified utilizing a proposed rock test suite that encompasses a series of rock types that have been utilized in the past to qualify Martian surface sampling hardware. The corresponding results have shown the tool can effectively generate, fracture, and capture rock cores while maintaining torque margins of no less than 50%, with an average power consumption of no greater than 90 W and a tool mass of less than 6 kg.

  10. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight to overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.

  11. Sample acquisition and instrument deployment

    NASA Technical Reports Server (NTRS)

    Boyd, Robert C.

    1995-01-01

    Progress is reported in developing the Sample Acquisition and Instrument Deployment (SAID) system, a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four-degree-of-freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. The system has been fabricated and tested in environmental chambers, and soil testing and robotic control testing have also been performed.

  12. The alteration of icy samples during sample acquisition

    NASA Astrophysics Data System (ADS)

    Mungas, G.; Bearman, G.; Beegle, L. W.; Hecht, M.; Peters, G. H.; Glucoft, J.; Strothers, K.

    2006-12-01

    Valid in situ scientific studies require both that samples be analyzed in as pristine a condition as possible and that any modification from the pristine to the sampled state be well understood. While samples with low to high ice concentration are critical for the study of astrobiology and geology, they pose problems for the sample acquisition, preparation, and distribution systems (SPAD) upon which the analytical instruments depend. Most significant of the processes that occur during SPAD is sublimation or melting caused by thermal loading from drilling, coring, etc., as well as by exposure to a dry, low-pressure ambient environment. These processes can alter the sample and generate meta-stable liquid water that can refreeze in the sample transfer mechanisms, interfering with proper operation and creating cross-contamination. We have investigated and quantified the loss of volatiles such as H2O, CO, CO2, and organics contained within icy and powdered samples when acquired, processed, and transferred. During development of the MSL rock crusher, for example, ice was observed to pressure-fuse and stick to the side even at -70°C. We have investigated sublimation during sample acquisition at Martian temperature and pressure for samples with water/dirt ratios ranging from 10 to 100. Using the RASP that will be on Phoenix, we have measured sublimation of ice during excavation at Martian pressure and find that the sublimation losses can range from 10 to 50 percent water. It is the thermal conductivity of the soil that determines local heat transport: how much of the sample acquisition energy is wicked away into the soil and how much goes into the sample. Modeling of sample acquisition methods requires measurement of these parameters. A two-phase model exists for thermal conductivity as a function of dirt/ice ratio, but it needed to be validated. We used an ASTM method for measuring thermal conductivity and implemented it in the laboratory.
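    As an illustration of the kind of two-phase dirt/ice conductivity model the abstract refers to, the sketch below evaluates the classical parallel (Voigt) and series (Reuss) mixing bounds. The conductivity values are rough textbook numbers and the mixing rules are generic assumptions, not the validated model from this work.

```python
# Generic two-phase mixing bounds for effective thermal conductivity of a
# dirt/ice mixture. The values and rules are textbook assumptions offered
# for illustration only.

def k_parallel(k1, k2, phi2):
    """Upper (parallel/Voigt) bound; phi2 = volume fraction of phase 2."""
    return (1 - phi2) * k1 + phi2 * k2

def k_series(k1, k2, phi2):
    """Lower (series/Reuss) bound."""
    return 1.0 / ((1 - phi2) / k1 + phi2 / k2)

K_ICE = 2.2    # W/(m K), pure ice near 250 K (approximate)
K_SOIL = 0.25  # W/(m K), dry granular regolith (approximate)

for ice_fraction in (0.1, 0.5, 0.9):
    hi = k_parallel(K_SOIL, K_ICE, ice_fraction)
    lo = k_series(K_SOIL, K_ICE, ice_fraction)
    print(f"ice fraction {ice_fraction:.1f}: "
          f"k between {lo:.2f} and {hi:.2f} W/(m K)")
```

    Any measured effective conductivity should fall between these two bounds, which is one way laboratory data like the ASTM measurements mentioned above can be checked for plausibility.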

  13. SNAP: Simulating New Acquisition Processes

    NASA Technical Reports Server (NTRS)

    Alfeld, Louis E.

    1997-01-01

    Simulation models of acquisition processes range in scope from isolated applications to the 'Big Picture' captured by SNAP technology. SNAP integrates a family of models to portray the full scope of acquisition planning and management activities, including budgeting, scheduling, testing, and risk analysis. SNAP replicates the dynamic management processes that underlie design, production, and life-cycle support. SNAP provides the unique 'Big Picture' capability needed to simulate the entire acquisition process and explore the 'what-if' tradeoffs and consequences of alternative policies and decisions. Comparison of cost, schedule, and performance tradeoffs helps managers choose the lowest-risk, highest-payoff option at each step in the acquisition process.

  14. Data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Tsuda, Toshitaka

    1989-10-01

    Fundamental methods of signal processing used in normal mesosphere stratosphere troposphere (MST) radar observations are described. Complex time series of received signals obtained in each range gate are converted into Doppler spectra, from which the mean Doppler shift, spectral width and signal-to-noise ratio (SNR) are estimated. These spectral parameters are further utilized to study characteristics of scatterers and atmospheric motions.
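    The spectral parameters mentioned above (mean Doppler shift, spectral width, and SNR) are conventionally estimated from moments of the Doppler power spectrum. The sketch below applies such moment estimates to a synthetic spectrum; the signal shape and noise floor are invented, and the crude minimum-based noise estimate is an assumption, not the estimator used in any particular radar.

```python
import math

# Sketch of moment-based estimates for a Doppler power spectrum: noise
# floor, SNR, mean Doppler shift, and spectral width. The spectrum below
# is synthetic; bin values are illustrative assumptions.

def spectral_moments(power, freqs):
    """Estimate (snr, mean_shift, width) from a Doppler power spectrum."""
    noise = min(power)                     # crude noise-floor estimate
    sig = [max(p - noise, 0.0) for p in power]
    total = sum(sig)                       # zeroth moment: signal power
    mean = sum(f * s for f, s in zip(freqs, sig)) / total  # first moment
    var = sum((f - mean) ** 2 * s for f, s in zip(freqs, sig)) / total
    snr = total / (noise * len(power))     # signal relative to noise power
    return snr, mean, math.sqrt(var)       # width from second central moment

freqs = [i * 0.1 - 1.0 for i in range(21)]          # Doppler bins, Hz
power = [0.05 + math.exp(-((f - 0.3) / 0.15) ** 2)  # peak at +0.3 Hz
         for f in freqs]

snr, shift, width = spectral_moments(power, freqs)
```

    For this synthetic spectrum the estimated shift recovers the 0.3 Hz peak location, and the width recovers the Gaussian's second moment; in practice these moments feed directly into estimates of radial wind velocity and turbulence.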

  15. Improving the Acquisition and Management of Sample Curation Data

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.

  16. Mars sample return: Site selection and sample acquisition study

    NASA Technical Reports Server (NTRS)

    Nickle, N. (Editor)

    1980-01-01

    Various vehicle and mission options were investigated for the continued exploration of Mars; the cost of a minimum sample return mission was estimated; options and concepts were synthesized into program possibilities; and recommendations for the next Mars mission were made to the Planetary Program office. Specific sites and all relevant spacecraft and ground-based data were studied in order to determine: (1) the adequacy of presently available data for identifying landing sites for a sample return mission that would assure the acquisition of material from the most important geologic provinces of Mars; (2) the degree of surface mobility required to assure sample acquisition at these sites; (3) techniques to be used in the selection and drilling of rock samples; and (4) the degree of mobility required at the two Viking sites to acquire these samples.

  17. Planetary protection issues for Mars sample acquisition flight projects.

    PubMed

    Barengoltz, J B

    2000-01-01

    The planned NASA sample acquisition flight missions to Mars pose several interesting planetary protection issues. In addition to the usual forward-contamination procedures for the adequate protection of Mars for the sake of future missions, there are reasons to ensure that the sample is not contaminated by terrestrial microbes from the acquisition mission. Recent recommendations by the Space Studies Board (SSB) of the National Research Council (United States) indicate that the scientific integrity of the sample is a planetary protection concern (SSB, 1997). Also, as a practical matter, a contaminated sample would interfere with the process for its release from quarantine after return for distribution to interested scientists. These matters are discussed in terms of the first planned acquisition mission. PMID:12038483

  18. Sample Acquisition and Instrument Deployment (SAID)

    NASA Technical Reports Server (NTRS)

    Boyd, Robert C.

    1994-01-01

    This report details the interim progress for contract NASW-4818, Sample Acquisition and Instrument Deployment (SAID), a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four degree of freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. A passively braked shape memory actuator with the ability to measure load has been developed. The wrist also contains a mechanism which locks the lid output to the bucket so that objects can be grasped and released for instrument deployment. The wrist actuator has been tested for operational power and mechanical functionality at Mars environmental conditions. The torque which the actuator can produce has been measured. Also, testing in Mars analogous soils has been performed.

  19. Autonomous Surface Sample Acquisition for Planetary and Lunar Exploration

    NASA Astrophysics Data System (ADS)

    Barnes, D. P.

    2007-08-01

    Surface science sample acquisition is a critical activity within any planetary and lunar exploration mission, and our research is focused upon the design, implementation, experimentation, and demonstration of an onboard autonomous surface sample acquisition capability for a rover equipped with a robotic arm upon which are mounted appropriate science instruments. Images captured by a rover stereo camera system can be processed using shape-from-stereo methods and a digital elevation model (DEM) generated. We have developed a terrain feature identification algorithm that can determine autonomously from DEM data suitable regions for instrument placement and/or surface sample acquisition. Once identified, surface normal data can be generated autonomously, and these are then used to calculate an arm trajectory for instrument placement and sample acquisition. Once an instrument placement and sample acquisition trajectory has been calculated, a collision detection algorithm is required to ensure the safe operation of the arm during sample acquisition. We have developed a novel adaptive 'bounding spheres' approach to this problem. Once potential science targets have been identified, and these are within the reach of the arm and will not cause any undesired collision, the 'cost' of executing the sample acquisition activity is required. Such information, which includes power expenditure and duration, can be used to select the 'best' target from a set of potential targets. We have developed a science sample acquisition resource-requirements calculation that utilises differential inverse kinematics methods to yield a high-fidelity result, thus improving upon simple 1st-order approximations. To test our algorithms, a new Planetary Analogue Terrain (PAT) Laboratory has been created that has a terrain region composed of Mars Soil Simulant-D from DLR Germany, and rocks that have been fully characterised in the laboratory. These have been donated by the UK Planetary Analogue Field Study.
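    A minimal sketch of a bounding-spheres collision test, assuming the arm links and terrain obstacles have already been approximated by spheres. The geometry below is invented, and the adaptive refinement described in the abstract is not modeled.

```python
import math

# Minimal sketch of the 'bounding spheres' idea: approximate each arm
# link and each terrain obstacle by spheres, then test pairwise
# distances. Sphere centres and radii are invented for illustration.

def collides(spheres_a, spheres_b):
    """True if any sphere in set A overlaps any sphere in set B."""
    for (ca, ra) in spheres_a:
        for (cb, rb) in spheres_b:
            if math.dist(ca, cb) < ra + rb:
                return True
    return False

arm = [((0.0, 0.0, 0.5), 0.10),    # spheres along the arm links
       ((0.0, 0.0, 0.3), 0.10),
       ((0.0, 0.0, 0.1), 0.08)]
rock = [((0.0, 0.05, 0.0), 0.06)]  # obstacle near the sampling target

hit = collides(arm, rock)  # only the lowest arm sphere is close enough
```

    The appeal of the approach is that each pairwise test is a single distance comparison, so the whole arm trajectory can be swept cheaply; conservatively large spheres guarantee safety at the cost of occasionally rejecting feasible trajectories.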

  20. Harpoon-based Sample Acquisition System

    NASA Astrophysics Data System (ADS)

    Bernal, Javier; Nuth, Joseph; Wegel, Donald

    2012-02-01

    Acquiring information about the composition of comets, asteroids, and other near-Earth objects is very important because they may contain the primordial ooze of the solar system and the origins of life on Earth. Sending a spacecraft is the obvious answer, but once it gets there it needs to collect and analyze samples. Conceptually, a drill or a shovel would work, but both require something extra to anchor the spacecraft to the comet, adding to its cost and complexity. Since comets and asteroids are very low-gravity objects, drilling becomes a problem: without a grappling mechanism, the drill would push the spacecraft off the surface. Harpoons have been proposed as grappling mechanisms in the past and are currently flying on missions such as ROSETTA. We propose to use a hollow, core-sampling harpoon to act as the anchoring mechanism as well as the sample-collecting device. By combining these two functions, mass is reduced, more samples can be collected, and the spacecraft can carry more propellant. Although challenging, returning the collected samples to Earth allows them to be analyzed in laboratories with much greater detail than is possible on a spacecraft. Also, bringing the samples back to Earth allows future generations to study them.

  1. Personal computer process data acquisition

    SciTech Connect

    Dworjanyn, L.O.

    1989-01-01

    A simple BASIC program was written to permit personal-computer data collection of process temperatures, pressures, flows, and inline analyzer outputs for a batch-type unit operation. The field voltage outputs were read on an IEEE-programmable digital multimeter using a programmable scanner to select different output lines. The data were stored in ASCII format to allow direct analysis by spreadsheet programs. 1 fig., 1 tab.
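    The scan-and-log pattern described above can be sketched in modern form. This is a hypothetical Python stand-in, not the original BASIC program: `read_channel` substitutes for the scanner/multimeter query, and the channel names, readings, and file name are invented.

```python
import csv
import time

# Hedged sketch of a polling data logger: read each channel in turn and
# append the readings as ASCII rows a spreadsheet can import directly.

def read_channel(name):
    """Hypothetical stand-in for one scanner + multimeter reading."""
    return {"temp_C": 25.0, "press_kPa": 101.3, "flow_lpm": 2.0}[name]

def log_scan(path, channels, scans=3, interval_s=0.0):
    """Write a header row, then one row of readings per scan."""
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["scan"] + channels)
        for i in range(scans):
            w.writerow([i] + [read_channel(c) for c in channels])
            time.sleep(interval_s)

log_scan("batch_run.csv", ["temp_C", "press_kPa", "flow_lpm"])
```

    Storing plain comma-separated ASCII preserves the original design goal: the file opens directly in spreadsheet software with no conversion step.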

  2. Comet sample acquisition for ROSETTA lander mission

    NASA Astrophysics Data System (ADS)

    Marchesi, M.; Campaci, R.; Magnani, P.; Mugnuolo, R.; Nista, A.; Olivier, A.; Re, E.

    2001-09-01

    ROSETTA/Lander is being developed through a combined effort of European countries, coordinated by German institutes. The commitment to such a challenging probe will provide a unique opportunity for in-situ analysis of a comet nucleus. The payload for coring, sampling, and investigation of comet materials is called SD2 (Sampling, Drilling and Distribution). The paper presents the drill/sampler tool and the sample transfer through their modeling, design, and testing phases. Expected drilling parameters are then compared with experimental data; limited torque consumption and axial thrust on the tool constrain the operation and determine the success of tests. The qualification campaign involved the structural part and related vibration test, the auger/bit parts and drilling test, and the coring mechanism with related sampling test. Mechanical check of specimen volume is also reported, with emphasis on the measurement procedure and on the mechanical unit. The drill tool and all parts of the transfer chain were tested in the hypothetical comet environment, characterized by frozen material at extremely low temperature and high vacuum (-160°C, 10⁻³ Pa).

  3. Auditory Processing Disorder and Foreign Language Acquisition

    ERIC Educational Resources Information Center

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  4. Generalized analog thresholding for spike acquisition at ultralow sampling rates.

    PubMed

    He, Bryan D; Wein, Alex; Varshney, Lav R; Kusuma, Julius; Richardson, Andrew G; Srinivasan, Lakshminarayan

    2015-07-01

    Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEA) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712
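    The analog-thresholding (AT) baseline that the paper generalizes can be sketched as follows: a comparator binarizes the waveform, and the binary stream is then sampled far below the Nyquist rate of the raw signal. The waveform, threshold, and rates below are synthetic assumptions, and gAT's sub-interval spike timing and width reconstruction are not modeled.

```python
# Sketch of the analog-thresholding (AT) baseline: an analog comparator
# converts the waveform to a binary signal, which is sampled at a low
# rate; a '1' marks a low-rate interval containing at least one spike.
# The waveform and threshold here are synthetic assumptions.

def comparator(signal, threshold):
    """Analog comparator: 1 while the waveform exceeds threshold."""
    return [1 if x > threshold else 0 for x in signal]

def detect_spikes(binary, decimation):
    """Report which low-rate intervals contained a spike."""
    return [i for i in range(0, len(binary), decimation)
            if any(binary[i:i + decimation])]

# Synthetic 30 kHz 'raw' waveform: quiet baseline with two brief spikes.
raw = [0.0] * 3000
for t in (450, 2100):            # spike onsets (sample indices)
    for k in range(6):
        raw[t + k] = 1.0

binary = comparator(raw, 0.5)
hits = detect_spikes(binary, 300)   # decimate 30 kHz -> 100 Hz
```

    Here both spikes are localized to their 100 Hz intervals despite the 300-fold reduction in sampling rate; the paper's contribution is recovering finer timing and width, and more than one spike per interval, from similarly sparse measurements.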

  5. Generalized analog thresholding for spike acquisition at ultralow sampling rates

    PubMed Central

    He, Bryan D.; Wein, Alex; Varshney, Lav R.; Kusuma, Julius; Richardson, Andrew G.

    2015-01-01

    Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEA) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712

  6. Processability Theory and German Case Acquisition

    ERIC Educational Resources Information Center

    Baten, Kristof

    2011-01-01

    This article represents the first attempt to formulate a hypothetical sequence for German case acquisition by Dutch-speaking learners on the basis of Processability Theory (PT). It will be argued that case forms emerge corresponding to a development from lexical over phrasal to interphrasal morphemes. This development, however, is subject to a…

  7. Rotary Percussive Sample Acquisition Tool (SAT): Hardware Development and Testing

    NASA Technical Reports Server (NTRS)

    Klein, Kerry; Badescu, Mircea; Haddad, Nicolas; Shiraishi, Lori; Walkemeyer, Phillip

    2012-01-01

    In support of a potential Mars Sample Return (MSR) mission, an Integrated Mars Sample Acquisition and Handling (IMSAH) architecture has been proposed to provide a means for rover-based end-to-end sample capture and caching. A key enabling feature of the architecture is the use of a low-mass Sample Acquisition Tool (SAT) that is capable of drilling and capturing rock cores directly within a sample tube in order to maintain sample integrity and prevent contamination across the sample chain. As such, this paper will describe the development and testing of a low-mass rotary percussive SAT that has been shown to provide a means for core generation, fracture, and capture.

  8. Acquisition and Retaining Granular Samples via a Rotating Coring Bit

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Badescu, Mircea; Sherrit, Stewart

    2013-01-01

This device takes advantage of the centrifugal forces generated when a coring bit is rotated: a granular sample introduced into the spinning bit adheres to, and compacts against, the bit's internal wall. The bit can be specially designed to increase the effectiveness of regolith capture while turning and penetrating the subsurface. The bit teeth can be oriented so that they direct the regolith toward the bit axis during rotation. The bit can be designed with an internal flute that directs the regolith upward inside the bit, and the teeth and flute can be combined in the same bit. The bit can also be designed with an internal spiral into which the various particles wedge. In another implementation, the bit can be designed to collect regolith primarily from a specific depth: when turning one way, the teeth guide the regolith outward of the bit, and when turning in the opposite direction, they guide the regolith inward into the bit's internal section. This mechanism can be implemented with or without an internal flute. The device is based on the use of a spinning coring bit (hollow interior) as a means of retaining a granular sample; acquisition is done by inserting the bit into the subsurface of a regolith, soil, or powder. To demonstrate the concept, a commercial drill and a coring bit were used. The bit was turned and inserted into soil contained in a bucket. While spinning the bit (at speeds of 600 to 700 RPM), the drill was lifted and the soil was retained inside the bit. To prove this point, the drill was turned horizontally, and the acquired soil remained inside the bit.
The basic theory behind the process of retaining unconsolidated mass that can be acquired by the centrifugal forces of the bit is determined by noting that in order to stay inside the interior of the bit, the
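The retention condition the abstract describes reduces, for a grain on the bit wall, to requiring that friction from the centrifugal normal force at least balance the grain's weight: mu * omega^2 * r >= g (the grain mass cancels). A minimal numerical check, with a hypothetical friction coefficient and bit radius:

```python
import math

def retains_sample(rpm, bit_radius_m, friction_coeff, g=9.81):
    """True if wall friction from the centrifugal force can support a grain's weight.

    Condition: mu * m * omega^2 * r >= m * g  (the grain mass m cancels).
    """
    omega = 2 * math.pi * rpm / 60.0   # angular speed, rad/s
    return friction_coeff * omega**2 * bit_radius_m >= g

# Hypothetical values: 1 cm internal radius, mu = 0.5, plus the
# 600-700 RPM range quoted in the abstract and a slow-turn case.
for rpm in (100, 600, 700):
    print(rpm, retains_sample(rpm, 0.01, 0.5))
```

With these assumed numbers the 600-700 RPM range comfortably satisfies the condition, while 100 RPM does not, which is consistent with the high spin rates used in the demonstration.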

  9. Major system acquisitions process (A-109)

    NASA Technical Reports Server (NTRS)

    Saric, C.

    1991-01-01

The Major System examined is a combination of elements (hardware, software, facilities, and services) that function together to produce the capabilities required to fulfill a mission need. The system acquisition process is a sequence of activities beginning with documentation of the mission need and ending with introduction of the major system into operational use or otherwise successful achievement of program objectives. It is concluded that the A-109 process makes sense and provides a systematic, integrated management approach, along with appropriate management-level involvement and innovative 'best ideas' from the private sector, in satisfying mission needs.

  10. FPGA Based Data Acquisition and Processing for Gamma Ray Tomography

    NASA Astrophysics Data System (ADS)

    Schlaberg, H. Inaki; Li, Donghui; Wu, Yingxiang; Wang, Mi

    2007-06-01

Data acquisition and processing for gamma ray tomography has traditionally been performed with analogue electronic circuitry. Detectors convert the received photons into electrical signals, which are then shaped and conditioned for the next counting stage. This paper presents an approach using an FPGA (field programmable gate array) based data acquisition and processing system for gamma ray tomography. With recently introduced low-cost, high-speed analogue-to-digital converters and digital signal processors, the electrical output of the detectors can be converted into the digital domain with only simple analogue signal conditioning. This step can significantly reduce the component count and the size of the instrument, as much of the analogue processing circuitry is eliminated. To count the number of incident photons in the converted electrical signal, a peak detection algorithm could be developed for the DSP (digital signal processor). However, the relatively high sample rate, and the consequently low number of processor cycles available to process each sample, make it more effective to implement the peak detection algorithm on the FPGA. This paper presents the development of the acquisition system hardware and simulation results of the peak detection with previously recorded experimental data from a flow loop.
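A threshold-plus-local-maximum peak counter of the kind described, i.e. the counting step the paper moves onto the FPGA, can be sketched in a few lines. The threshold and pulse train here are hypothetical:

```python
import numpy as np

# Minimal photon-counting sketch: count local maxima that exceed a noise
# threshold in the digitized detector output. A hardware version would do the
# same comparisons sample-by-sample in FPGA logic.

def count_peaks(samples, threshold):
    s = np.asarray(samples, dtype=float)
    above = s > threshold
    # Interior points that are strict maxima w.r.t. the left neighbour and
    # at least as large as the right neighbour.
    local_max = np.r_[False, (s[1:-1] > s[:-2]) & (s[1:-1] >= s[2:]), False]
    return int(np.count_nonzero(above & local_max))

pulse = [0, 1, 5, 9, 4, 1, 0, 0, 2, 8, 12, 6, 2, 0]
print(count_peaks(pulse, 3.0))  # two photon pulses above threshold
```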

  11. 28. Perimeter acquisition radar building room #302, signal process and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. Perimeter acquisition radar building room #302, signal process and analog receiver room - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  12. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

There is a need to autonomously acquire cryogenic hydrocarbon liquid samples from remote planetary locations, such as the lakes of Titan, for instruments such as mass spectrometers. Several problems had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft: not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal, mass, and power-limited spacecraft interfaces; and reducing risk. Prior-art inlets for similar instruments in spaceflight were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collection of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and accommodate various liquid sample levels. It is capable of collecting repeated samples autonomously in the difficult low-temperature conditions often found in planetary missions, and of collecting samples from difficult sample types such as cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates, as found on Titan. The design, with a warm actuated valve, is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  13. Sample Acquisition and Handling System from a Remote Platform

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Sherrit, Stewart; Jones, Jack A.

    2011-01-01

    A system has been developed to acquire and handle samples from a suspended remote platform. The system includes a penetrator, a penetrator deployment mechanism, and a sample handler. A gravity-driven harpoon sampler was used for the system, but other solutions can be used to supply the penetration energy, such as pyrotechnic, pressurized gas, or springs. The deployment mechanism includes a line that is attached to the penetrator, a spool for reeling in the line, and a line engagement control mechanism. The penetrator has removable tips that can collect liquid, ice, or solid samples. The handling mechanism consists of a carousel that can store a series of identical or different tips, assist in penetrator reconfiguration for multiple sample acquisition, and deliver the sample to a series of instruments for analysis. The carousel sample handling system was combined with a brassboard reeling mechanism and a penetrator with removable tips. It can attach the removable tip to the penetrator, release and retrieve the penetrator, remove the tip, and present it to multiple instrument stations. The penetrator can be remotely deployed from an aerobot, penetrate and collect the sample, and be retrieved with the sample to the aerobot. The penetrator with removable tips includes sample interrogation windows and a sample retainment spring for unconsolidated samples. The line engagement motor can be used to control the penetrator release and reeling engagement, and to evenly distribute the line on the spool by rocking between left and right ends of the spool. When the arm with the guiding ring is aligned with the spool axis, the line is free to unwind from the spool without rotating the spool. When the arm is perpendicular to the spool axis, the line can move only if the spool rotates.

  14. IWTU Process Sample Analysis Report

    SciTech Connect

    Nick Soelberg

    2013-04-01

CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June-August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process; none of these samples were radioactive. The samples were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was in nonradioactive startup.

  15. Crosscutting Development- EVA Tools and Geology Sample Acquisition

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.

  16. Acquisition by Processing Theory: A Theory of Everything?

    ERIC Educational Resources Information Center

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  17. Networks for image acquisition, processing and display

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1990-01-01

    The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.

  18. Collecting cometary soil samples? Development of the ROSETTA sample acquisition system

    NASA Technical Reports Server (NTRS)

    Coste, P. A.; Fenzi, M.; Eiden, Michael

    1993-01-01

In the reference scenario of the ROSETTA Comet Nucleus Sample Return (CNSR) mission, the Sample Acquisition System is mounted on the Comet Lander. Its tasks are to acquire three kinds of cometary samples and to transfer them to the Earth Return Capsule. Operations are to be performed in vacuum and microgravity, on a probably rough and dusty surface, in a largely unknown material, at temperatures on the order of 100 K. The concept and operation of the Sample Acquisition System are presented. The design of the prototype corer and surface sampling tool, and of the equipment for testing them at cryogenic temperatures in ambient conditions and in vacuum in various materials representing cometary soil, are described. Results of recent preliminary tests performed in low-temperature thermal vacuum on a cometary-analog ice-dust mixture are provided.

  19. Feasibility Study of Commercial Markets for New Sample Acquisition Devices

    NASA Technical Reports Server (NTRS)

    Brady, Collin; Coyne, Jim; Bilen, Sven G.; Kisenwether, Liz; Miller, Garry; Mueller, Robert P.; Zacny, Kris

    2010-01-01

The NASA Exploration Systems Mission Directorate (ESMD) and Penn State technology commercialization project was designed to assist in the maturation of a NASA SBIR Phase III technology. The project was funded by NASA's ESMD Education group with oversight from the Surface Systems Office in the Engineering Directorate at NASA Kennedy Space Center. Two Penn State engineering student interns managed the project with support from Honeybee Robotics and NASA Kennedy Space Center. The objective was to find an opportunity to integrate the SBIR-developed regolith extraction and sampling technology as the payload for future lunar lander or rover missions. The team was able to identify two Google Lunar X Prize organizations with considerable interest in utilizing regolith acquisition and transfer technology.

  20. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  1. Dynamic Acquisition and Retrieval Tool (DART) for Comet Sample Return. Session 2.06: Robotic Mobility and Sample Acquisition Systems

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Bonitz, Robert; Kulczycki, Erick; Aisen, Norman; Dandino, Charles M.; Cantrell, Brett S.; Gallagher, William; Shevin, Jesse; Ganino, Anthony; Haddad, Nicolas; Walkemeyer, Phillip; Backes, Paul; Shiraishi, Lori

    2013-01-01

The 2011 Decadal Survey for planetary science released by the National Research Council of the National Academies identified Comet Surface Sample Return (CSSR) as one of five high-priority potential New Frontiers-class missions in the next decade. The main objectives of the research described in this publication are to: develop a concept for an end-to-end system for collecting and storing a comet sample to be returned to Earth; design, fabricate, and test a prototype Dynamic Acquisition and Retrieval Tool (DART) capable of collecting a 500 cc sample in a canister and ejecting the canister at a predetermined speed; and identify a set of simulants with physical properties at room temperature that suitably match the physical properties of the comet surface as it would be sampled. We propose the use of a dart that would be launched from the spacecraft to impact and penetrate the comet surface. After collecting the sample, the sample canister would be ejected at a speed greater than the comet's escape velocity, captured by the spacecraft, packaged into a return capsule, and returned to Earth. The dart would be composed of an inner tube or sample canister, an outer tube, a decelerator, a means of capturing and retaining the sample, and a mechanism to eject the canister with the sample for later rendezvous with the spacecraft. One of the significant unknowns is the physical properties of the comet surface. Based on new findings from the recent Deep Impact comet encounter mission, we have limited our search to sampling materials with 10 to 100 kPa shear strength in loose or consolidated form. As the possible range of comet surface temperatures is significantly different from room temperature, and testing at conditions other than room temperature can become resource intensive, we sought sample simulants with room-temperature physical properties similar to the expected physical properties of the comet surface material. The chosen

  2. Second Language Vocabulary Acquisition: A Lexical Input Processing Approach

    ERIC Educational Resources Information Center

    Barcroft, Joe

    2004-01-01

    This article discusses the importance of vocabulary in second language acquisition (SLA), presents an overview of major strands of research on vocabulary acquisition, and discusses five principles for effective second language (L2) vocabulary instruction based on research findings on lexical input processing. These principles emphasize…

  3. System of acquisition and processing of images of dynamic speckle

    NASA Astrophysics Data System (ADS)

Vega, F.; Torres, C.

    2015-01-01

In this paper we show the design and implementation of a system for the capture and analysis of dynamic speckle. The device consists of a USB camera, an isolated illumination enclosure for imaging, a 633 nm, 10 mW laser pointer as the coherent light source, a diffuser, and a laptop for processing video. The equipment enables the acquisition and storage of video and also computes different statistical descriptors (global activity accumulation vector, activity accumulation matrix, cross-correlation vector, autocorrelation coefficient, Fujii matrix, etc.). The equipment is designed so that it can be taken directly to the site of the biological sample under study, and it is currently being used in research projects within the group.
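One of the descriptors listed, the Fujii matrix, sums normalized frame-to-frame intensity differences over the video. A minimal NumPy sketch, assuming a stack of grayscale frames (the synthetic frames here are illustrative only):

```python
import numpy as np

def fujii(frames, eps=1e-9):
    """Fujii descriptor for a stack of speckle frames of shape (N, H, W):
    sum over consecutive pairs of |I_k - I_{k+1}| / (I_k + I_{k+1})."""
    f = np.asarray(frames, dtype=float)
    num = np.abs(np.diff(f, axis=0))
    den = f[:-1] + f[1:] + eps     # eps guards against division by zero
    return (num / den).sum(axis=0)

# Synthetic example: one "active" pixel fluctuates, the rest are static.
rng = np.random.default_rng(0)
frames = np.ones((10, 4, 4))
frames[:, 2, 2] = 1 + 0.5 * rng.standard_normal(10)
activity = fujii(frames)
print(activity[2, 2] > activity[0, 0])  # the fluctuating pixel scores higher
```

Pixels whose intensity varies between frames accumulate a high Fujii value, which is why the map highlights biological activity in the speckle pattern.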

  4. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for charge-coupled device (CCD) camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity to implement IDL image processing and display functions in real time. The program allows control over the available CCD detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  5. Towards a Platform for Image Acquisition and Processing on RASTA

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele

    2013-08-01

    This paper presents the architecture of a platform for image acquisition and processing based on commercial hardware and space qualified hardware. The aim is to extend the Reference Architecture Test-bed for Avionics (RASTA) system in order to obtain a Test-bed that allows testing different hardware and software solutions in the field of image acquisition and processing. The platform will allow the integration of space qualified hardware and Commercial Off The Shelf (COTS) hardware in order to test different architectural configurations. The first implementation is being performed on a low cost commercial board and on the GR712RC board based on the Dual Core Leon3 fault tolerant processor. The platform will include an actuation module with the aim of implementing a complete pipeline from image acquisition to actuation, making possible the simulation of a real case scenario involving acquisition and actuation.

  6. ACQUISITION OF REPRESENTATIVE GROUND WATER QUALITY SAMPLES FOR METALS

    EPA Science Inventory

    R.S. Kerr Environmental Research Laboratory (RSKERL) personnel have evaluated sampling procedures for the collection of representative, accurate, and reproducible ground water quality samples for metals for the past four years. Intensive sampling research at three different field...

  7. Auditory Processing Disorders: Acquisition and Treatment

    ERIC Educational Resources Information Center

    Moore, David R.

    2007-01-01

    Auditory processing disorder (APD) describes a mixed and poorly understood listening problem characterised by poor speech perception, especially in challenging environments. APD may include an inherited component, and this may be major, but studies reviewed here of children with long-term otitis media with effusion (OME) provide strong evidence…

  8. 75 FR 62069 - Federal Acquisition Regulation; Sudan Waiver Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-07

    ... Federal Acquisition Regulation; Sudan Waiver Process AGENCIES: Department of Defense (DoD), General... criteria that an agency must address in a waiver request and a waiver consultation process regarding... Operations in Sudan and Imports from Burma, in the Federal Register at 74 FR 40463 on August 11,...

  9. Reading Acquisition Enhances an Early Visual Process of Contour Integration

    ERIC Educational Resources Information Center

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched…

  10. Low Cost Coherent Doppler Lidar Data Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce W.; Koch, Grady J.

    2003-01-01

    The work described in this paper details the development of a low-cost, short-development time data acquisition and processing system for a coherent Doppler lidar. This was done using common laboratory equipment and a small software investment. This system provides near real-time wind profile measurements. Coding flexibility created a very useful test bed for new techniques.

  11. Sample Acquisition and Caching Architectures for the Mars 2020 Mission

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Chu, P.; Paulsen, G.; Davis, K.

    2014-06-01

    The goal of the Mars 2020 mission is to acquire 37 samples for future return. We present technologies to capture cores and regolith, to abrade rocks, and to inspect cores prior to caching. We also present three caching architectures.

  12. Earth contamination free sample acquisition from an Earth Contaminated Spacecraft

    NASA Technical Reports Server (NTRS)

    Dolgin, B.; Bickler, D.; Carson, J.; Chung, S.; Quicksall, J.; Troy, R.; Yarbrough, C.

    2000-01-01

The paper describes the first step in the feasibility demonstration of a novel low-cost Mars Sample Return Transfer Sequence (STS) that does not require cleaning and sterilization of the entire spacecraft. The proposed STS relies on the ability to collect (and, in the future, deliver to Earth) Earth-contamination-free samples from a spacecraft that was cleaned only to the levels achieved on Pathfinder.

  13. Thermal mapping and trends of Mars analog materials in sample acquisition operations using experimentation and models

    NASA Astrophysics Data System (ADS)

    Szwarc, Timothy; Hubbard, Scott

    2014-09-01

    The effects of atmosphere, ambient temperature, and geologic material were studied experimentally and using a computer model to predict the heating undergone by Mars rocks during rover sampling operations. Tests were performed on five well-characterized and/or Mars analog materials: Indiana limestone, Saddleback basalt, kaolinite, travertine, and water ice. Eighteen tests were conducted to 55 mm depth using a Mars Sample Return prototype coring drill, with each sample containing six thermal sensors. A thermal simulation was written to predict the complete thermal profile within each sample during coring and this model was shown to be capable of predicting temperature increases with an average error of about 7%. This model may be used to schedule power levels and periods of rest during actual sample acquisition processes to avoid damaging samples or freezing the bit into icy formations. Maximum rock temperature increase is found to be modeled by a power law incorporating rock and operational parameters. Energy transmission efficiency in coring is found to increase linearly with rock hardness and decrease by 31% at Mars pressure.
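The power-law relationship reported for maximum temperature rise can be recovered from data by ordinary log-log regression. A generic sketch with synthetic numbers (the data points and fitted constants are illustrative, not the paper's):

```python
import numpy as np

# Fit dT = a * P^b by linear regression in log space:
# log(dT) = log(a) + b * log(P).

def fit_power_law(x, y):
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b

power = np.array([5.0, 10.0, 20.0, 40.0])   # e.g. hypothetical drilling power, W
delta_t = 2.0 * power ** 0.8                # synthetic temperature rise, K
a, b = fit_power_law(power, delta_t)
print(round(a, 3), round(b, 3))             # recovers a = 2.0, b = 0.8
```

With measured temperature rises in place of the synthetic `delta_t`, the same fit yields the exponent and prefactor of the empirical power law.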

  14. SALTSTONE PROCESSING FACILITY TRANSFER SAMPLE

    SciTech Connect

    Cozzi, A.; Reigel, M.

    2010-08-04

On May 19, 2010, the Saltstone Production Facility inadvertently transferred 1800 gallons of untreated waste from the salt feed tank (SFT) to Vault 4. During shutdown, approximately 70 gallons of the material was left in the Saltstone hopper. A sample of the slurry in the hopper was sent to Savannah River National Laboratory (SRNL) to analyze the density, pH, and the eight Resource Conservation and Recovery Act (RCRA) metals. The sample was hazardous for chromium, mercury, and pH. The sample received from the Saltstone hopper was inspected visually while sample aliquots were taken and while the sample was allowed to settle. The sample contains solids that settle in approximately 20 minutes (Figure 3-1), and a floating layer forms on top of the supernate during settling and disperses when the sample is agitated (Figure 3-2). The untreated waste inadvertently transferred from the SFT to Vault 4 was toxic for chromium and mercury, and the pH of the sample is at the regulatory limit. Visual inspection confirms the presence of solids in the sample.

  15. Understanding the knowledge acquisition process about Earth and Space concepts

    NASA Astrophysics Data System (ADS)

    Frappart, Soren

There exist two main theoretical views concerning the knowledge acquisition process in science, and those views are still in debate in the literature. On the one hand, knowledge is considered to be organized into coherent wholes (mental models). On the other hand, knowledge is described as fragmented sets with no links between the fragments. Mental models have predictive and explanatory power and are constrained by universal presuppositions; they follow a universal, gradual development in three steps, from initial through synthetic to scientific models. The fragments, on the contrary, are not organized, and development is seen as a situated process in which cultural transmission plays a fundamental role. After presenting these two theoretical positions, we will illustrate them with examples of studies related to the Earth's shape and gravity performed in different cultural contexts, in order to highlight both the differences and the invariant cultural elements. We will show why these issues are important to consider for space concepts such as gravity, orbits, and weightlessness. Indeed, capturing the processes of acquisition and development of knowledge concerning specific space concepts can give us important information for developing relevant, adapted strategies for instruction. If the process of knowledge acquisition for space concepts is fragmented, then we have to think about how to identify those fragments and help the learner build links between them. If the knowledge is organized into coherent mental models, we have to think about how to destabilize a non-relevant model and prevent the development of initial and synthetic models. Moreover, the question of what is universal versus what is culture-dependent in this acquisition process needs to be explored. We will also present some main misconceptions that appeared about space concepts.
Indeed, additionally to the previous theoretical consideration, the collection and awareness of

  16. Multi-Channel Data Acquisition System for Nuclear Pulse Processing

    SciTech Connect

    Myjak, Mitchell J.; Ma, Ding; Robinson, Dirk J.; La Rue, George S.

    2009-11-13

    We have developed a compact, inexpensive electronics package that can digitize pulse-mode or current-mode data from 32 detector outputs in parallel. The electronics package consists of two circuit boards: a custom acquisition board and an off-the-shelf processing board. The acquisition board features a custom-designed integrated circuit that contains an array of charge-to-pulse-width converters. The processing board contains a field programmable gate array that digitizes the pulse widths, performs event discrimination, constructs energy histograms, and executes any user-defined software. Together, the two boards cost around $1000. The module can transfer data to a computer or operate entirely as a standalone system. The design achieves 0.20% nonlinearity and 0.18% FWHM precision at full scale. However, the overall performance could be improved with some modifications to the integrated circuit.
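The histogramming step the FPGA performs, binning pulse widths (which are proportional to the collected charge, and hence energy) into an energy spectrum, can be sketched as follows; the bin count, full scale, and input widths are hypothetical:

```python
import numpy as np

# Sketch of per-channel energy histogramming from charge-to-pulse-width
# converter outputs. A hardware version would increment bin counters in
# FPGA block RAM instead of calling np.histogram.

def energy_histogram(pulse_widths_ns, n_bins=8, full_scale_ns=1000):
    edges = np.linspace(0, full_scale_ns, n_bins + 1)
    hist, _ = np.histogram(pulse_widths_ns, bins=edges)
    return hist

widths = [120, 130, 480, 495, 510, 900]   # hypothetical pulse widths, ns
print(energy_histogram(widths))
```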

  17. Autonomous site selection and instrument positioning for sample acquisition

    NASA Astrophysics Data System (ADS)

    Shaw, A.; Barnes, D.; Pugh, S.

The European Space Agency Aurora Exploration Programme aims to establish a European long-term programme for the exploration of space, culminating in a human mission in the 2030 timeframe. Two flagship missions, namely Mars Sample Return and ExoMars, have been proposed as recognised steps along the way. The ExoMars rover is the first of these flagship missions and includes a rover carrying the Pasteur payload, a mobile exobiology instrumentation package, and the Beagle 2 arm. The primary objective is the search for evidence of past or present life on Mars, but the payload will also study the evolution of the planet and the atmosphere, look for evidence of seismological activity, and survey the environment in preparation for future missions. The operation of rovers in unknown environments is complicated and requires large resources, not only on the planet but also in ground-based operations. Currently this can be very labour-intensive and costly if large teams of scientists and engineers are required to assess mission progress, plan mission scenarios, and construct a sequence of events or goals for uplink. Furthermore, the communication constraints imposed by the time delay over such large distances, and the line of sight required, make autonomy paramount to mission success, affording the ability to operate in the event of communications outages and to be opportunistic with respect to scientific discovery. As part of this drive to reduce mission costs and increase autonomy, the Space Robotics group at the University of Wales, Aberystwyth is researching methods of autonomous site selection and instrument positioning directly applicable to the ExoMars mission. The site selection technique builds on the geometric reasoning algorithms used previously for localisation and navigation [Shaw 03]. It is proposed that a digital elevation model (DEM) of the local surface, generated during traverse and without interaction from ground-based operators, can be

  18. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion. PMID:25677085

  19. Mosaic acquisition and processing for optical-resolution photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Shao, Peng; Shi, Wei; Chee, Ryan K. W.; Zemp, Roger J.

    2012-08-01

In optical-resolution photoacoustic microscopy (OR-PAM), data acquisition time is limited by both laser pulse repetition rate (PRR) and scanning speed. Optical scanning offers high speed but a limited field of view, determined by ultrasound transducer sensitivity. In this paper, we propose a hybrid optical- and mechanical-scanning OR-PAM system with mosaic data acquisition and processing. The system employs fast-scanning mirrors and a diode-pumped, nanosecond-pulsed, Ytterbium-doped, 532-nm fiber laser with a PRR of up to 600 kHz. Data from a sequence of image mosaic patches is acquired systematically, at predetermined mechanical scanning locations, with optical scanning. After all imaging locations are covered, a large panoramic scene is generated by stitching the mosaic patches together. Our proposed system is shown to be at least 20 times faster than previously reported OR-PAM systems.
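The mosaic idea above, stitching patches acquired at predetermined mechanical offsets into one panorama, can be sketched in a few lines. This is a minimal illustration with invented patch sizes and offsets, not the paper's registration pipeline; overlapping pixels are simply averaged.

```python
import numpy as np

def stitch(patches, positions, shape):
    """Place mosaic patches into a panorama at predetermined scan offsets.
    Pixels covered by more than one patch are averaged."""
    pano = np.zeros(shape)
    count = np.zeros(shape)
    for patch, (r, c) in zip(patches, positions):
        h, w = patch.shape
        pano[r:r + h, c:c + w] += patch
        count[r:r + h, c:c + w] += 1
    return pano / np.maximum(count, 1)   # avoid divide-by-zero on empty pixels

# Four 4x4 patches tiled with a one-pixel overlap into a 7x7 scene.
patches = [np.full((4, 4), v, float) for v in (1, 2, 3, 4)]
positions = [(0, 0), (0, 3), (3, 0), (3, 3)]
pano = stitch(patches, positions, (7, 7))
```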

  20. Sampling and sample processing in pesticide residue analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Proper sampling and sample processing in pesticide residue analysis of food and soil has always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agroc...

  1. Space science technology: In-situ science. Sample Acquisition, Analysis, and Preservation Project summary

    NASA Technical Reports Server (NTRS)

    Aaron, Kim

    1991-01-01

    The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.

  2. Design and implementation of a compressive infrared sampling for motion acquisition

    NASA Astrophysics Data System (ADS)

    Liu, Tong; Liu, Jun

    2014-12-01

This article proposes a compressive infrared sampling method for the simultaneous acquisition and processing of human motion. The spatial-temporal changes caused by movements of the human body are intrinsic clues for determining the semantics of motion, and the short-term changes caused by movement are sparsely distributed relative to the sensing region. Several pyroelectric infrared (PIR) sensors with pseudo-random-coded Fresnel lenses are introduced to acquire and compress motion information synchronously. The compressive PIR array records the changes in the thermal radiation field caused by movements and encodes the motion information directly into low-dimensional sensory outputs. The problem of recognizing a high-dimensional image sequence is thus cast as a low-dimensional sequence recognition process. A database of various kinds of motion performed by several people was built. Hausdorff distance-based template matching is employed for motion recognition. Experimental studies are conducted to validate the proposed method.
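Hausdorff distance-based template matching, as named in the abstract, can be sketched directly. The trajectories and labels below are toy stand-ins for the low-dimensional PIR sensor outputs; the distance itself is the standard symmetric Hausdorff metric.

```python
import numpy as np

def directed_hausdorff(a, b):
    """Directed Hausdorff distance: max over points of a of the distance
    to the nearest point of b."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).max()

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two feature trajectories."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))

def classify(sample, templates):
    """Return the label of the template trajectory closest to the sample."""
    return min(templates, key=lambda label: hausdorff(sample, templates[label]))

# Toy 2-D trajectories standing in for compressed motion signatures.
templates = {
    "walk": np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.0]]),
    "wave": np.array([[0.0, 0.0], [0.1, 1.0], [0.0, 2.0]]),
}
sample = np.array([[0.05, 0.0], [1.1, 0.15], [1.9, 0.05]])
print(classify(sample, templates))  # prints "walk"
```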

  3. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, John E.; Dinsmore, Stanley R.; Chandler, Edward W.

    1986-01-01

A four-port disc valve for sampling erosive, high-temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fixed faceplates defining flow passageways positioned to be alternately in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for smooth and precise operation under harsh process conditions.

  4. Impact of radial and angular sampling on multiple shells acquisition in diffusion MRI.

    PubMed

    Merlet, Sylvain; Caruyer, Emmanuel; Deriche, Rachid

    2011-01-01

We evaluate the impact of radial and angular sampling on multiple-shell (MS) acquisition in diffusion MRI. The validation of our results is based on a new and efficient method to accurately reconstruct the Ensemble Average Propagator (EAP) in terms of the Spherical Polar Fourier (SPF) basis from very few diffusion-weighted magnetic resonance images (DW-MRI). This approach exploits the duality between the SPF basis and a closely related basis in which the EAP and the diffusion signal can be represented using the same coefficients. We efficiently combine this relation with the recent acquisition and reconstruction technique called Compressed Sensing (CS). Based on the results of multi-tensor model reconstruction, we show how to construct a robust acquisition scheme for both neural fibre orientation detection and attenuation signal/EAP reconstruction. PMID:21995020
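The Compressed Sensing step named above recovers a sparse coefficient vector from few measurements. As a generic illustration of that principle (not the paper's SPF-basis method), the sketch below recovers a sparse vector from random linear measurements via iterative soft thresholding (ISTA); the dimensions and sensing matrix are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def ista(A, y, lam=0.01, n_iter=2000):
    """Iterative soft-thresholding for min ||Ax - y||^2 / 2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L        # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

n, m, k = 100, 40, 4                         # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m) # random sensing matrix
y = A @ x_true                               # far fewer measurements than unknowns
x_hat = ista(A, y)
```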

  5. Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process

    NASA Technical Reports Server (NTRS)

    Guiltinan, J.

    1976-01-01

    Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.

  6. Processes of language acquisition in children with autism: evidence from preferential looking.

    PubMed

    Swensen, Lauren D; Kelley, Elizabeth; Fein, Deborah; Naigles, Letitia R

    2007-01-01

    Two language acquisition processes (comprehension preceding production of word order, the noun bias) were examined in 2- and 3-year-old children (n=10) with autistic spectrum disorder and in typically developing 21-month-olds (n=13). Intermodal preferential looking was used to assess comprehension of subject-verb-object word order and the tendency to map novel words onto objects rather than actions. Spontaneous speech samples were also collected. Results demonstrated significant comprehension of word order in both groups well before production. Moreover, children in both groups consistently showed the noun bias. Comprehension preceding production and the noun bias appear to be robust processes of language acquisition, observable in both typical and language-impaired populations. PMID:17381789

  7. An extended-source spatial acquisition process based on maximum likelihood criterion for planetary optical communications

    NASA Technical Reports Server (NTRS)

    Yan, Tsun-Yee

    1992-01-01

    This paper describes an extended-source spatial acquisition process based on the maximum likelihood criterion for interplanetary optical communications. The objective is to use the sun-lit Earth image as a receiver beacon and point the transmitter laser to the Earth-based receiver to establish a communication path. The process assumes the existence of a reference image. The uncertainties between the reference image and the received image are modeled as additive white Gaussian disturbances. It has been shown that the optimal spatial acquisition requires solving two nonlinear equations to estimate the coordinates of the transceiver from the received camera image in the transformed domain. The optimal solution can be obtained iteratively by solving two linear equations. Numerical results using a sample sun-lit Earth as a reference image demonstrate that sub-pixel resolutions can be achieved in a high disturbance environment. Spatial resolution is quantified by Cramer-Rao lower bounds.
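The sub-pixel localization claim above can be illustrated numerically. This sketch is a simplified stand-in: it uses FFT cross-correlation with a three-point quadratic peak fit rather than the paper's iterative maximum-likelihood solution, and the "sun-lit Earth" reference is a synthetic Gaussian blob.

```python
import numpy as np

rng = np.random.default_rng(1)

def fourier_shift(img, dy, dx):
    """Translate an image by a fractional amount via the FFT shift theorem."""
    H, W = img.shape
    fy = np.fft.fftfreq(H)[:, None]
    fx = np.fft.fftfreq(W)[None, :]
    return np.fft.ifft2(np.fft.fft2(img) * np.exp(-2j * np.pi * (fy * dy + fx * dx))).real

def estimate_shift(ref, img):
    """Locate img relative to ref by cross-correlation, refined to sub-pixel
    accuracy with a quadratic fit around the correlation peak."""
    c = np.fft.ifft2(np.fft.fft2(ref).conj() * np.fft.fft2(img)).real
    py, px = np.unravel_index(np.argmax(c), c.shape)
    out = []
    for p, n, line in ((py, c.shape[0], c[:, px]), (px, c.shape[1], c[py, :])):
        cm, c0, cp = line[(p - 1) % n], line[p], line[(p + 1) % n]
        delta = 0.5 * (cm - cp) / (cm - 2 * c0 + cp)   # parabola vertex offset
        out.append((p + delta + n / 2) % n - n / 2)    # unwrap to signed shift
    return tuple(out)

# Synthetic extended source plus additive white Gaussian disturbance.
y, x = np.mgrid[:64, :64]
ref = np.exp(-((y - 32) ** 2 + (x - 32) ** 2) / 50.0)
img = fourier_shift(ref, 3.3, -2.7) + rng.normal(0, 0.01, ref.shape)
dy, dx = estimate_shift(ref, img)   # recovers roughly (3.3, -2.7)
```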

  8. A prototype data acquisition and processing system for Schumann resonance measurements

    NASA Astrophysics Data System (ADS)

    Tatsis, Giorgos; Votis, Constantinos; Christofilakis, Vasilis; Kostarakis, Panos; Tritakis, Vasilis; Repapis, Christos

    2015-12-01

In this paper, a cost-effective prototype data acquisition system specifically designed for Schumann resonance (SR) measurements, together with an adequate signal processing method, is described in detail. The implemented system captures the magnetic component of the Schumann resonance signal using a magnetic antenna, at sampling rates much higher than the Nyquist rate, for efficient signal improvement. To obtain the characteristics of the individual resonances of the SR spectrum, new and efficient software was developed; the processing techniques used in this software are analyzed thoroughly. The system's performance and operation were evaluated using preliminary measurements taken in northwest Greece.
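One simple way to extract resonance characteristics from an oversampled recording, in the spirit of the processing described above, is segment-averaged spectral estimation. The sketch below uses synthetic data (sinusoids near the first three SR modes plus noise) and plain Bartlett/Welch-style averaging; the actual software's techniques are not specified here.

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 512                      # oversampled rate, well above Nyquist for <50 Hz content
t = np.arange(fs * 60) / fs   # one minute of synthetic magnetic-field data
# Synthetic stand-in for an SR recording: first three modes plus broadband noise.
signal = (np.sin(2 * np.pi * 7.8 * t) + 0.6 * np.sin(2 * np.pi * 14.1 * t)
          + 0.4 * np.sin(2 * np.pi * 20.3 * t) + rng.normal(0, 2.0, t.size))

def averaged_psd(x, fs, nseg=4096):
    """Average windowed periodograms of non-overlapping segments."""
    segs = x[: (x.size // nseg) * nseg].reshape(-1, nseg)
    win = np.hanning(nseg)
    spec = np.abs(np.fft.rfft(segs * win, axis=1)) ** 2
    return np.fft.rfftfreq(nseg, 1 / fs), spec.mean(axis=0)

f, pxx = averaged_psd(signal, fs)
band = (f > 5) & (f < 11)                 # search window around the first mode
f1 = f[band][np.argmax(pxx[band])]        # estimated first-mode frequency (~7.8 Hz)
```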

  9. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  10. Acquisition of matching-to-sample performance in rats using visual stimuli on nose keys.

    PubMed

    Iversen, I H

    1993-05-01

Steady and blinking white lights were projected on three nose keys arranged horizontally on one wall. The procedure was a conditional discrimination with a sample stimulus presented on the middle key and comparison stimuli on the side keys. Three rats acquired simultaneous "identity matching." Accuracy reached 80% in about 25 sessions and 90% or higher after about 50 sessions. Acquisition progressed through several stages: repeated errors, alternation between comparison keys from trial to trial, preferences for specific keys or stimuli, and a gradual lengthening of strings of consecutive correct trials. An analysis of the acquisition curves for individual trial configurations indicated that the matching-to-sample performance possibly consisted of separate discriminations. PMID:8315365

  11. An efficient scheme for sampling fast dynamics at a low average data acquisition rate

    NASA Astrophysics Data System (ADS)

    Philippe, A.; Aime, S.; Roger, V.; Jelinek, R.; Prévot, G.; Berthier, L.; Cipelletti, L.

    2016-02-01

    We introduce a temporal scheme for data sampling, based on a variable delay between two successive data acquisitions. The scheme is designed so as to reduce the average data flow rate, while still retaining the information on the data evolution on fast time scales. The practical implementation of the scheme is discussed and demonstrated in light scattering and microscopy experiments that probe the dynamics of colloidal suspensions using CMOS or CCD cameras as detectors.
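A variable-delay acquisition schedule of the kind described above can be sketched as a burst pattern: short bursts of closely spaced frames capture the fast dynamics, while the long gap between bursts keeps the average data rate low. The published scheme is more general; this simplified burst-mode version, with invented timing parameters, just illustrates the rate trade-off.

```python
def acquisition_times(t_total, fast_dt, burst_len, long_dt):
    """Burst sampling: bursts of burst_len frames spaced fast_dt apart,
    with bursts repeated every long_dt over a total duration t_total."""
    times = []
    t = 0.0
    while t < t_total:
        times.extend(t + k * fast_dt for k in range(burst_len))
        t += long_dt
    return times

# 5-frame bursts at 100 Hz, one burst per second, for 10 s.
times = acquisition_times(t_total=10.0, fast_dt=0.01, burst_len=5, long_dt=1.0)
avg_rate = len(times) / 10.0    # average frames per second (5.0)
burst_rate = 1.0 / 0.01        # instantaneous rate inside a burst (100.0)
```

The fast time scale (10 ms lags) remains accessible within each burst even though the detector delivers only 5 frames per second on average.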

  12. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1984-08-16

This is a patent for a disc-type, four-port sampling valve for service with erosive, high-temperature process streams. Inserts and liners of α-silicon carbide, in the faceplates and in the sampling cavities respectively, limit erosion while providing lubricity for smooth and precise operation. 1 fig.

  13. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1986-01-07

A four-port disc valve is described for sampling erosive, high-temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fixed faceplates defining flow passageways positioned to be alternately in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for smooth and precise operation under harsh process conditions. 1 fig.

  14. Sample Acquisition and Analytical Chemistry Challenges to Verifying Compliance to Aviators Breathing Oxygen (ABO) Purity Specification

    NASA Technical Reports Server (NTRS)

    Graf, John

    2015-01-01

NASA has been developing and testing two different types of oxygen separation systems. One type uses pressure-swing technology; the other uses a solid electrolyte electrochemical oxygen separation cell. Both development systems have been subjected to long-term testing and to performance testing under a variety of environmental and operational conditions. Testing these two systems revealed that measuring the product purity of oxygen, and determining whether an oxygen separation device meets Aviator's Breathing Oxygen (ABO) specifications, is a subtle and sometimes difficult analytical chemistry job. Verifying the product purity of cryogenically produced oxygen presents a different set of analytical chemistry challenges. This presentation will describe some of the sample acquisition and analytical chemistry challenges in verifying oxygen produced by an oxygen separator, and in verifying oxygen produced by cryogenic separation processes. The primary contaminant that causes gas samples to fail ABO requirements is water; the maximum amount of water vapor allowed is 7 ppmv. The principal challenge of verifying oxygen produced by an oxygen separator is that the oxygen is produced relatively slowly and at comparatively low temperatures. A short-term failure lasting just a few minutes in the course of a one-week run could cause an entire tank to be rejected, so continuous monitoring of oxygen purity and water vapor can identify problems as soon as they occur. Long-term oxygen separator tests were instrumented with an oxygen analyzer and a hygrometer: a GE Moisture Monitor Series 35. This hygrometer uses an aluminum oxide sensor. The user's manual does not report this, but long-term exposure to pure oxygen causes the aluminum oxide sensor head to bias dry. Oxygen product that exceeded the 7 ppmv specification was improperly accepted because the sensor had biased. The bias is permanent - exposure to air does not cause the sensor to

  15. The Logical Syntax of Number Words: Theory, Acquisition and Processing

    ERIC Educational Resources Information Center

    Musolino, Julien

    2009-01-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. "Cognition 93", 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Scalar…

  16. Discrete Signal Processing on Graphs: Sampling Theory

    NASA Astrophysics Data System (ADS)

    Chen, Siheng; Varma, Rohan; Sandryhaila, Aliaksei; Kovacevic, Jelena

    2015-12-01

We propose a sampling theory for signals that are supported on either directed or undirected graphs. The theory follows the same paradigm as classical sampling theory. We show that perfect recovery is possible for graph signals bandlimited under the graph Fourier transform. The sampled signal coefficients form a new graph signal, whose corresponding graph structure preserves the first-order difference of the original graph signal. For general graphs, an optimal sampling operator based on experimentally designed sampling is proposed to guarantee perfect recovery and robustness to noise; for graphs whose graph Fourier transforms are frames with maximal robustness to erasures as well as for Erdős-Rényi graphs, random sampling leads to perfect recovery with high probability. We further establish the connection to the sampling theory of finite discrete-time signal processing and previous work on signal recovery on graphs. To handle full-band graph signals, we propose a graph filter bank based on sampling theory on graphs. Finally, we apply the proposed sampling theory to semi-supervised classification on online blogs and digit images, where we achieve similar or better performance with fewer labeled samples compared to previous work.
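The perfect-recovery claim for bandlimited graph signals can be checked numerically on a toy example. The graph, bandwidth, and subset-selection rule below are my own choices, with brute-force determinant maximization standing in for the paper's experimentally designed sampling operator: a signal spanned by the K lowest Laplacian eigenvectors is recovered exactly from K well-chosen node samples.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)

# Small undirected graph: an 8-node ring with one chord.
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
A[0, 4] = A[4, 0] = 1
L = np.diag(A.sum(axis=1)) - A            # combinatorial graph Laplacian
lam, V = np.linalg.eigh(L)                # eigenvectors = graph Fourier basis

# A K-bandlimited graph signal: only the K lowest graph frequencies are active.
K = 3
x = V[:, :K] @ rng.standard_normal(K)

# Designed sampling: pick the K-node subset whose basis submatrix has the
# largest absolute determinant (best conditioned among all subsets).
S = max(combinations(range(n), K),
        key=lambda s: abs(np.linalg.det(V[list(s), :K])))
alpha = np.linalg.solve(V[list(S), :K], x[list(S)])   # invert the K x K system
x_rec = V[:, :K] @ alpha                  # exact recovery from K samples
```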

  17. Chapter A5. Processing of Water Samples

    USGS Publications Warehouse

    Wilde, Franceska D., (Edited By); Radtke, Dean B.; Gibs, Jacob; Iwatsubo, Rick T.

    1999-01-01

The National Field Manual for the Collection of Water-Quality Data (National Field Manual) describes protocols and provides guidelines for U.S. Geological Survey (USGS) personnel who collect data used to assess the quality of the Nation's surface-water and ground-water resources. This chapter addresses methods to be used in processing water samples to be analyzed for inorganic and organic chemical substances, including the bottling of composite, pumped, and bailed samples and subsamples; sample filtration; solid-phase extraction for pesticide analyses; sample preservation; and sample handling and shipping. Each chapter of the National Field Manual is published separately and revised periodically. Newly published and revised chapters will be announced on the USGS Home Page on the World Wide Web under 'New Publications of the U.S. Geological Survey.' The URL for this page is http://water.usgs.gov/lookup/get?newpubs.

  18. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND... not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  19. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND... not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  20. Proposed Science Requirements and Acquisition Priorities for the First Mars Sample Return

    NASA Technical Reports Server (NTRS)

Agee, C. B.; Bogard, D. D.; Draper, D. S.; Jones, J. H.; Meyer, C., Jr.; Mittlefehldt, D. W.

    2000-01-01

    A sample return mission is an important next step in the exploration of Mars. The first sample return should come early in the program time-line because the science derived from earth-based analyses of samples provides crucial "ground truth" needed for further exploration planning, enhancement of remote measurements, and achieving science goals and objectives that include: (1) the search for environments that may support life and any indicators of the past or present existence of life, (2) understanding the history of water and climate on Mars, (3) understanding the evolution of Mars as a planet. Returned samples from Mars will have unique value because they can be studied by scientists worldwide using the most powerful analytical instruments available. Furthermore, returned Mars samples can be preserved for studies by future generations of scientists using new techniques and addressing new issues in Mars science. To ensure a high likelihood of success, the first sample return strategy should be simple and focused. We outline a fundamental set of sample requirements and acquisition priorities for Mars sample return.

  1. Sampling for advanced overlay process control

    NASA Astrophysics Data System (ADS)

    Choi, DongSub; Izikson, Pavel; Sutherland, Doug; Sherman, Kara; Manka, Jim; Robinson, John C.

    2008-03-01

Overlay metrology and control have been critical for successful advanced microlithography for many years, and are taking on an even more important role as time goes on. Due to throughput constraints it is necessary to sample only a small subset of overlay metrology marks, and typical sample plans are static over time. Standard production monitoring and control involves measuring sufficient samples to calculate up to 6 linear correctables. As design rules shrink and processing becomes more complex, however, it is necessary to consider higher-order modeled terms for control, fault detection, and disposition. This, in turn, requires a higher level of sampling. Due to throughput concerns, careful consideration is needed to establish a baseline sampling plan, with higher levels of sampling considered on an exception basis via automated trigger mechanisms. The goal is improved scanner control and lithographic cost of ownership. This study addresses tools for establishing baseline sampling as well as motivation and initial results for dynamic sampling for application to higher-order modeling.
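The contrast between linear correctables and higher-order modeled terms can be sketched as a polynomial least-squares fit. The model form, coefficients, and noise level below are hypothetical, not taken from the study; the point is that a higher-order fit captures residual signature that a linear model leaves behind.

```python
import numpy as np

rng = np.random.default_rng(4)

def design(x, y, order):
    """Polynomial design matrix in normalized wafer coordinates up to `order`."""
    cols = [x**i * y**j for i in range(order + 1) for j in range(order + 1 - i)]
    return np.stack(cols, axis=1)

# Synthetic overlay-error samples: linear terms plus a cubic bow plus noise.
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
dx = 2.0 + 1.5 * x - 0.8 * y + 0.5 * x**3 + rng.normal(0, 0.05, x.size)

resid_std = {}
for order in (1, 3):                       # linear correctables vs 3rd-order model
    A = design(x, y, order)
    coef, *_ = np.linalg.lstsq(A, dx, rcond=None)
    resid_std[order] = (dx - A @ coef).std()
```

With the cubic term present, the 3rd-order residual drops to roughly the noise floor while the linear fit cannot, which is the motivation for the higher sampling that higher-order models require.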

  2. Sampling for advanced overlay process control

    NASA Astrophysics Data System (ADS)

    Kato, Cindy; Kurita, Hiroyuki; Izikson, Pavel; Robinson, John C.

    2009-03-01

    Overlay metrology and control have been critical for successful advanced microlithography for many years, and are taking on an even more important role as time goes on. Due to throughput constraints it is necessary to sample only a small subset of overlay metrology marks, and typical sample plans are static over time. Standard production monitoring and control involves measuring sufficient samples to calculate up to 6 linear correctables. As design rules shrink and processing becomes more complex, however, it is necessary to consider higher order models with additional degrees of freedom for control, fault detection, and disposition. This in turn, requires a higher level of sampling and a careful consideration of flyer removal. Due to throughput concerns, however, careful consideration is needed to establish a baseline sampling plan using rigorous statistical methods. This study focuses on establishing a 3x nm node immersion lithography production-worthy sampling plan for 3rd order modeling, verification of the accuracy, and proof of robustness of the sampling. In addition we discuss motivation for dynamic sampling for application to higher order modeling.

  3. Reverse engineering of the multiple launch rocket system. Human factors, manpower, personnel, and training in the weapons system acquisition process

    NASA Astrophysics Data System (ADS)

    Arabian, J. M.; Hartel, C. R.; Kaplan, J. D.; Marcus, A.; Promisel, D. M.

    1984-06-01

    In a briefing format, this report on the Multiple Launch Rocket System summarizes an examination of human factors, manpower, personnel and training (HMPT) issues during the systems acquisition process. The report is one of four reverse engineering studies prepared at the request of Gen. M. R. Thurman, Army Vice Chief of Staff. The four systems were studied as a representative sample of Army weapons systems. They serve as the basis for drawing conclusions about aspects of the weapons system acquisition process which most affect HMPT considerations. A synthesis of the four system studies appears in the final report of the Reverse Engineering Task Force U.S. Army Research Institute.

  4. Mars sampling strategy and aeolian processes

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald

    1988-01-01

It is critical that the geological context of planetary samples (both in situ analyses and return samples) be well known and documented. Apollo experience showed that this goal is often difficult to achieve even for a planet on which surficial processes are relatively restricted. On Mars, the variety of present and past surface processes is much greater than on the Moon, and establishing the geological context of samples will be much more difficult. In addition to impact gardening, Mars has been modified by running water, periglacial activity, wind, and other processes, all of which have the potential for profoundly affecting the geological integrity of potential samples. Aeolian, or wind, processes are ubiquitous on Mars. In the absence of liquid water on the surface, aeolian activity dominates the present surface as documented by frequent dust storms (both local and global), landforms such as dunes, and variable features, i.e., albedo patterns which change their size, shape, and position with time in response to the wind.

  5. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, J.R.

    1997-02-11

    A method and apparatus are disclosed for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register. 15 figs.

  6. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, John R.

    1997-01-01

    A method and apparatus for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register.
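The mask-bit trick described above, prestoring bits so that a raw 14-bit value dropped into a 32-bit location decodes as a valid IEEE-754 float, can be demonstrated in a few lines. The specific mask value (sign 0, exponent 127, i.e. 0x3F800000, so the word reads as 1.0 plus a small fraction) is an assumption for illustration; the patent does not fix the mask here.

```python
import struct

MASK = 0x3F800000   # assumed prestored mask: sign=0, biased exponent=127

def to_float_word(raw14):
    """Concatenate the prestored mask bits with a 14-bit data value so the
    resulting 32-bit word is a valid IEEE-754 single-precision float."""
    word = MASK | (raw14 & 0x3FFF)              # 14 data bits -> low mantissa bits
    return struct.unpack("<f", struct.pack("<I", word))[0]

def from_float_word(f):
    """Invert the packing: recover the 14-bit integer from the float word."""
    word = struct.unpack("<I", struct.pack("<f", f))[0]
    return word & 0x3FFF

v = 0x1ABC
f = to_float_word(v)            # a float slightly above 1.0
assert from_float_word(f) == v  # the 14-bit value round-trips exactly
```

Because the 14 data bits land in the low mantissa, every possible value maps to an exactly representable float, so the floating-point processor can move the words around without any loss.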

  7. Surface studies of plasma processed Nb samples

    SciTech Connect

    Tyagi, Puneet V; Doleans, Marc; Hannah, Brian S; Afanador, Ralph; Stewart, Stephen; Mammosser, John; Howell, Matthew P; Saunders, Jeffrey W; Degraff, Brian D; Kim, Sang-Ho

    2015-01-01

Contaminants present at the top surface of superconducting radio frequency (SRF) cavities can act as field emitters and restrict the cavity accelerating gradient. A room-temperature, in-situ plasma processing technology for SRF cavities, aimed at cleaning hydrocarbons from the inner surface of the cavities, has recently been developed at the Spallation Neutron Source (SNS). Surface studies of the plasma-processed Nb samples by secondary ion mass spectrometry (SIMS) and scanning Kelvin probe (SKP) showed that the Ne/O2 plasma processing is very effective at removing carbonaceous contaminants from the top surface and improves the surface work function by 0.5 to 1.0 eV.

  8. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  9. 77 FR 2682 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-19

    ... Regulation Supplement; DoD Voucher Processing AGENCY: Defense Acquisition Regulations System, Department of Defense (DoD). ACTION: Proposed rule. SUMMARY: DoD is proposing to amend the Defense Federal Acquisition Regulation Supplement (DFARS) to update DoD's voucher processing procedures and better accommodate the use...

  10. Respirable coal mine dust sample processing

    SciTech Connect

    Raymond, L.D.; Tomb, T.F.; Parobeck, P.S.

    1987-01-01

The Federal Coal Mine Health and Safety Act of 1969 established mandatory dust standards for coal mines. Regulatory requirements for complying with the provisions of the Act were prescribed in Title 30, Code of Federal Regulations, Parts 70 and 71, which were published in the Federal Register on April 3, 1970, and March 28, 1972, respectively. These standards and the sampling requirements imposed on coal mine operators, along with a description of the laboratory established to process respirable coal mine dust samples collected in accordance with these requirements, were published in a MESA Informational Report (MESA, the acronym for the Mining Enforcement and Safety Administration, was changed to MSHA, the acronym for the Mine Safety and Health Administration, in 1977). These standards and regulatory requirements continued under the Federal Mine Safety and Health Act of 1977 until November 1980, when major regulatory revisions were made in the operator's dust sampling program. This paper describes the changes in the respirable coal mine dust sampling program and the equipment and procedures used by MSHA to process respirable coal mine dust samples collected in accordance with regulatory requirements. 10 figs., 1 tab.

  11. Acquisition and Processing of Multi-source Technique Offshore with Different Types of Source

    NASA Astrophysics Data System (ADS)

    Li, L.; Tong, S.; Zhou, H. W.

    2015-12-01

Multi-source blended offshore seismic acquisition has been developed in recent years. The technology aims to improve the efficiency of acquisition or to enhance image quality through dense spatial sampling. Previous methods usually use several sources of the same type; we propose applying sources with different central frequencies to image multiscale target layers at different depths. A low-frequency source is used to image deep structure but has low resolution at shallow depths, which can be compensated by a high-frequency source. By combining the low- and high-frequency images, we obtain high-resolution profiles at both shallow and deep levels. Accordingly, we implemented a 2-D cruise using spark sources with 300 Hz and 2000 Hz central frequencies, fired randomly with a certain delay time. In processing, we separate the blended data by denoising methods, including median filtering and the curvelet transform, and then match the prestack data to obtain the final profiles. The median filter can suppress impulsive noise while protecting edges, whereas the curvelet transform has multi-scale characteristics and a powerful sparse-expression ability; iterating the noise elimination produces good results. Prestack matching filtering is important when integrating the wavelets of the two different spark sources, because of their different characteristics, so that the data become consistent in reflection time, amplitude, frequency and phase. Compared with profiles from either type of source alone, the blended-acquisition image shows higher resolution at shallow depths and yields more information at deep locations.
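In blended acquisition, shots from the other source appear as strong, incoherent spikes in a suitably sorted gather, which is why a median filter separates them well. A minimal sliding-window median filter sketch (illustration only; the actual workflow described above also involves curvelet-domain processing and iteration):

```python
from statistics import median

def median_filter(trace, width=5):
    """Sliding-window median: passes the coherent signal, rejects isolated
    spikes such as interference from the other (blended) source."""
    half = width // 2
    # Edge-pad so the output has the same length as the input.
    padded = [trace[0]] * half + list(trace) + [trace[-1]] * half
    return [median(padded[i:i + width]) for i in range(len(trace))]

# An isolated spike (blending noise) is removed; the background survives.
print(median_filter([1, 1, 9, 1, 1], width=3))  # → [1, 1, 1, 1, 1]
```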

  12. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    NASA Astrophysics Data System (ADS)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware and state estimation of the hardware, the lithology being drilled, and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Río Tinto, Spain. DAME focused instead on the problem of controlling the drill while actively drilling, without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The top-level goal of the 2006 DAME in-situ drilling tests was to verify and demonstrate a capability for hands-off automated drilling at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second was to operate for three or more hours autonomously, hands-off. The third was to exceed 3 m depth into the frozen breccia and permafrost with the DAME drill (it had not gone deeper than 2.2 m previously). Five of the six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4-hour sequence with no human presence nearby), and the total depth reached was 3.2 m.
And ground truth drilling used small commercial drilling equipment in parallel in

  13. Developmental Stages in Receptive Grammar Acquisition: A Processability Theory Account

    ERIC Educational Resources Information Center

    Buyl, Aafke; Housen, Alex

    2015-01-01

    This study takes a new look at the topic of developmental stages in the second language (L2) acquisition of morphosyntax by analysing receptive learner data, a language mode that has hitherto received very little attention within this strand of research (for a recent and rare study, see Spinner, 2013). Looking at both the receptive and productive…

  14. DEVELOPMENT OF MARKETABLE TYPING SKILL--SENSORY PROCESSES UNDERLYING ACQUISITION.

    ERIC Educational Resources Information Center

    WEST, LEONARD J.

The project attempted to provide further data on the dominant hypothesis about the sensory mechanisms underlying skill acquisition in typewriting. In so doing, it proposed to furnish a basis for important correctives to such conventional instructional procedures as touch typing. Specifically, the hypothesis has been that kinesthesis is not…

  15. Nano-Scale Sample Acquisition Systems for Small Class Exploration Spacecraft

    NASA Astrophysics Data System (ADS)

    Paulsen, G.

    2015-12-01

The paradigm for space exploration is changing. Large and expensive missions are very rare, and the space community is turning to smaller, lighter, and less expensive missions that can still accomplish significant exploration. These missions are also within reach of commercial companies such as the Google Lunar X Prize teams that develop small-scale lunar missions. Recent commercial endeavors such as Planet Labs, Inc. and Skybox Imaging, Inc. show that there are new benefits and business models associated with the miniaturization of space hardware. The Nano-Scale Sample Acquisition System includes the NanoDrill for capture of small rock cores and PlanetVac for capture of surface regolith. These two systems are part of the ongoing effort to develop "Micro Sampling" systems for deployment by small spacecraft with limited payload capacities. The ideal applications include prospecting missions to the Moon and asteroids. The MicroDrill is a rotary-percussive coring drill that captures cores 7 mm in diameter and up to 2 cm long. The drill weighs less than 1 kg and can capture a core from a 40 MPa strength rock within a few minutes, with less than 10 W of power and less than 10 N of preload. The PlanetVac is a pneumatic regolith acquisition system that can capture a surface sample in a touch-and-go maneuver. These sampling systems were integrated within the footpads of a commercial quadcopter for testing. As such, they could also be used by geologists on Earth to explore difficult-to-reach locations.

  16. Oral processing of two milk chocolate samples.

    PubMed

    Carvalho-da-Silva, Ana Margarida; Van Damme, Isabella; Taylor, Will; Hort, Joanne; Wolf, Bettina

    2013-02-26

    Oral processing of two milk chocolates, identical in composition and viscosity, was investigated to understand the textural behaviour. Previous studies had shown differences in mouthcoating and related attributes such as time of clearance from the oral cavity to be most discriminating between the samples. Properties of panellists' saliva, with regard to protein concentration and profile before and after eating the two chocolates, were included in the analysis but did not reveal any correlation with texture perception. The microstructure of the chocolate samples following oral processing, which resembled an emulsion as the chocolate phase inverts in-mouth, was clearly different and the sample that was found to be more mouthcoating appeared less flocculated after 20 chews. The differences in flocculation behaviour were mirrored in the volume based particle size distributions acquired with a laser diffraction particle size analyser. The less mouthcoating and more flocculated sample showed a clear bimodal size distribution with peaks at around 40 and 500 μm, for 10 and 20 chews, compared to a smaller and then diminishing second peak for the other sample following 10 and 20 chews, respectively. The corresponding mean particle diameters after 20 chews were 184 ± 23 and 141 ± 10 μm for the less and more mouthcoating samples, respectively. Also, more of the mouthcoating sample had melted after both 10 and 20 chews (80 ± 8% compared to 72 ± 10% for 20 chews). Finally, the friction behaviour between a soft and hard surface (elastopolymer/steel) and at in-mouth temperature was investigated using a commercial tribology attachment on a rotational rheometer. Complex material behaviour was revealed. Observations included an unusual increase in friction coefficient at very low sliding speeds, initially overlapping for both samples, to a threefold higher value for the more mouthcoating sample. This was followed by a commonly observed decrease in friction coefficient with

  17. Possible overlapping time frames of acquisition and consolidation phases in object memory processes: a pharmacological approach.

    PubMed

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when administered within 2 min after the acquisition trial. Likewise, both PDE5-I and PDE4-I reversed the scopolamine deficit model when administered within 2 min after the learning trial. PDE5-I was effective up to 45 min after the acquisition trial and PDE4-I was effective when administered between 3 and 5.5 h after the acquisition trial. Taken together, our study suggests that acetylcholine, cGMP, and cAMP are all involved in acquisition processes and that cGMP and cAMP are also involved in early and late consolidation processes, respectively. Most important, these pharmacological studies suggest that acquisition processes continue for some time after the learning trial where they share a short common time frame with early consolidation processes. Additional brain concentration measurements of the drugs suggest that these acquisition processes can continue up to 4-6 min after learning. PMID:26670184

  18. Processing and transport of environmental virus samples.

    PubMed Central

    Dahling, D R; Wright, B A

    1984-01-01

    Poliovirus-seeded tap water, conditioned with MgCl2 and passed through virus-adsorbing filters, gave better poliovirus recovery than water identically treated but conditioned with AlCl3. Elution of several filter types with beef extract yielded higher recoveries than did elution with glycine. Seeded samples filtered through various filters and stored showed considerable virus loss in 2 days when stored at 4 degrees C, whereas those stored at -70 degrees C gave stable virus recovery up to 4 days. Additionally, the use of antifoam during the elution process reduced foaming and increased virus recovery by 28%. PMID:6331312

  19. Software interface and data acquisition package for the LakeShore cryogenics vibrating sample magnetometer

    SciTech Connect

O'Dell, B.H.

    1995-11-01

A software package was developed to replace the software provided by LakeShore for their model 7300 vibrating sample magnetometer (VSM). Several problems with the original software's functionality caused this group to seek a new software package. The new software provides many features that were unsupported in the LakeShore software, including a more functional step mode, a point-averaging mode, vector moment measurements, and calibration for field offset. The developed software interfaces with the VSM through a menu-driven graphical user interface and bypasses the VSM's on-board processor, leaving control of the VSM up to the software. The source code for this software is readily available to anyone. By having the source, experimentalists have full control of data acquisition and can add routines specific to their experiments.

  20. The acquisition process of musical tonal schema: implications from connectionist modeling

    PubMed Central

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical ‘scale’ sensitivity early and ‘harmony’ sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music. PMID:26441725

  1. On Accuracy of Knowledge Acquisition for Decision Making Processes Acquiring Subjective Information on the Internet

    NASA Astrophysics Data System (ADS)

    Fujimoto, Kazunori; Yamamoto, Yutaka

This paper presents a mathematical model for decision making processes in which the knowledge for the decision is constructed automatically from subjective information on the Internet. The model makes it possible to determine the required degree of accuracy of knowledge acquisition when constructing decision support systems that combine two technologies: automated knowledge acquisition from information on the Internet and automated reasoning about the acquired knowledge. The model consists of three elements: a knowledge source, which is a set of subjective information on the Internet; knowledge acquisition, which builds a knowledge base within a computer from the knowledge source; and a decision rule, which chooses a set of alternatives by using the knowledge base. One important feature of this model is that it contains not only decision making processes but also knowledge acquisition processes, which makes it possible to analyze decision processes in terms of the sufficiency of the knowledge sources and the accuracy of the knowledge acquisition methods. Based on the model, decision processes by which the knowledge source and the knowledge base lead to the same choices are characterized, and the required degree of accuracy of knowledge acquisition is quantified as the required accuracy value. To show how this value can be used in designing decision support systems, it is calculated for several examples of knowledge sources and decision rules. The paper also describes the computational complexity of calculating the required accuracy value and presents a computation principle that reduces the complexity to polynomial order in the size of the knowledge source.

  2. Xenbase: Core features, data acquisition, and data processing.

    PubMed

    James-Zorn, Christina; Ponferrada, Virgillio G; Burns, Kevin A; Fortriede, Joshua D; Lotay, Vaneet S; Liu, Yu; Brad Karpinka, J; Karimi, Kamran; Zorn, Aaron M; Vize, Peter D

    2015-08-01

Xenbase, the Xenopus model organism database (www.xenbase.org), is a cloud-based, web-accessible resource that integrates the diverse genomic and biological data from Xenopus research. Xenopus frogs are one of the major vertebrate animal models used for biomedical research, and Xenbase is the central repository for the enormous amount of data generated using this model tetrapod. The goal of Xenbase is to accelerate discovery by enabling investigators to make novel connections between molecular pathways in Xenopus and human disease. Our relational database and user-friendly interface make these data easy to query and allow investigators to quickly interrogate and link different data types in ways that would otherwise be difficult, time consuming, or impossible. Xenbase also enhances the value of these data through high-quality gene expression curation and data integration, by providing bioinformatics tools optimized for Xenopus experiments, and by linking Xenopus data to other model organisms and to human data. Xenbase draws in data via pipelines that download data, parse the content, and save them into appropriate files and database tables. Furthermore, Xenbase makes these data accessible to the broader biomedical community by continually providing annotated data updates to organizations such as NCBI, UniProtKB, and Ensembl. Here, we describe our bioinformatics, genome-browsing tools, data acquisition and sharing, our community submitted and literature curation pipelines, text-mining support, gene page features, and the curation of gene nomenclature and gene models. PMID:26150211

  3. Learning (Not) to Predict: Grammatical Gender Processing in Second Language Acquisition

    ERIC Educational Resources Information Center

    Hopp, Holger

    2016-01-01

    In two experiments, this article investigates the predictive processing of gender agreement in adult second language (L2) acquisition. We test (1) whether instruction on lexical gender can lead to target predictive agreement processing and (2) how variability in lexical gender representations moderates L2 gender agreement processing. In a…

  4. 76 FR 68037 - Federal Acquisition Regulation; Sudan Waiver Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... Regulation; Sudan Waiver Process AGENCIES: Department of Defense (DoD), General Services Administration (GSA... that conducts restricted business operations in Sudan. The rule also describes the consultation process... Federal Register at 75 FR 62069 on October 7, 2010, to revise FAR 25.702, Prohibition on contracting...

  5. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  6. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  7. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  8. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  9. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component, so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.

  10. The acquisition process of musical tonal schema: implications from connectionist modeling.

    PubMed

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-Ichi

    2015-01-01

Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical 'scale' sensitivity early and 'harmony' sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music. PMID:26441725

  11. Robust Read Channel System Directly Processing Asynchronous Sampling Data

    NASA Astrophysics Data System (ADS)

    Yamamoto, Akira; Mouri, Hiroki; Yamamoto, Takashi

    2006-02-01

In this study, we describe a robust read channel employing a novel timing recovery system and a unique Viterbi detector that extract the channel timing and channel data directly from asynchronously sampled data. The timing recovery system in the proposed read channel has a feed-forward architecture and consists entirely of digital circuits; it thus enables robust timing recovery at high speed and suffers no performance deterioration caused by variations in analog circuits. The Viterbi detector not only detects maximum-likelihood data using a reference level generator, but also transforms asynchronous data into pseudosynchronous data using two clocks: an asynchronous clock generated by a frequency synthesizer and a pseudosynchronous clock generated by a timing detector. The proposed read channel achieves a constant and fast frequency acquisition time regardless of the initial frequency error and improves the bit error rate performance. This robust read channel system can be used for high-speed signal processing and LSIs built in nanometer-scale semiconductor processes.
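The essence of re-timing asynchronously sampled data onto a channel clock can be pictured with a toy linear-interpolation resampler. This is a deliberate simplification of the feed-forward system the abstract describes, which estimates timing digitally and feeds interpolated values to the Viterbi detector; the function below only shows the interpolation step.

```python
def resample(samples, ratio):
    """Linearly interpolate 'samples' at steps of 'ratio' input intervals,
    mimicking re-timing asynchronously sampled data onto a new clock."""
    out, t = [], 0.0
    while t <= len(samples) - 1:
        i = int(t)           # index of the sample just before time t
        frac = t - i         # fractional position between samples
        if i + 1 < len(samples):
            out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        else:
            out.append(samples[i])
        t += ratio
    return out

# Resampling a ramp at half the input interval fills in the midpoints.
print(resample([0, 1, 2, 3], 0.5))  # → [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3]
```

A real read channel would use a higher-order interpolator and a digitally estimated, time-varying ratio rather than a constant one.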

  12. Sensor Data Acquisition and Processing Parameters for Human Activity Classification

    PubMed Central

    Bersch, Sebastian D.; Azzi, Djamel; Khusainov, Rinat; Achumba, Ifeyinwa E.; Ries, Jana

    2014-01-01

It is known that parameter selection for the data sampling frequency and segmentation technique (including different methods and window sizes) has an impact on classification accuracy. For Ambient Assisted Living (AAL), no clear guidance for selecting these parameters exists; hence a wide variety and inconsistency across today's literature is observed. This paper presents an empirical investigation of different data sampling rates, segmentation techniques, and segmentation window sizes and their effect on the accuracy of Activity of Daily Living (ADL) event classification and on computational load, for two different accelerometer sensor datasets. The study is conducted using an ANalysis Of VAriance (ANOVA) based on 32 different window sizes, three different segmentation algorithms (with and without overlap, totaling six different parameter sets), and six sampling frequencies, for nine common classification algorithms. The classification accuracy is based on a feature vector consisting of Root Mean Square (RMS), Mean, Signal Magnitude Area (SMA), Signal Vector Magnitude (here SMV), Energy, Entropy, FFTPeak, and Standard Deviation (STD). The results are presented alongside recommendations for parameter selection on the basis of the best-performing parameter combinations, identified by means of the corresponding Pareto curve. PMID:24599189
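Several of the listed features are simple per-window statistics. A sketch of a few of them for one tri-axial accelerometer window (function and key names are illustrative, not taken from the paper):

```python
import math

def window_features(ax, ay, az):
    """A few of the listed window statistics for tri-axial data.
    (Function and key names are illustrative, not from the paper.)"""
    n = len(ax)
    rms = math.sqrt(sum(v * v for v in ax) / n)        # Root Mean Square, one axis
    mean = sum(ax) / n                                  # Mean, one axis
    sma = sum(abs(x) + abs(y) + abs(z)
              for x, y, z in zip(ax, ay, az)) / n       # Signal Magnitude Area
    smv = sum(math.sqrt(x * x + y * y + z * z)
              for x, y, z in zip(ax, ay, az)) / n       # mean Signal Vector Magnitude
    return {"RMS": rms, "Mean": mean, "SMA": sma, "SMV": smv}

print(window_features([3, 3], [4, 4], [0, 0]))
```

In a study like the one described, such a vector would be computed once per segmentation window and fed to each classifier.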

  13. SAPS—An automated and networked seismological acquisition and processing system

    NASA Astrophysics Data System (ADS)

    Oncescu, Mihnea Corneliu; Rizescu, Mihaela; Bonjer, Klaus-Peter

    1996-02-01

A PC-based digital data acquisition and processing system was developed and implemented on two PCs linked by a peer-to-peer LAN. Sixteen channels are sampled with a rate of 200 Hz. The acquisition is performed continuously in sequenced files on one PC using the IASPEI-released XRTP software. The length of the elementary files is adjustable; we used 90 sec in this application. The second PC runs a program to organize automatically the following processing steps: (i) moving the raw data from the first to the second PC; (ii) filtering the data for running a 'Rex Allen'-like picker for P waves on each elementary file; (iii) concatenating three consecutive elementary files if the detection criteria are fulfilled; (iv) decoding a fast time code (Lennartz-style); (v) discriminating between local and teleseismic events; (vi) plane-wave method location and mb determination for teleseisms; (vii) picking S waves, determining coda duration and locating local events; (viii) conversion of PC-SUDS into GSE format and 'feeding' a Data Request Manager with phases, locations and waveforms; (ix) sending phases and location, via e-mail, minutes after detection, and a 'health status' every hour, to the system manager; (x) plotting the raw data, the picks and printing the location results; and (xi) archiving data and results locally and on a remote workstation. The system has been running since April 1994 with data from the telemetered network of the Upper Rhinegraben. Being modular, the system can be extended and upgraded easily. Loss of data is avoided by using large hard disks as temporary data buffers and file mirroring on different hard disk drives.
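Allen-style P-wave pickers are typically built on an STA/LTA (short-term average over long-term average) trigger. A minimal sketch of that idea (window lengths and threshold are illustrative, not the values used in SAPS):

```python
def sta_lta_pick(trace, sta_len=20, lta_len=200, threshold=4.0):
    """Return the index of the first sample whose short-term / long-term
    average energy ratio exceeds the trigger threshold (None otherwise)."""
    energy = [v * v for v in trace]
    for i in range(lta_len, len(trace)):
        sta = sum(energy[i - sta_len:i]) / sta_len   # short-term average
        lta = sum(energy[i - lta_len:i]) / lta_len   # long-term average
        if lta > 0 and sta / lta > threshold:
            return i
    return None

# A quiet background followed by a strong arrival triggers just after onset.
trace = [0.01] * 300 + [1.0] * 100
print(sta_lta_pick(trace))  # → 301
```

Production pickers refine this with recursive averages, characteristic functions, and de-triggering logic, but the ratio test above is the core detection step.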

  14. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

Jackson, G.P.; Elliott, A.

    1992-01-01

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  15. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

    Jackson, G.P.; Elliott, A.

    1992-12-31

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  16. Isolating Intrinsic Processing Disorders from Second Language Acquisition.

    ERIC Educational Resources Information Center

    Lock, Robin H.; Layton, Carol A.

    2002-01-01

    Evaluation of the validity of the Learning Disabilities Diagnostic Inventory with limited-English-proficient (LEP) students in grades 2-7 found that nondisabled LEP students were over-identified as having intrinsic processing deficits. Examination of individual student protocols highlighted the need to train teacher-raters in language acquisition…

  17. Accelerating COTS Middleware Acquisition: The i-Mate Process

    SciTech Connect

    Liu, Anna; Gorton, Ian

    2003-03-05

    Most major organizations now use some commercial-off-the-shelf middleware components to run their businesses. Key drivers behind this growth include ever-increasing Internet usage and the ongoing need to integrate heterogeneous legacy systems to streamline business processes. As organizations do more business online, they need scalable, high-performance software infrastructures to handle transactions and provide access to core systems.

  18. Stable image acquisition for mobile image processing applications

    NASA Astrophysics Data System (ADS)

    Henning, Kai-Fabian; Fritze, Alexander; Gillich, Eugen; Mönks, Uwe; Lohweg, Volker

    2015-02-01

Today, mobile devices (smartphones, tablets, etc.) are widespread and of high importance for their users. Their performance as well as their versatility increases over time, creating the opportunity to use such devices for more specific tasks like image processing in an industrial context. For the analysis of images, requirements like image quality (blur, illumination, etc.) as well as a defined relative position of the object to be inspected are crucial. Since mobile devices are handheld and used in constantly changing environments, the challenge is to fulfill these requirements. We present an approach to overcome these obstacles and stabilize the image capturing process such that image analysis on mobile devices is significantly improved. To this end, image processing methods are combined with sensor fusion concepts. The approach consists of three main parts. First, pose estimation methods are used to guide the user in moving the device to a defined position. Second, the sensor data and the pose information are combined for relative motion estimation. Finally, the image capturing process is automated: it is triggered depending on the alignment of the device and the object, as well as on the image quality that can be achieved under consideration of motion and environmental effects.
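The trigger logic the abstract describes — capture only when the device is aligned with the object, is held steady, and can deliver sufficient image quality — can be sketched as a simple decision function. All names and threshold values below are illustrative assumptions, not details from the paper:

```python
def should_capture(pose_offset_mm, angle_offset_deg, motion_rad_s, sharpness,
                   pos_tol=5.0, ang_tol=2.0, motion_max=0.05, sharp_min=100.0):
    """Decide whether to auto-trigger the camera.

    pose_offset_mm / angle_offset_deg : deviation from the defined target pose
    motion_rad_s : device motion magnitude, e.g. from fused gyro/accelerometer data
    sharpness    : a focus measure, e.g. variance of the image Laplacian
    All thresholds are assumed example values.
    """
    aligned = pose_offset_mm <= pos_tol and angle_offset_deg <= ang_tol
    steady = motion_rad_s <= motion_max    # relative-motion estimate is low enough
    sharp = sharpness >= sharp_min         # achievable image quality is acceptable
    return aligned and steady and sharp
```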

  19. Automated collection and processing of environmental samples

    DOEpatents

    Troyer, Gary L.; McNeece, Susan G.; Brayton, Darryl D.; Panesar, Amardip K.

    1997-01-01

    For monitoring an environmental parameter such as the level of nuclear radiation, at distributed sites, bar coded sample collectors are deployed and their codes are read using a portable data entry unit that also records the time of deployment. The time and collector identity are cross referenced in memory in the portable unit. Similarly, when later recovering the collector for testing, the code is again read and the time of collection is stored as indexed to the sample collector, or to a further bar code, for example as provided on a container for the sample. The identity of the operator can also be encoded and stored. After deploying and/or recovering the sample collectors, the data is transmitted to a base processor. The samples are tested, preferably using a test unit coupled to the base processor, and again the time is recorded. The base processor computes the level of radiation at the site during exposure of the sample collector, using the detected radiation level of the sample, the delay between recovery and testing, the duration of exposure and the half life of the isotopes collected. In one embodiment, an identity code and a site code are optically read by an image grabber coupled to the portable data entry unit.
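The back-calculation the base processor performs — decay-correcting the measured activity for the recovery-to-test delay, then accounting for buildup over the exposure duration given the isotope half-life — can be sketched as follows. The function name, the unit collection-efficiency constant, and the constant-field buildup model are illustrative assumptions, not details from the patent:

```python
import math

def site_level(measured_activity, delay, exposure, half_life, collection_eff=1.0):
    """Estimate the average radiation level at the site during exposure.

    measured_activity : activity counted at test time
    delay             : time between collector recovery and testing
    exposure          : duration the collector was deployed
    half_life         : half-life of the collected isotope (same time units throughout)
    collection_eff    : hypothetical collector-specific proportionality constant
    """
    lam = math.log(2) / half_life
    # Decay-correct the measurement back to the moment of recovery.
    activity_at_recovery = measured_activity * math.exp(lam * delay)
    # Assuming a constant ambient level R, activity builds up during deployment
    # as N(T) = c * R * (1 - exp(-lam*T)) / lam; invert for R.
    return lam * activity_at_recovery / (collection_eff * (1.0 - math.exp(-lam * exposure)))
```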

  20. A dual process account of coarticulation in motor skill acquisition.

    PubMed

    Shah, Ashvin; Barto, Andrew G; Fagg, Andrew H

    2013-01-01

Many tasks, such as typing a password, are decomposed into a sequence of subtasks that can be accomplished in many ways. Behavior that accomplishes subtasks in ways that are influenced by the overall task is often described as "skilled" and exhibits coarticulation. Many accounts of coarticulation use search methods that are informed by representations of the objectives that define skilled behavior. While they aid in describing the strategies the nervous system may follow, they are computationally complex and may be difficult to attribute to brain structures. Here, we present a biologically-inspired account whereby skilled behavior is developed through 2 simple processes: (a) a corrective process that ensures that each subtask is accomplished, but does not do so skillfully, and (b) a reinforcement learning process that finds better movements using trial-and-error search that is not informed by representations of any objectives. We implement our account as a computational model controlling a simulated two-armed kinematic "robot" that must hit a sequence of goals with its hands. Behavior displays coarticulation in terms of which hand was chosen, how the corresponding arm was used, and how the other arm was used, suggesting that the account can participate in the development of skilled behavior. PMID:24116847

  1. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  2. [An image acquisition & processing system of the wireless endoscope based on DSP].

    PubMed

    Zhang, Jin-hua; Peng, Cheng-lin; Zhao, De-chun; Yang-Li

    2006-07-01

This paper covers an image acquisition and processing system for the capsule-style endoscope. Images sent by the endoscope are compressed and encoded with a digital signal processor (DSP), and the data are saved to the hard disk of a PC for analysis and processing in the image browser workstation. PMID:17039927

  3. Data acquisition and online processing requirements for experimentation at the Superconducting Super Collider

    SciTech Connect

    Lankford, A.J.; Barsotti, E.; Gaines, I.

    1989-07-01

    Differences in scale between data acquisition and online processing requirements for detectors at the Superconducting Super Collider and systems for existing large detectors will require new architectures and technological advances in these systems. Emerging technologies will be employed for data transfer, processing, and recording. 9 refs., 3 figs.

  4. Population Processes Sampled at Random Times

    NASA Astrophysics Data System (ADS)

    Beghin, Luisa; Orsingher, Enzo

    2016-04-01

    In this paper we study the iterated birth process of which we examine the first-passage time distributions and the hitting probabilities. Furthermore, linear birth processes, linear and sublinear death processes at Poisson times are investigated. In particular, we study the hitting times in all cases and examine their long-range behavior. The time-changed population models considered here display upward (birth process) and downward jumps (death processes) of arbitrary size and, for this reason, can be adopted as adequate models in ecology, epidemics and finance situations, under stress conditions.

  5. Phases of learning: How skill acquisition impacts cognitive processing.

    PubMed

    Tenison, Caitlin; Fincham, Jon M; Anderson, John R

    2016-06-01

This fMRI study examines the changes in participants' information processing as they repeatedly solve the same mathematical problem. We show that the majority of practice-related speedup is produced by discrete changes in cognitive processing. Because the points at which these changes take place vary from problem to problem, and the underlying information processing steps vary in duration, the existence of such discrete changes can be hard to detect. Using two converging approaches, we establish the existence of three learning phases. When solving a problem in one of these learning phases, participants can go through three cognitive stages: Encoding, Solving, and Responding. Each cognitive stage is associated with a unique brain signature. Using a bottom-up approach combining multi-voxel pattern analysis and hidden semi-Markov modeling, we identify the duration of that stage on any particular trial from participants' brain activation patterns. For our top-down approach we developed an ACT-R model of these cognitive stages and simulated how they change over the course of learning. The Solving stage of the first learning phase is long and involves a sequence of arithmetic computations. Participants transition to the second learning phase when they can retrieve the answer, thereby drastically reducing the duration of the Solving stage. With continued practice, participants then transition to the third learning phase when they recognize the problem as a single unit and produce the answer as an automatic response. The duration of this third learning phase is dominated by the Responding stage. PMID:27018936

  6. Is Children's Acquisition of the Passive a Staged Process? Evidence from Six- and Nine-Year-Olds' Production of Passives

    ERIC Educational Resources Information Center

    Messenger, Katherine; Branigan, Holly P.; McLean, Janet F.

    2012-01-01

    We report a syntactic priming experiment that examined whether children's acquisition of the passive is a staged process, with acquisition of constituent structure preceding acquisition of thematic role mappings. Six-year-olds and nine-year-olds described transitive actions after hearing active and passive prime descriptions involving the same or…

  7. Acquisition times of carrier tracking sampled data phase-locked loops

    NASA Technical Reports Server (NTRS)

    Aguirre, S.

    1986-01-01

Phase acquisition times of type II and type III loops typical of the Advanced Receiver are studied by computer simulation when the loops are disturbed by Gaussian noise. Reliable estimates are obtained by running 5000 trials for each combination of loop signal-to-noise ratio (SNR) and frequency offset. The probabilities of acquisition are shown versus time from the start of acquisition for various loop SNRs and frequency offsets. For frequency offsets smaller than one-fourth of the loop bandwidth and for loop SNRs of 10 dB and higher, the loops acquire with probability 0.99 within 2.5/B_L for type II loops and within 7/B_L for type III loops, where B_L is the loop bandwidth.
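The Monte Carlo procedure the abstract describes — repeated noisy trials of a sampled-data loop, with the acquisition probability estimated from the fraction of trials that lock by a deadline — can be sketched for a type II loop as follows. The loop gains, lock criterion, and trial counts are illustrative assumptions, not the Advanced Receiver's parameters:

```python
import math
import random

def acquires(freq_offset, noise_sigma, kp=0.2, ki=0.02, n_samples=2000, rng=random):
    """Simulate one acquisition attempt of a sampled-data type II PLL.

    freq_offset : input frequency offset in rad/sample (assumed units)
    noise_sigma : std-dev of additive phase-detector noise
    Returns True if the wrapped phase error stays small over the final samples.
    """
    phase_err, integ = math.pi / 2, 0.0            # arbitrary initial phase error
    history = []
    for _ in range(n_samples):
        e = math.sin(phase_err) + rng.gauss(0.0, noise_sigma)  # sinusoidal phase detector
        integ += ki * e                            # loop-filter integrator (makes it type II)
        phase_err += freq_offset - (kp * e + integ)
        history.append(abs(math.atan2(math.sin(phase_err), math.cos(phase_err))))
    return max(history[-100:]) < 0.1               # locked over the last 100 samples

def acq_probability(freq_offset, noise_sigma, trials=200, seed=1):
    """Estimate the probability of acquisition from repeated independent trials."""
    rng = random.Random(seed)
    return sum(acquires(freq_offset, noise_sigma, rng=rng) for _ in range(trials)) / trials
```

With noise present, plotting the fraction of locked trials versus the deadline reproduces the kind of probability-versus-time curve the study reports.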

  8. The acquisition of integrated science process skills in a web-based learning environment

    NASA Astrophysics Data System (ADS)

    Saat, Rohaida Mohd.

    2004-01-01

Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among grade 5 children. Data were gathered primarily from children's conversations and teacher-student conversations. Analysis of the data revealed that the children acquired the skill in three phases: from the phase of recognition to the phase of familiarization and finally to the phase of automation. Nevertheless, acquisition involved only certain subskills of the skill of controlling variables. This progression could be influenced by the web-based instructional material, which provided declarative knowledge, concrete visualization and opportunities for practice.

  9. A Future Vision of a Data Acquisition: Distributed Sensing, Processing, and Health Monitoring

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Solano, Wanda; Thurman, Charles; Schmalzel, John

    2000-01-01

This paper presents a vision of a highly enhanced data acquisition and health monitoring system at the NASA Stennis Space Center (SSC) rocket engine test facility. This vision includes the use of advanced processing capabilities in conjunction with highly autonomous distributed sensing and intelligence to monitor and evaluate the health of data in the context of its associated process. This method is expected to significantly reduce data acquisition costs and improve system reliability. A Universal Signal Conditioning Amplifier (USCA) based system, under development at Kennedy Space Center, is being evaluated for adaptation to the SSC testing infrastructure. Kennedy's USCA architecture offers many advantages, including flexible and auto-configuring data acquisition with improved calibration and verifiability. Possible enhancements at SSC may include multiplexing the distributed USCAs to reduce per-channel cost, and the use of IEEE-485 to Allen-Bradley ControlNet gateways for interfacing with the resident control systems.

  10. High throughput sample processing and automated scoring.

    PubMed

    Brunborg, Gunnar; Jackson, Petra; Shaposhnikov, Sergey; Dahl, Hildegunn; Azqueta, Amaya; Collins, Andrew R; Gutzkow, Kristine B

    2014-01-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput (HT) modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to HT are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. HT methods save time and money but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The HT modifications now available vary largely in their versatility, capacity, complexity, and costs. The bottleneck for further increase of throughput appears to be the scoring. PMID:25389434

  11. High throughput sample processing and automated scoring

    PubMed Central

    Brunborg, Gunnar; Jackson, Petra; Shaposhnikov, Sergey; Dahl, Hildegunn; Azqueta, Amaya; Collins, Andrew R.; Gutzkow, Kristine B.

    2014-01-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput (HT) modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to HT are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. HT methods save time and money but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The HT modifications now available vary largely in their versatility, capacity, complexity, and costs. The bottleneck for further increase of throughput appears to be the scoring. PMID:25389434

  12. Quantitative modal determination of geological samples based on X-ray multielemental map acquisition.

    PubMed

    Cossio, Roberto; Borghi, Alessandro; Ruffini, Raffaella

    2002-04-01

Multielemental X-ray maps collected by a remote scanning system of the electron beam are processed by a dedicated software program that performs accurate modal determination of geological samples. The classification of the different mineral phases is based on elemental concentrations. The software program Petromod loads the maps into a database and computes a matrix of numerical values proportional to the elemental concentrations. After an initial calibration, the program can calculate the chemical composition on the basis of a fixed number of oxygens for a selected area. In this way, it is possible to identify all the mineral phases occurring in the sample. Up to three elements can be selected to calculate the modal percentage of the identified mineral. An automated routine scans the whole set of maps and assigns each pixel that satisfies the imposed requirements to the selected phase. Repeating this procedure for every mineral phase occurring in the mapped area yields a modal distribution of the rock-forming minerals. The final output consists of a digitized image, which can be further analyzed by common image analysis software, and a table containing the calculated modal percentages. The method is here applied to a volcanic and a metamorphic rock sample. PMID:12533243
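The pixel-assignment step — testing each pixel's counts for up to three elements against concentration windows and accumulating a modal percentage — might be sketched as below. The function, data layout, and count windows are illustrative assumptions, not Petromod's actual interface:

```python
import numpy as np

def modal_percent(maps, criteria):
    """Assign pixels to one mineral phase and return its modal percentage.

    maps     : dict of element name -> 2-D array of X-ray counts
               (proportional to elemental concentration)
    criteria : dict of up to three elements -> (low, high) count windows
               that together define the phase
    """
    mask = np.ones(next(iter(maps.values())).shape, dtype=bool)
    for element, (lo, hi) in criteria.items():
        # A pixel belongs to the phase only if every selected element
        # falls inside its window.
        mask &= (maps[element] >= lo) & (maps[element] <= hi)
    return mask, 100.0 * mask.sum() / mask.size
```

Running this once per phase and overlaying the masks gives the digitized phase image and the table of modal percentages described in the abstract.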

  13. Effect of Intermittent Reinforcement on Acquisition and Retention in Delayed Matching-to-Sample in Pigeons

    ERIC Educational Resources Information Center

    Grant, Douglas S.

    2011-01-01

    Experiments 1 and 2 involved independent groups that received primary reinforcement after a correct match with a probability of 1.0, 0.50 or 0.25. Correct matches that did not produce primary reinforcement produced a conditioned reinforcer. Both experiments revealed little evidence that acquisition or retention was adversely affected by use of…

  14. Characteristics of Marijuana Acquisition among a National Sample of Adolescent Users

    ERIC Educational Resources Information Center

    King, Keith A.; Merianos, Ashley L.; Vidourek, Rebecca A.

    2016-01-01

    Background: Because marijuana is becoming more accessible and perceived norms of use are becoming increasingly more favorable, research is needed to understand characteristics of marijuana acquisition among adolescents. Purpose: The study purpose was to examine whether sources and locations where adolescent users obtain and use marijuana differed…

  15. Ultimate Attainment in Second Language Acquisition: Near-Native Sentence Processing in Spanish

    ERIC Educational Resources Information Center

    Jegerski, Jill

    2010-01-01

    A study of near-native sentence processing was carried out using the self-paced reading method. Twenty-three near-native speakers of Spanish were identified on the basis of native-like proficiency, age of onset of acquisition after 15 years, and a minimum of three years ongoing residency in Spanish-speaking countries. The sentence comprehension…

  16. Development of a data acquisition and processing system for precision agriculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

A data acquisition and processing system for precision agriculture was developed by using MapX 5.0 and Visual C++ 6.0. This system can be used easily and quickly for drawing grid maps in-field, creating parameters for grid-reorganization, guiding in-field data collection, converting data between diffe...

  17. A Problem-Based Learning Model for Teaching the Instructional Design Business Acquisition Process.

    ERIC Educational Resources Information Center

    Kapp, Karl M.; Phillips, Timothy L.; Wanner, Janice H.

    2002-01-01

    Outlines a conceptual framework for using a problem-based learning model for teaching the Instructional Design Business Acquisition Process. Discusses writing a response to a request for proposal, developing a working prototype, orally presenting the solution, and the impact of problem-based learning on students' perception of their confidence in…

  18. Learning and Individual Differences: An Ability/Information-Processing Framework for Skill Acquisition. Final Report.

    ERIC Educational Resources Information Center

    Ackerman, Phillip L.

    A program of theoretical and empirical research focusing on the ability determinants of individual differences in skill acquisition is reviewed. An integrative framework for information-processing and cognitive ability determinants of skills is reviewed, along with principles for ability-skill relations. Experimental manipulations were used to…

  19. Development of a data acquisition and processing system for precision agriculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

A data acquisition and processing system for precision agriculture was developed by using MapX 5.0 and Visual C++ 6.0. This system can be used easily and quickly for drawing grid maps in-field, creating parameters for grid-reorganization, guiding in-field data collection, converting data between ...

  20. The Processing Cost of Reference Set Computation: Acquisition of Stress Shift and Focus

    ERIC Educational Resources Information Center

    Reinhart, Tanya

    2004-01-01

    Reference set computation -- the construction of a (global) comparison set to determine whether a given derivation is appropriate in context -- comes with a processing cost. I argue that this cost is directly visible at the acquisition stage: In those linguistic areas in which it has been independently established that such computation is indeed…

  1. Processes of Language Acquisition in Children with Autism: Evidence from Preferential Looking

    ERIC Educational Resources Information Center

    Swensen, Lauren D.; Kelley, Elizabeth; Fein, Deborah; Naigles, Letitia R.

    2007-01-01

    Two language acquisition processes (comprehension preceding production of word order, the noun bias) were examined in 2- and 3-year-old children (n=10) with autistic spectrum disorder and in typically developing 21-month-olds (n=13). Intermodal preferential looking was used to assess comprehension of subject-verb-object word order and the tendency…

  2. Experimental studies on remanence acquisition processes and regional geomagnetic field variability from archeointensity studies

    NASA Astrophysics Data System (ADS)

    Mitra, Ritayan

The dissertation comprises two separate topics. Chapters 2 and 3 are experimental studies on remanence acquisition processes. Chapters 4 and 5 investigate geomagnetic field variability in Africa and India between 1000 BCE and 1000 CE. Chapter 2 is a study in which the role of flocculation in sedimentary magnetization is analyzed with the help of laboratory redeposition experiments and a simple numerical model. At small floc sizes DRM acquisition is likely to be non-linear, but it may record directions with higher fidelity. In environments with bigger flocs, the sediments are likely to record either intensities or directions with high fidelity, but not both. Flocculation may also inhibit a large fraction of magnetic grains from contributing to the net remanence, and this might have consequences for intensity normalization in sediments. Chapter 3 presents a fresh perspective on the long-standing debate on the nature of magnetocrystalline anisotropy in Mid-Ocean Ridge Basalts (MORBs). A new parameter, IRAT, defined as the ratio of the isothermal remanences in antiparallel directions, is used to differentiate between uniaxial single-domain grains (IRAT ~1) and multiaxial single-domain grains (IRAT<1). The theoretical predictions were first validated with standard samples, and then multiple MORB samples were analyzed. The observed IRAT ratios indicate a dominant non-uniaxial anisotropy in the MORBs. Chapters 4 and 5 are archeointensity studies from two data-poor regions of the world, viz., Africa and India. With stringent data selection criteria and well-established archeological constraints, these datasets provide important constraints on the field intensity from 1000 BCE to 1000 CE in Africa and 500 BCE to 1000 CE in India. The African dataset has a higher age resolution than the Indian dataset. The African dataset matches well with the global CALS3k.4 model and shows significant non-axial-dipolar contribution in the region. The Indian dataset is not of a similar
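The IRAT parameter from Chapter 3 reduces to a simple ratio of two remanence measurements; a minimal sketch, with an illustrative classification threshold that is not taken from the dissertation:

```python
def irat(irm_forward, irm_reverse):
    """IRAT: ratio of isothermal remanences acquired in antiparallel directions.

    Values near 1 indicate uniaxial single-domain grains; values below 1
    indicate non-uniaxial (multiaxial) anisotropy. The 0.95 cutoff below
    is an assumed example threshold.
    """
    r = min(irm_forward, irm_reverse) / max(irm_forward, irm_reverse)
    return r, ("uniaxial SD" if r > 0.95 else "non-uniaxial")
```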

  3. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection processes... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 636.602-5 Short selection processes for contracts not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  4. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection processes... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 636.602-5 Short selection processes for contracts not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  5. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1336.602-5 Short selection process... exceed the simplified acquisition threshold, either or both of the short selection processes set out...

  6. Self-organizing Symbol Acquisition and Motion Generation based on Dynamics-based Information Processing System

    NASA Astrophysics Data System (ADS)

    Okada, Masafumi; Nakamura, Daisuke; Nakamura, Yoshihiko

The abilities to acquire and manipulate symbols are among the characteristics that distinguish human beings from other creatures. In this paper, based on the recurrent self-organizing map and a dynamics-based information processing system, we propose a dynamics-based self-organizing map (DBSOM). This method enables the design of a topological map from time-sequence data, which in turn supports recognition and generation of robot motion. Using this method, we design a self-organizing symbol acquisition system and a motion generation system for a humanoid robot. By implementing DBSOM on the robot in the real world, we realize symbol acquisition from the experimental data and investigate the spatial properties of the obtained DBSOM.

  7. Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process

    NASA Technical Reports Server (NTRS)

    Macko, Joseph A., Jr.

    2000-01-01

The United States Army Materiel Command, Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. It has established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources, and prioritizes P2 efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as alternatives to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) during the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process to evaluate new chemicals and materials prior to acceptance as alternatives. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC mission.

  8. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1036.602-5 Short selection process... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process... process....

  9. Performance of a VME-based parallel processing LIDAR data acquisition system (summary)

    SciTech Connect

    Moore, K.; Buttler, B.; Caffrey, M.; Soriano, C.

    1995-05-01

It may be possible to make accurate, real-time, autonomous, 2- and 3-dimensional wind measurements remotely with an elastic backscatter Light Detection and Ranging (LIDAR) system by incorporating digital parallel processing hardware into the data acquisition system. In this paper, we report the performance of a commercially available digital parallel processing system in implementing the maximum correlation technique for wind sensing using actual LIDAR data. Timing and numerical accuracy are benchmarked against a standard microprocessor implementation.
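The maximum correlation technique can be illustrated with a minimal sketch: aerosol backscatter structure in one LIDAR profile is located again in a later profile via cross-correlation, and the best-matching lag gives the displacement between shots. The function name and parameters are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def wind_speed_estimate(profile_t0, profile_t1, bin_size_m, dt_s):
    """Estimate radial wind speed by the maximum correlation technique.

    Aerosol structure in profile_t0 is assumed to reappear in profile_t1
    displaced by the wind; the lag maximising the cross-correlation gives
    the displacement in range bins between the two shots.
    """
    a = profile_t0 - profile_t0.mean()   # remove the mean so correlation
    b = profile_t1 - profile_t1.mean()   # responds to structure, not offset
    corr = np.correlate(b, a, mode="full")
    lag = int(np.argmax(corr)) - (len(a) - 1)   # displacement in range bins
    return lag * bin_size_m / dt_s
```

The parallel-processing aspect of the paper amounts to evaluating many such correlations (one per range gate and direction) concurrently.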

  10. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and their second, for a total of 10 months of instruction. The study aimed to characterize modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps right hemispheric recruitment was observed, with increasing left-lateralization, which is similar to other native signers and L2 learners of spoken language; however, specialization for sign language processing with activation in the inferior parietal lobule (i.e., angular gyrus), even for late learners, was observed. As such, the present study is the first to track L2 acquisition of sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing. PMID:26720258

  11. Advances in diffusion MRI acquisition and processing in the Human Connectome Project.

    PubMed

    Sotiropoulos, Stamatios N; Jbabdi, Saad; Xu, Junqian; Andersson, Jesper L; Moeller, Steen; Auerbach, Edward J; Glasser, Matthew F; Hernandez, Moises; Sapiro, Guillermo; Jenkinson, Mark; Feinberg, David A; Yacoub, Essa; Lenglet, Christophe; Van Essen, David C; Ugurbil, Kamil; Behrens, Timothy E J

    2013-10-15

    The Human Connectome Project (HCP) is a collaborative 5-year effort to map human brain connections and their variability in healthy adults. A consortium of HCP investigators will study a population of 1200 healthy adults using multiple imaging modalities, along with extensive behavioral and genetic data. In this overview, we focus on diffusion MRI (dMRI) and the structural connectivity aspect of the project. We present recent advances in acquisition and processing that allow us to obtain very high-quality in-vivo MRI data, whilst enabling scanning of a very large number of subjects. These advances result from 2 years of intensive efforts in optimising many aspects of data acquisition and processing during the piloting phase of the project. The data quality and methods described here are representative of the datasets and processing pipelines that will be made freely available to the community at quarterly intervals, beginning in 2013. PMID:23702418

  12. Advances in diffusion MRI acquisition and processing in the Human Connectome Project

    PubMed Central

    Sotiropoulos, Stamatios N; Jbabdi, Saad; Xu, Junqian; Andersson, Jesper L; Moeller, Steen; Auerbach, Edward J; Glasser, Matthew F; Hernandez, Moises; Sapiro, Guillermo; Jenkinson, Mark; Feinberg, David A; Yacoub, Essa; Lenglet, Christophe; Van Essen, David C; Ugurbil, Kamil; Behrens, Timothy EJ

    2013-01-01

    The Human Connectome Project (HCP) is a collaborative 5-year effort to map human brain connections and their variability in healthy adults. A consortium of HCP investigators will study a population of 1200 healthy adults using multiple imaging modalities, along with extensive behavioral and genetic data. In this overview, we focus on diffusion MRI (dMRI) and the structural connectivity aspect of the project. We present recent advances in acquisition and processing that allow us to obtain very high-quality in-vivo MRI data, while enabling scanning of a very large number of subjects. These advances result from 2 years of intensive efforts in optimising many aspects of data acquisition and processing during the piloting phase of the project. The data quality and methods described here are representative of the datasets and processing pipelines that will be made freely available to the community at quarterly intervals, beginning in 2013. PMID:23702418

  13. Multibeam Sonar Backscatter Data Acquisition and Processing: Guidelines and Recommendations from the GEOHAB Backscatter Working Group

    NASA Astrophysics Data System (ADS)

    Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.

    2015-12-01

    Backscatter data acquired with multibeam sonars are now commonly used for the remote geological interpretation of the seabed. The systems' hardware, software, and processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, calibration is poor or absent, and processing methods remain only partially understood and documented. A workshop organized at the GeoHab (a community of geoscientists and biologists around the topic of marine habitat mapping) annual meeting in 2013 was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the topics of acquisition, processing and interpretation of data. The GeoHab Backscatter Working Group (BSWG) was subsequently created with the purpose of documenting and synthesizing the state-of-the-art in sensors and techniques available today and proposing methods for best practice in the acquisition and processing of backscatter data. Two years later, the resulting document "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations" was completed [1]. The document provides: An introduction to backscatter measurements by seafloor-mapping sonars; A background on the physical principles of sonar backscatter; A discussion on users' needs from a wide spectrum of community end-users; A review on backscatter measurement; An analysis of best practices in data acquisition; A review of data processing principles with details on present software implementation; and finally A synthesis and key recommendations. This presentation reviews the BSWG mandate, structure, and development of this document. It details the various chapter contents, its recommendations to sonar manufacturers, operators, data processing software developers and end-users, and its implications for the marine geology community.
[1] Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines

  14. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Short selection process... Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  15. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1336.602-5 Short selection process... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section...

  16. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process... CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 736.602-5 Short selection process for procurements not to exceed the simplified acquisition threshold. References to FAR...

  17. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process... Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-5 Short selection process...

  18. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection process... Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  19. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1336.602-5 Short selection process... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section...

  20. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process... CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 736.602-5 Short selection process for procurements not to exceed the simplified acquisition threshold. References to FAR...

  1. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal... AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-5 Short selection process...

  2. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  3. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  4. Modular Automated Processing System (MAPS) for analysis of biological samples.

    SciTech Connect

    Gil, Geun-Cheol; Chirica, Gabriela S.; Fruetel, Julia A.; VanderNoot, Victoria A.; Branda, Steven S.; Schoeniger, Joseph S.; Throckmorton, Daniel J.; Brennan, James S.; Renzi, Ronald F.

    2010-10-01

    We have developed a novel modular automated processing system (MAPS) that enables reliable, high-throughput analysis as well as sample-customized processing. The system comprises a set of independent modules that carry out individual sample-processing functions: cell lysis; protein concentration (based on hydrophobic, ion-exchange, and affinity interactions); interferent depletion; buffer exchange; and enzymatic digestion of proteins of interest. Taking advantage of its unique capacity for enclosed processing of intact bioparticulates (viruses, spores) and complex serum samples, we have used MAPS for analysis of BSL-1 and BSL-2 samples to identify specific protein markers through integration with the portable microChemLab™ and MALDI.

  5. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    SciTech Connect

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B., Jr.; Penaflor, B.G.

    1999-06-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson "raw" data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: Initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: Obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: Provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters.

  6. Distributed real time data processing architecture for the TJ-II data acquisition system

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; Barrera, E.; López, S.; Machón, D.; Vega, J.; Sánchez, E.

    2004-10-01

    This article describes the performance of a new model of architecture that has been developed for the TJ-II data acquisition system in order to increase its real-time data processing capabilities. The current model consists of several PCI eXtensions for Instrumentation (PXI) standard chassis, each one with various digitizers. In this architecture, the data processing capability is restricted to the PXI controller's own performance. The controller must share its CPU resources between the data processing and the data acquisition tasks. In the new model, a distributed data processing architecture has been developed. The solution adds one or more processing cards to each PXI chassis. This way it is possible to plan how to distribute the processing of all acquired signals among the processing cards and the available resources of the PXI controller. This model allows scalability of the system: more or fewer processing cards can be added based on the requirements of the system. The processing algorithms are implemented in LabVIEW (from National Instruments), providing efficient and time-saving application development compared with other solutions.

  7. Characterization of digital signal processing in the DiDAC data acquisition system

    SciTech Connect

    Parson, J.D.; Olivier, T.L.; Habbersett, R.C.; Martin, J.C.; Wilder, M.E.; Jett, J.H. )

    1993-01-01

    A new-generation data acquisition system for flow cytometers has been constructed. This Digital Data Acquisition and Control (DiDAC) system is based on the VME architecture and uses both the standard VME bus and a private bus for system communication and data transfer. At the front end of the system is a free-running 20 MHz ADC. The output of a detector preamp provides the signal for digitization. The digitized waveform is passed to a custom-built digital signal processing circuit that extracts the height, width, and integral of the waveform. Calculation of these parameters is started (and stopped) when the waveform exceeds (and falls below) a preset threshold value. The free-running ADC is specified to have 10-bit accuracy at 25 MHz. The authors have characterized it and compared the results with those obtained using conventional analog signal processing followed by digitization. Comparisons are made between the two approaches in terms of measurement CV, linearity, and other aspects.
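
    The threshold-gated extraction of pulse height, width, and integral described above can be sketched in software. The following function is an illustrative stand-in for the custom DSP circuit, not the DiDAC hardware logic; the function name and the default 50 ns sample period (for a 20 MHz clock) are assumptions:

```python
import numpy as np

def extract_pulse_features(samples, threshold, dt=50e-9):
    """Threshold-gated pulse parameters from a free-running ADC stream.
    dt defaults to 50 ns, i.e. a 20 MHz sample clock.
    Returns None when no sample exceeds the threshold."""
    above = samples > threshold
    if not above.any():
        return None
    idx = np.flatnonzero(above)
    start, stop = idx[0], idx[-1] + 1  # first to last sample above threshold
    pulse = samples[start:stop]
    return {
        "height": float(pulse.max()),          # peak amplitude
        "width": (stop - start) * dt,          # time above threshold (s)
        "integral": float(pulse.sum()) * dt,   # area under the gated pulse
    }
```

    In hardware these three quantities would be accumulated on the fly as samples arrive; the batch formulation above is only for clarity.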

  8. Summary of the activities of the subgroup on data acquisition and processing

    SciTech Connect

    Connolly, P.L.; Doughty, D.C.; Elias, J.E.

    1981-01-01

    A data acquisition and handling subgroup consisting of approximately 20 members met during the 1981 ISABELLE summer study. Discussions were led by members of the BNL ISABELLE Data Acquisition Group (DAG) with lively participation from outside users. Particularly large contributions were made by representatives of BNL experiments 734, 735, and the MPS, as well as the Fermilab Colliding Detector Facility and the SLAC LASS Facility. In contrast to the 1978 study, the subgroup did not divide its activities into investigations of various individual detectors, but instead attempted to review the current state of the art in the data acquisition, trigger processing, and data handling fields. A series of meetings first reviewed individual pieces of the problem, including the status of the Fastbus Project, the Nevis trigger processor, the SLAC 168/E and 3081/E emulators, and efforts within DAG. Additional meetings dealt with the questions involved in specifying and building complete data acquisition systems. For any given problem, a series of possible solutions was proposed by the members of the subgroup. In general, any given solution had both advantages and disadvantages, and there was never any consensus on which approach was best. However, there was agreement that certain problems could only be handled by systems of a given power or greater. What is given here is a review of the various solutions with their associated powers, costs, advantages, and disadvantages.

  9. Digital signal processing and data acquisition employing diode lasers for lidar-hygrometer

    NASA Astrophysics Data System (ADS)

    Naboko, Sergei V.; Pavlov, Lyubomir Y.; Penchev, Stoyan P.; Naboko, Vassily N.; Pencheva, Vasilka H.; Donchev, T.

    2003-11-01

    The paper refers to novel aspects of applying laser radar (LIDAR) to differential absorption spectroscopy and atmospheric gas monitoring, emphasizing the advantages of the class of powerful pulsed laser diodes. The task of determining atmospheric humidity, a major greenhouse gas, imposes measurement demands that match the potential of the acquisition system well. The system is designed around transferring operations to a Digital Signal Processing (DSP) module, allowing preservation of the informative part of the signal through real-time pre-processing, followed by post-processing on a personal computer.

  10. Mobile digital data acquisition and recording system for geoenergy process monitoring and control

    SciTech Connect

    Kimball, K B; Ogden, H C

    1980-12-01

    Three mobile, general purpose data acquisition and recording systems have been built to support geoenergy field experiments. These systems were designed to record and display information from large assortments of sensors used to monitor in-situ combustion recovery or similar experiments. They provide experimenters and operations personnel with easy access to current and past data for evaluation and control of the process, and provide permanent recordings for subsequent detailed analysis. The configurations of these systems and their current capabilities are briefly described.

  11. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering the fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  12. Instrumental improvements and sample preparations that enable reproducible, reliable acquisition of mass spectra from whole bacterial cells

    PubMed Central

    Alusta, Pierre; Buzatu, Dan; Williams, Anna; Cooper, Willie-Mae; Tarasenko, Olga; Dorey, R Cameron; Hall, Reggie; Parker, W Ryan; Wilkes, Jon G

    2015-01-01

    Rationale: Rapid sub-species characterization of pathogens is required for timely responses in outbreak situations. Pyrolysis mass spectrometry (PyMS) has the potential to be used for this purpose. Methods: However, in order to make PyMS practical for traceback applications, certain improvements related to spectrum reproducibility and data acquisition speed were required. The main objectives of this study were to facilitate fast detection (<30 min to analyze 6 samples, including preparation) and sub-species-level bacterial characterization based on pattern recognition of mass spectral fingerprints acquired from whole cells volatilized and ionized at atmospheric pressure. An AccuTOF DART mass spectrometer was re-engineered to permit ionization of low-volatility bacteria by means of Plasma Jet Ionization (PJI), in which an electric discharge, and, by extension, a plasma beam, impinges on sample cells. Results: Instrumental improvements and spectral acquisition methodology are described. Performance of the re-engineered system was assessed using a small challenge set comprised of assorted bacterial isolates differing in identity by varying amounts. In general, the spectral patterns obtained allowed differentiation of all samples tested, including those of the same genus and species but different serotypes. Conclusions: Fluctuations of ±15% in bacterial cell concentrations did not substantially compromise replicate spectra reproducibility. © 2015 National Center for Toxicological Research. Rapid Communications in Mass Spectrometry published by John Wiley & Sons Ltd. PMID:26443394

  13. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    SciTech Connect

    Schickert, Martin

    2015-03-31

    Ultrasonic testing systems using transducer arrays and the SAFT (Synthetic Aperture Focusing Technique) reconstruction allow for imaging the internal structure of concrete elements. At one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting to detect and localize objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized which differ in terms of the measuring and computational effort and the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm which is implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (TFM/FMC), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained at test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS is used for the measurements which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results of the parameters image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of application characteristics of the SAFT variants.

  14. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    NASA Astrophysics Data System (ADS)

    Schickert, Martin

    2015-03-01

    Ultrasonic testing systems using transducer arrays and the SAFT (Synthetic Aperture Focusing Technique) reconstruction allow for imaging the internal structure of concrete elements. At one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting to detect and localize objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized which differ in terms of the measuring and computational effort and the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm which is implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (TFM/FMC), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained at test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS is used for the measurements which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results of the parameters image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of application characteristics of the SAFT variants.
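
    The delay-and-sum principle shared by the single-channel SAFT and TFM/FMC variants compared above can be illustrated in a few lines. This is a minimal, unoptimized sketch under simplifying assumptions (point-like transducers on the surface, homogeneous sound speed), not the FLEXUS implementation; all names and the geometry arguments are hypothetical:

```python
import numpy as np

def saft_reconstruct(traces, tx_pos, rx_pos, grid_x, grid_z, c, fs):
    """Delay-and-sum SAFT/TFM image: traces[k] is the A-scan recorded with
    transmitter at surface position tx_pos[k] and receiver at rx_pos[k];
    each image point sums the trace samples at the round-trip travel time."""
    image = np.zeros((len(grid_z), len(grid_x)))
    for trace, tx, rx in zip(traces, tx_pos, rx_pos):
        for iz, z in enumerate(grid_z):
            for ix, x in enumerate(grid_x):
                # two-way path: transmitter -> (x, z) -> receiver
                t = (np.hypot(x - tx, z) + np.hypot(x - rx, z)) / c
                n = int(round(t * fs))
                if n < len(trace):
                    image[iz, ix] += trace[n]
    return image
```

    Single-channel SAFT corresponds to tx_pos == rx_pos for every trace, while FMC/TFM supplies every transmitter-receiver pair; production implementations vectorize or precompute the delay tables instead of looping.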

  15. Valve For Extracting Samples From A Process Stream

    NASA Technical Reports Server (NTRS)

    Callahan, Dave

    1995-01-01

    Valve for extracting samples from process stream includes cylindrical body bolted to pipe that contains stream. Opening in valve body matched and sealed against opening in pipe. Used to sample process streams in variety of facilities, including cement plants, plants that manufacture and reprocess plastics, oil refineries, and pipelines.

  16. Revised sampling campaigns to provide sludge for treatment process testing

    SciTech Connect

    PETERSEN, C.A.

    1999-02-18

    The purpose of this document is to review the impact on the sludge sampling campaigns planned for FY 1999 given the recent decision to delete any further sludge sampling in the K West Basin. Requirements for sludge sample material for sludge treatment process testing are reviewed. Options are discussed for obtaining the volume of sample material required, and an optimized plan for obtaining this sludge is summarized.

  17. The Earthscope USArray Array Network Facility (ANF): Evolution of Data Acquisition, Processing, and Storage Systems

    NASA Astrophysics Data System (ADS)

    Davis, G. A.; Battistuz, B.; Foley, S.; Vernon, F. L.; Eakins, J. A.

    2009-12-01

    Since April 2004 the Earthscope USArray Transportable Array (TA) network has grown to over 400 broadband seismic stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. In total, over 1.7 terabytes per year of 24-bit, 40 samples-per-second seismic and state of health data is recorded from the stations. The ANF provides analysts access to real-time and archived data, as well as state-of-health data, metadata, and interactive tools for station engineers and the public via a website. Additional processing and recovery of missing data from on-site recorders (balers) at the stations is performed before the final data is transmitted to the IRIS Data Management Center (DMC). Assembly of the final data set requires additional storage and processing capabilities to combine the real-time data with baler data. The infrastructure supporting these diverse computational and storage needs currently consists of twelve virtualized Sun Solaris Zones executing on nine physical server systems. The servers are protected against failure by redundant power, storage, and networking connections. Storage needs are met by a hybrid iSCSI and Fiber Channel Storage Area Network (SAN) with access to over 40 terabytes of RAID 5 and 6 storage. Processing tasks are assigned to systems based on parallelization and floating-point calculation needs. On-site buffering at the data-loggers provides protection in case of short-term network or hardware problems, while backup acquisition systems at the San Diego Supercomputer Center and the DMC protect against catastrophic failure of the primary site. Configuration management and monitoring of these systems is accomplished with open-source (Cfengine, Nagios, Solaris Community Software) and commercial tools (Intermapper). In the evolution from a single server to multiple virtualized server instances, Sun Cluster software was evaluated and found to be unstable in our environment. Shared filesystem

  18. Hardware System for Real-Time EMG Signal Acquisition and Separation Processing during Electrical Stimulation.

    PubMed

    Hsueh, Ya-Hsin; Yin, Chieh; Chen, Yan-Hong

    2015-09-01

    The study aimed to develop a real-time electromyography (EMG) signal acquisition and processing device that can acquire signals during electrical stimulation. Since the electrical stimulation output can affect EMG signal acquisition, integrating the two elements into one system requires modifying the EMG signal transmission and processing method. The whole system was designed in a user-friendly and flexible manner. For EMG signal processing, the system applied an Altera Field Programmable Gate Array (FPGA) as its core to instantly process the hybrid real-time EMG signal and output the isolated signal in a highly efficient way. The system used the power spectral density to evaluate the accuracy of signal processing, and the cross-correlation showed that the delay of real-time processing was only 250 μs. PMID:26210898
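
    The cross-correlation latency check mentioned in the abstract can be sketched as an offline computation: the lag at which the cross-correlation between the input and the processed output peaks is the processing delay. This is an illustrative sketch (the function name and arguments are assumptions), not the FPGA firmware:

```python
import numpy as np

def estimate_delay(reference, delayed, fs):
    """Estimate the lag (in seconds) of `delayed` relative to `reference`
    from the peak of the full cross-correlation. fs is the sample rate (Hz)."""
    xcorr = np.correlate(delayed, reference, mode="full")
    # in 'full' mode, output index (len(reference) - 1) corresponds to zero lag
    lag_samples = int(np.argmax(xcorr)) - (len(reference) - 1)
    return lag_samples / fs
```

    With the stated 250 μs delay and, say, a 40 kHz sample rate, the peak would sit 10 samples away from the zero-lag index.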

  19. Age Effects on the Process of L2 Acquisition? Evidence from the Acquisition of Negation and Finiteness in L2 German

    ERIC Educational Resources Information Center

    Dimroth, Christine

    2008-01-01

    It is widely assumed that ultimate attainment in adult second language (L2) learners often differs quite radically from ultimate attainment in child L2 learners. This article addresses the question of whether learners at different ages also show qualitative differences in the process of L2 acquisition. Longitudinal production data from two…

  20. APNEA list mode data acquisition and real-time event processing

    SciTech Connect

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
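
    The post-processing step of re-binning List Mode timestamps into trigger-relative time bins can be sketched as below. The function and its arguments are hypothetical stand-ins for the actual APNEA analysis tools, shown only to illustrate the idea of choosing bin edges after the fact:

```python
def bin_events(event_times, trigger_times, bin_edges):
    """Count list-mode events into time bins relative to each trigger.
    bin_edges are offsets (s) from the trigger; returns one count per bin,
    accumulated over all triggers."""
    counts = [0] * (len(bin_edges) - 1)
    for trig in trigger_times:
        for t in event_times:
            dt = t - trig  # event time relative to this trigger
            for i in range(len(bin_edges) - 1):
                if bin_edges[i] <= dt < bin_edges[i + 1]:
                    counts[i] += 1
                    break
    return counts
```

    Because the raw timestamps are kept, the same recording can be re-binned with different edges to match different assay requirements, exactly the flexibility the abstract attributes to List Mode data.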

  1. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, and few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination in SWM is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was applied to the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the likelihood of operational problems, provides advice and suggests solutions. PMID:16941992
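
    The fault-tree logic diagrams the abstract relies on can be evaluated bottom-up with a few lines of code. This is a minimal sketch of the general technique; the node representation and the landfill event names in the example are assumptions for illustration, not part of the LOMA system:

```python
def evaluate_fault_tree(node, events):
    """Evaluate a fault tree bottom-up. A node is either a basic-event name
    (string) or a gate: a tuple ("AND" | "OR", [children]). `events` maps
    basic-event names to True/False; missing events default to False."""
    if isinstance(node, str):
        return events.get(node, False)
    gate, children = node
    results = [evaluate_fault_tree(child, events) for child in children]
    return all(results) if gate == "AND" else any(results)
```

    A KE would encode each chain of causes as such a tree; the top event fires when any complete causal path to it is active.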

  2. Automated system for acquisition and image processing for the control and monitoring boned nopal

    NASA Astrophysics Data System (ADS)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for acquisition and image processing to control the removal of thorns from nopal (Opuntia ficus-indica) in an automated machine that uses pulses from an Nd:YAG laser. The areolas, the areas where thorns grow on the bark of the nopal, are located by applying segmentation algorithms to the images obtained by a CCD. Once the position of the areolas is known, coordinates are sent to a motor system that steers the laser to each areola and removes the thorns from the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs acquisition, preprocessing, segmentation, recognition and interpretation of the areolas. The system identifies the areolas and generates a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
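    The segmentation-to-coordinates step can be sketched as thresholding followed by connected-component labeling, with the centroid of each bright region taken as an areola coordinate. This is a generic sketch of that standard technique, not the algorithm used in the paper; the threshold value and image are placeholders:

```python
from collections import deque

def find_blobs(image, threshold):
    """Threshold a grayscale image (list of lists of intensities) and
    return the centroid (row, col) of each connected bright region,
    using 4-connectivity and breadth-first flood fill."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:                       # flood-fill one blob
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                centroids.append((sum(p[0] for p in pixels) / len(pixels),
                                  sum(p[1] for p in pixels) / len(pixels)))
    return centroids
```

    The returned centroid table is exactly the kind of coordinate list that would be handed to a galvo controller for laser targeting.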

  3. Autonomous Closed-Loop Tasking, Acquisition, Processing, and Evaluation for Situational Awareness Feedback

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Dan; Cappelaere, Pat

    2016-01-01

    This presentation describes the closed-loop satellite autonomy methods used to connect users and the assets on Earth Orbiter-1 (EO-1) and similar satellites. The base layer is a distributed architecture based on the Goddard Mission Services Evolution Concept (GMSEC), so each asset remains under independent control. Situational awareness is provided by a middleware layer through a common Application Programming Interface (API) to GMSEC components developed at GSFC. Users set up their own tasking requests and receive views into immediate past acquisitions in their area of interest, and into future acquisition feasibilities across all assets. Automated notifications via pubsub feeds are returned to users containing published links to image footprints, algorithm results, and full data sets. Theme-based algorithms are available on demand for processing.

  4. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  5. On the Contrastive Analysis of Features in Second Language Acquisition: Uninterpretable Gender on Past Participles in English-French Processing

    ERIC Educational Resources Information Center

    Dekydtspotter, Laurent; Renaud, Claire

    2009-01-01

    Lardiere's discussion raises important questions about the use of features in second language (L2) acquisition. This response examines predictions for processing of a feature-valuing model vs. a frequency-sensitive, associative model in explaining the acquisition of French past participle agreement. Results from a reading-time experiment support…

  6. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1036.602-5 Short selection...

  7. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5... CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 736.602-5...

  8. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
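    The three-condition sequence above amounts to a threshold-driven state machine: each protocol stage waits for its expected conductivity condition before advancing. A minimal sketch, where the stage names and threshold values are illustrative placeholders rather than values from the apparatus:

```python
def next_stage(stage, conductivity_uS):
    """Advance the desalting protocol when the expected conductivity
    condition is reached; otherwise remain in the current stage.
    Thresholds (microsiemens/cm) are hypothetical."""
    LOW, HIGH = 10.0, 100.0
    transitions = {
        # stage      -> (next stage, condition that must be reached)
        "neutralize": ("acidify", lambda c: c < LOW),   # washed to low conductivity
        "acidify":    ("basify",  lambda c: c > HIGH),  # strong acid: high conductivity
        "basify":     ("done",    lambda c: c > HIGH),  # high pH: high conductivity
    }
    if stage in transitions:
        nxt, reached = transitions[stage]
        if reached(conductivity_uS):
            return nxt
    return stage
```

    In an automated system this function would be polled with each new probe reading, so the conductivity measurement itself sequences the protocol, as the abstract describes.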

  9. Sample Handling and Processing on Mars for Future Astrobiology Missions

    NASA Technical Reports Server (NTRS)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low-concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments consumes time, resources and manpower in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, the Automated Sample Processing System (ASPS), receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species and delivers sample to multiple instruments for analysis (including analysis of non-organic soluble species).

  10. Fault recognition depending on seismic acquisition and processing for application to geothermal exploration

    NASA Astrophysics Data System (ADS)

    Buness, H.; von Hartmann, H.; Rumpel, H.; Krawczyk, C. M.; Schulz, R.

    2011-12-01

    Fault systems offer a large potential for deep hydrothermal energy extraction. Most of the existing and planned projects rely on enhanced permeability assumed to be connected with them. The target depth of hydrothermal exploration in Germany is on the order of 3-5 km to ensure economic operation despite moderate temperature gradients. 3D seismics is the most appropriate geophysical method to image fault systems at these depths, but also one of the most expensive. It constitutes a significant part of the total project costs, so its application was (and is) discussed. Cost reduction can in principle be achieved by sparse acquisition. However, the decreased fold inevitably leads to a decreased S/N ratio. To overcome this problem, the application of the CRS (Common Reflection Surface) method has been proposed. The stacking operator of the CRS method inherently includes more traces than the conventional NMO/DMO stacking operator, and hence a better S/N ratio can be achieved. We tested this approach using existing 3D seismic datasets from the two most important hydrothermal provinces in Germany, the Upper Rhine Graben (URG) and the German Molasse Basin (GMB). To simulate a sparse acquisition, we reduced the amount of data to a quarter and to a half, respectively, and reprocessed the data, including new velocity analysis and residual static corrections. In the URG, the utilization of the variance cube as the basis for a horizon-bound window amplitude analysis was successful for the detection of small faults, which would hardly be recognized in seismic sections. In both regions, CRS processing undoubtedly improved the imaging of small faults in the complete as well as the reduced versions of the datasets. However, CRS processing could not compensate for the loss of resolution due to the reduction associated with the simulated sparse acquisition, and hence smaller faults became undetectable. The decision for a sparse acquisition therefore depends on the scope of the survey.
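    The fold/noise trade-off behind sparse acquisition follows from stacking statistics: for uncorrelated noise, S/N grows roughly as the square root of the fold, so quartering the data costs about a factor of two in S/N. A small Monte-Carlo illustration of that rule (all values synthetic):

```python
import math
import random

def stacked_snr(fold, signal=1.0, noise_sigma=1.0, trials=2000, seed=1):
    """Estimate the S/N of a stack of `fold` traces, each carrying the
    same signal amplitude plus independent Gaussian noise. Returns the
    mean of the stacked amplitude divided by its standard deviation."""
    rng = random.Random(seed)
    stacks = []
    for _ in range(trials):
        stacks.append(sum(signal + rng.gauss(0.0, noise_sigma)
                          for _ in range(fold)) / fold)
    mean = sum(stacks) / trials
    var = sum((s - mean) ** 2 for s in stacks) / trials
    return mean / math.sqrt(var)

# Quartering the fold (40 -> 10 traces) should roughly halve the S/N,
# consistent with the sqrt(fold) scaling.
```

    This is why CRS stacking helps: by including more traces in the stacking operator it partially restores the fold lost to sparse acquisition, though, as the study found, not the lost resolution.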

  11. Processing strategies and software solutions for data-independent acquisition in mass spectrometry.

    PubMed

    Bilbao, Aivett; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Hopfgartner, Gérard; Müller, Markus; Lisacek, Frédérique

    2015-03-01

    Data-independent acquisition (DIA) offers several advantages over data-dependent acquisition (DDA) schemes for characterizing complex protein digests analyzed by LC-MS/MS. In contrast to the sequential detection, selection, and analysis of individual ions during DDA, DIA systematically parallelizes the fragmentation of all detectable ions within a wide m/z range regardless of intensity, thereby providing broader dynamic range of detected signals, improved reproducibility for identification, better sensitivity, and accuracy for quantification, and, potentially, enhanced proteome coverage. To fully exploit these advantages, composite or multiplexed fragment ion spectra generated by DIA require more elaborate processing algorithms compared to DDA. This review examines different DIA schemes and, in particular, discusses the concepts applied to and related to data processing. Available software implementations for identification and quantification are presented as comprehensively as possible and examples of software usage are cited. Processing workflows, including complete proprietary frameworks or combinations of modules from different open source data processing packages are described and compared in terms of software availability and usability, programming language, operating system support, input/output data formats, as well as the main principles employed in the algorithms used for identification and quantification. This comparative study concludes with further discussion of current limitations and expectable improvements in the short- and midterm future. PMID:25430050

  12. How to crack nuts: acquisition process in captive chimpanzees (Pan troglodytes) observing a model.

    PubMed

    Hirata, Satoshi; Morimura, Naruki; Houki, Chiharu

    2009-10-01

    Stone tool use for nut cracking consists of placing a hard-shelled nut onto a stone anvil and then cracking the shell open by pounding it with a stone hammer to get to the kernel. We investigated the acquisition of tool use for nut cracking in a group of captive chimpanzees to clarify what kind of understanding of the tools and actions will lead to the acquisition of this type of tool use in the presence of a skilled model. A human experimenter trained a male chimpanzee until he mastered the use of a hammer and anvil stone to crack open macadamia nuts. He was then put in a nut-cracking situation together with his group mates, who were naïve to this tool use; we did not have a control group without a model. The results showed that the process of acquisition could be broken down into several steps, including recognition of applying pressure to the nut, emergence of the use of a combination of three objects, emergence of the hitting action, using a tool for hitting, and hitting the nut. The chimpanzees recognized these different components separately and practiced them one after another. They gradually united these factors in their behavior, leading to their first success. Their behavior did not clearly improve immediately after observing successful nut cracking by a peer, but observation of a skilled group member seemed to have a gradual, long-term influence on the acquisition of nut cracking by naïve chimpanzees. PMID:19727866

  13. Memory acquisition and retrieval impact different epigenetic processes that regulate gene expression

    PubMed Central

    2015-01-01

    Background A fundamental question in neuroscience is how memories are stored and retrieved in the brain. Long-term memory formation requires transcription, translation and epigenetic processes that control gene expression. Thus, characterizing genome-wide the transcriptional changes that occur after memory acquisition and retrieval is of broad interest and importance. Genome-wide technologies are commonly used to interrogate transcriptional changes in discovery-based approaches. Their ability to increase scientific insight beyond traditional candidate gene approaches, however, is usually hindered by batch effects and other sources of unwanted variation, which are particularly hard to control in the study of brain and behavior. Results We examined genome-wide gene expression after contextual conditioning in the mouse hippocampus, a brain region essential for learning and memory, at all the time-points in which inhibiting transcription has been shown to impair memory formation. We show that most of the variance in gene expression is not due to conditioning and that by removing unwanted variance through additional normalization we are able to provide novel biological insights. In particular, we show that genes downregulated by memory acquisition and retrieval impact different functions: chromatin assembly and RNA processing, respectively. Levels of histone 2A variant H2AB are reduced only following acquisition, a finding we confirmed using quantitative proteomics. On the other hand, splicing factor Rbfox1 and NMDA receptor-dependent microRNA miR-219 are only downregulated after retrieval, accompanied by an increase in protein levels of miR-219 target CAMKIIγ. Conclusions We provide a thorough characterization of coding and non-coding gene expression during long-term memory formation. We demonstrate that unwanted variance dominates the signal in transcriptional studies of learning and memory and introduce the removal of unwanted variance through normalization as a

  14. Double Shell Tank (DST) Process Waste Sampling Subsystem Specification

    SciTech Connect

    RASMUSSEN, J.H.

    2000-05-03

    This specification establishes the performance requirements and provides references to the requisite codes and standards to be applied to the Double-Shell Tank (DST) Process Waste Sampling Subsystem which supports the first phase of Waste Feed Delivery.

  15. Monitoring of Extraction Efficiency by a Sample Process Control Virus Added Immediately Upon Sample Receipt.

    PubMed

    Ruhanya, Vurayai; Diez-Valcarce, Marta; D'Agostino, Martin; Cook, Nigel; Hernández, Marta; Rodríguez-Lázaro, David

    2015-12-01

    When analysing food samples for enteric viruses, a sample process control virus (SPCV) must be added at the commencement of the analytical procedure, to verify that the analysis has been performed correctly. Samples can on occasion arrive at the laboratory late in the working day or week. The analyst may consequently have insufficient time to commence and complete the complex procedure, and the samples must consequently be stored. To maintain the validity of the analytical result, it will be necessary to consider storage as part of the process, and the analytical procedure as commencing on sample receipt. The aim of this study was to verify that an SPCV can be recovered after sample storage, and thus indicate the effective recovery of enteric viruses. Two types of samples (fresh and frozen raspberries) and two types of storage (refrigerated and frozen) were studied using Mengovirus vMC0 as SPCV. SPCV recovery was not significantly different (P > 0.5) regardless of sample type or duration of storage (up to 14 days at -20 °C). Accordingly, samples can be stored without a significant effect on the performance of the analysis. The results of this study should assist the analyst by demonstrating that they can verify that viruses can be extracted from food samples even if samples have been stored. PMID:26297430

  16. Object-oriented programming approach to CCD data acquisition and image processing

    NASA Astrophysics Data System (ADS)

    Naidu, B. Nagaraja; Srinivasan, R.; Shankar, S. Murali

    1997-10-01

    In the recent past, both CCD camera controller hardware and software have witnessed dynamic change to keep pace with astronomers' imaging requirements. Conventional data acquisition software is based on menu-driven programs developed using structured high-level languages in a non-window environment. An application under Windows offers several advantages to the users over the non-window approach, such as multitasking, access to large memory and inter-application communication. Windows also provides many programming facilities to developers, such as device-independent graphics, support for a wide range of input/output devices, menus, icons, and bitmaps. However, programming for the Windows environment under structured programming demands an in-depth knowledge of events, formats, handles and inner workings. The object-oriented approach simplifies the task of programming for Windows by using object windows, which manage the message-processing behavior and insulate the developer from the details of the inner workings of Windows. As a result, a Windows application can be developed in much less time and effort compared to conventional approaches. We have designed and developed easy-to-use CCD data acquisition and processing software under the Microsoft Windows 3.1 operating environment using Object Pascal for Windows. The acquisition software exploits the advantages of objects to provide custom-specific toolboxes that implement different functions of CCD data acquisition and image processing. In this paper the hierarchy of the software structure and various application functions are presented. The flexibility of the software to handle different CCDs, and also a mosaic arrangement, is illustrated.

  17. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    SciTech Connect

    van den Engh, Gerrit J.; Stokdijk, Willem

    1992-01-01

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate.
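    The scheme described above (per-channel FIFOs filled on a common trigger, a bus controller draining the oldest entry from each, and an error check on entry agreement) can be sketched in a few lines. This is an illustrative software model of the patented hardware design, not its implementation; the event-ID check stands in for the error-detection circuit:

```python
from collections import deque

class PulseAcquisition:
    """Software sketch of a digitally synchronized multi-channel
    pulse-acquisition system with per-channel FIFO buffers."""

    def __init__(self, n_channels):
        self.fifos = [deque() for _ in range(n_channels)]
        self.event_id = 0

    def trigger(self, digitized_values):
        """On a trigger, each channel digitizes its pulse and stores
        (event ID, value) in its own FIFO; the shared trigger circuit
        assigns the same ID to every channel's entry."""
        self.event_id += 1
        for fifo, value in zip(self.fifos, digitized_values):
            fifo.append((self.event_id, value))

    def read_event(self):
        """Bus controller: move the oldest entry from each FIFO onto
        the common bus, verifying that all IDs match (error detection)."""
        entries = [fifo.popleft() for fifo in self.fifos]
        if len({eid for eid, _ in entries}) != 1:
            raise RuntimeError("channel desynchronization detected")
        return [value for _, value in entries]
```

    Decoupling digitization (fast, trigger-driven) from readout (slower, bus-paced) through the FIFOs is what lets the real system sustain high event rates with a low error rate.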

  18. Acquisition and Processing of Multi-Fold GPR Data for Characterization of Shallow Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Bradford, J. H.

    2004-05-01

    Most ground-penetrating radar (GPR) data are acquired with a constant transmitter-receiver offset, and investigators often apply little or no processing in generating a subsurface image. This mode of operation can provide useful information, but does not take full advantage of the information the GPR signal can carry. In continuous multi-offset (CMO) mode, one acquires several traces with varying source-receiver separations at each point along the survey. CMO acquisition is analogous to common-midpoint acquisition in exploration seismology and gives rise to improved subsurface characterization through three key features: 1) Processes such as stacking and velocity filtering significantly attenuate coherent and random noise, resulting in subsurface images that are easier to interpret, 2) CMO data enable measurement of vertical and lateral velocity variations, which leads to improved understanding of material distribution and more accurate depth estimates, and 3) CMO data enable observation of reflected wave behaviour (i.e. variations in amplitude and spectrum) at a common reflection point for various travel paths through the subsurface; quantification of these variations can be a valuable tool in material property characterization. Although there are a few examples in the literature, investigators rarely acquire CMO GPR data. This is, in large part, due to the fact that CMO acquisition with a single-channel system is labor intensive and time consuming. At present, no multi-channel GPR systems designed for CMO acquisition are commercially available. Over the past 8 years I have designed, conducted, and processed numerous 2D and 3D CMO GPR surveys using a single-channel GPR system. I have developed field procedures that enable a three-man crew to acquire CMO GPR data at a rate comparable to a similar-scale multi-channel seismic reflection survey. Additionally, many recent advances in signal processing developed in the oil and gas industry have yet to see significant
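    The velocity and depth benefit of multi-offset data (feature 2 above) rests on the hyperbolic moveout relation t(x) = sqrt(t0² + x²/v²): with traveltimes at several offsets, the velocity, and hence depth, can be fit. A minimal sketch of that standard analysis, with illustrative numbers rather than survey data:

```python
import math

def moveout_time(t0, offset, velocity):
    """Two-way traveltime to a flat reflector with zero-offset time t0,
    for a given source-receiver offset (hyperbolic NMO equation)."""
    return math.sqrt(t0 ** 2 + (offset / velocity) ** 2)

def estimate_velocity(t0, offsets, times):
    """Least-squares velocity from t^2 = t0^2 + x^2 / v^2: the slope of
    (t^2 - t0^2) versus x^2 equals 1/v^2 for a line through the origin."""
    num = sum(x ** 2 * (t ** 2 - t0 ** 2) for x, t in zip(offsets, times))
    den = sum(x ** 4 for x in offsets)
    return math.sqrt(den / num) if num > 0 else float("inf")
```

    Single-offset data give only one point on the hyperbola, so velocity (and therefore depth) must be assumed; the multi-offset fit is what CMO acquisition buys.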

  19. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    Engh, G.J. van den; Stokdijk, W.

    1992-09-22

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate. 17 figs.

  20. Condom acquisition and preferences within a sample of sexually active gay and bisexual men in the southern United States.

    PubMed

    Rhodes, Scott D; Hergenrather, Kenneth C; Yee, Leland J; Wilkin, Aimee M; Clarke, Thomas L; Wooldredge, Rich; Brown, Monica; Davis, A Bernard

    2007-11-01

    Health departments, community-based organizations (CBOs), and AIDS service organizations (ASOs) in the United States and abroad distribute large quantities of free condoms to sexually active individuals; however, little is known about where individuals who use condoms actually acquire them. This community-based participatory research (CBPR) study was designed to identify factors associated with the use of free condoms during most recent anal intercourse among self-identifying gay and bisexual men who reported condom use. Data were collected using targeted intercept interviewing during North Carolina Pride Festival events in Fall 2006, using the North Carolina Condom Acquisition and Preferences Assessment (NC-CAPA). Of the 606 participants who completed the assessment, 285 met the inclusion criteria. Mean age of participants was 33 (+/-10.8) years. The sample was predominantly white (80%), 50% reported being single or not dating anyone special, and 38% reported the use of free condoms during most recent anal intercourse. In multivariable analysis, participants who reported using free condoms during most recent anal sex were more likely to report increased age; dating someone special or being partnered; and having multiple male sexual partners in the past 3 months. These participants were less likely to report ever having had a sexually transmitted disease. Despite being in the third decade of the HIV epidemic, little is known about condom acquisition among, and condom preferences of, gay and bisexual men who use condoms. Although more research is needed, our findings illustrate the importance of free condom distribution. PMID:18240895

  1. Sample fields of the Viking landers, physical properties, and aeolian processes

    NASA Technical Reports Server (NTRS)

    Moore, H. J.; Spitzer, C. R.; Bradford, K. Z.; Cates, P. M.; Shorthill, R. W.; Hutton, R. E.

    1979-01-01

    Surface sampler activities on Mars during the Viking extended mission are considered, including excavation of deep trenches, construction of conical piles of materials, backhoe touchdown experiments, and acquisition of contiguous pictures of the surface beneath number 2 terminal descent engines using mirrors. Results of the Physical Properties Investigation that are relevant to aeolian processes are also discussed. Both pictures and surface sampler data indicate that the surface materials in the sample fields of the Viking landers may be grouped, in order of increasing strength, into drift material, crusty to cloddy material, blocky material and rocks.

  2. Acquisition of a High Resolution Field Emission Scanning Electron Microscope for the Analysis of Returned Samples

    NASA Technical Reports Server (NTRS)

    Nittler, Larry R.

    2003-01-01

    This grant furnished funds to purchase a state-of-the-art scanning electron microscope (SEM) to support our analytical facilities for extraterrestrial samples. After evaluating several instruments, we purchased a JEOL 6500F thermal field emission SEM with the following analytical accessories: EDAX energy-dispersive x-ray analysis system with fully automated control of instrument and sample stage; EDAX LEXS wavelength-dispersive x-ray spectrometer for high sensitivity light-element analysis; EDAX/TSL electron backscatter diffraction (EBSD) system with software for phase identification and crystal orientation mapping; Robinson backscatter electron detector; and an in situ micro-manipulator (Kleindiek). The total price was $550,000 (with $150,000 of the purchase supported by Carnegie institution matching funds). The microscope was delivered in October 2002, and most of the analytical accessories were installed by January 2003. With the exception of the wavelength spectrometer (which has been undergoing design changes) everything is working well and the SEM is in routine use in our laboratory.

  3. An Integrated Data Acquisition / User Request/ Processing / Delivery System for Airborne Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Chapman, B.; Chu, A.; Tung, W.

    2003-12-01

    Airborne science data has historically played an important role in the development of the scientific underpinnings for spaceborne missions. When the science community determines the need for new types of spaceborne measurements, airborne campaigns are often crucial in risk mitigation for these future missions. However, full exploitation of the acquired data may be difficult due to its experimental and transitory nature. Externally to the project, most problematic (in particular, for those not involved in requesting the data acquisitions) may be the difficulty in searching for, requesting, and receiving the data, or even knowing the data exist. This can result in a rather small, insular community of users for these data sets. Internally, the difficulty for the project is in maintaining a robust processing and archival system during periods of changing mission priorities and evolving technologies. The NASA/JPL Airborne Synthetic Aperture Radar (AIRSAR) has acquired data for a large and varied community of scientists and engineers for 15 years. AIRSAR is presently supporting current NASA Earth Science Enterprise experiments, such as the Soil Moisture EXperiment (SMEX) and the Cold Land Processes experiment (CLPX), as well as experiments conducted as many as 10 years ago. During that time, its processing, data ordering, and data delivery system has undergone evolutionary change as the cost and capability of resources has improved. AIRSAR now has a fully integrated data acquisition/user request/processing/delivery system through which most components of the data fulfillment process communicate via shared information within a database. The integration of these functions has reduced errors and increased throughput of processed data to customers.

  4. A CCD/CMOS process for integrated image acquisition and early vision signal processing

    NASA Astrophysics Data System (ADS)

    Keast, Craig L.; Sodini, Charles G.

    The development of technology which integrates a four-phase, buried-channel CCD in an existing 1.75 micron CMOS process is described. The four-phase clock is employed in the integrated early vision system to minimize process complexity. Signal corruption is minimized and lateral fringing fields are enhanced by burying the channel. The CMOS process for CCD enhancement is described, which highlights a new double-poly process and the buried channel, and the integration is outlined. The functionality and transfer efficiency of the process enhancement were appraised by measuring CCD shift registers at 100 kHz. CMOS measurement results are presented, which include threshold voltages, poly-to-poly capacitor voltage and temperature coefficients, and dark current. A CCD/CMOS processor is described which combines smoothing and segmentation operations. The integration of the CCD and CMOS processes is found to function due to the enhancement-compatible design of the CMOS process and the thorough employment of CCD module baseline process steps.

  5. A multiple process solution to the logical problem of language acquisition*

    PubMed Central

    MACWHINNEY, BRIAN

    2006-01-01

    Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750

  6. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.

  7. Exploitation of realistic computational anthropomorphic phantoms for the optimization of nuclear imaging acquisition and processing protocols.

    PubMed

    Loudos, George K; Papadimitroulas, Panagiotis G; Kagadis, George C

    2014-01-01

    Monte Carlo (MC) simulations play a crucial role in nuclear medical imaging since they can provide the ground truth for clinical acquisitions by integrating and quantifying all physical parameters that affect image quality. Over the last decade, a number of realistic computational anthropomorphic models have been developed to serve imaging as well as other biomedical engineering applications. The combination of MC techniques with realistic computational phantoms can provide a powerful tool for pre- and post-processing in imaging, data analysis, and dosimetry. This work aims to create a global database of simulated Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) exams; the methodology and its first elements are presented. Simulations are performed using the well-validated open-source GATE toolkit, standard anthropomorphic phantoms, and activity distributions of various radiopharmaceuticals derived from the literature. The resulting images, projections, and sinograms of each study are provided in the database and can be further exploited to evaluate processing and reconstruction algorithms. Patient studies with different characteristics are included in the database, and different computational phantoms were tested for the same acquisitions. These include the XCAT, Zubal, and Virtual Family phantoms, some of which are used for the first time in nuclear imaging. The created database will be freely available, and current work is directed toward its extension by simulating additional clinical pathologies. PMID:25570355

  8. Medical Knowledge Base Acquisition: The Role of the Expert Review Process in Disease Profile Construction

    PubMed Central

    Giuse, Nunzia Bettinsoli; Bankowitz, Richard A.; Giuse, Dario A.; Parker, Ronnie C.; Miller, Randolph A.

    1989-01-01

    In order to better understand the knowledge acquisition process, we studied the changes which a newly developed “preliminary” QMR disease profile undergoes during the expert review process. Changes in the ten most recently created disease profiles from the INTERNIST-1/QMR knowledge base were analyzed. We classified the changes which occurred during knowledge base construction by the type of change and the reason for the change. Observed changes to proposed findings could be grouped according to whether a change was needed to maintain consistency with the existing knowledge base, or because of disagreement over knowledge content with the domain expert. Out of 987 total proposed findings in the ten profiles, 233 findings underwent 274 changes, approximately one change for every three proposed findings. A total of 43% of the changes were additions or deletions of findings or links compared to the preliminary disease profile, and 33% of the changes were alterations in the numerical value of the evoking strength or frequency. A total of 126 (46%) of the changes were required to maintain consistency of the knowledge base, whereas the remaining 148 (54%) were made at the suggestion of the domain expert on the basis of domain content. The type of change (consistency vs. domain knowledge) was found to correlate both with the class of finding (newly constructed vs. previously used) and with the experience of the profiler (novice vs. experienced). These differences suggest that some but not all aspects of the disease profiling process can be improved upon with experience. Since it is generally agreed that the construction of a knowledge base depends heavily upon the knowledge acquisition process, this study provides some insight into areas of investigation for others interested in the construction of automated tools to aid the process of knowledge base construction. It also provides support for the observation that knowledge base construction has at least some

  9. Preliminary study of the EChO data sampling and processing

    NASA Astrophysics Data System (ADS)

    Farina, M.; Di Giorgio, A. M.; Focardi, M.; Pace, E.; Micela, G.; Galli, E.; Giusi, G.; Liu, S. J.; Pezzuto, S.

    2014-08-01

    The EChO Payload is an integrated spectrometer with six different channels covering the spectral range from the visible up to the thermal infrared. A common Instrument Control Unit (ICU) implements all the instrument control and health monitoring functionalities as well as all the onboard science data processing. To implement an efficient design of the ICU onboard software, separate analyses of the unit requirements are needed for commanding and housekeeping collection as well as for data acquisition, sampling, and compression. In this work we present the results of the analysis carried out to optimize the EChO data acquisition and processing chain. The HgCdTe detectors used for the EChO mission allow for non-destructive readout modes, in which the accumulated charge can be read without being cleared. These modes can reduce the equivalent readout noise, and the gain in signal-to-noise ratio can be computed using well-known relations based on fundamental principles. In particular, we considered a multiaccumulation approach based on non-destructive reading of detector samples taken at equal time intervals. All detectors are periodically reset after a certain number of samples have been acquired, and the length of the reset interval, as well as the number of samples and the sampling rate, can be adapted to the brightness of the considered source. The estimation of the best set of parameters for signal-to-noise ratio optimization, and of the best sampling technique, also took into account the need to mitigate the expected radiation effects on the acquired data. Cosmic rays can indeed be one of the major sources of data loss for a space observatory, and the studies made for the JWST mission allowed us to evaluate the actual need for a dedicated deglitching procedure on board EChO.
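The multiaccumulation estimate described above amounts to a least-squares slope fit over the equally spaced non-destructive reads. A toy sketch, with illustrative flux and noise numbers rather than actual EChO detector parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_ramp_slope(samples, dt):
    """Least-squares slope (signal rate) from equally spaced non-destructive reads."""
    t = np.arange(len(samples)) * dt
    return np.polyfit(t, samples, 1)[0]

# Toy ramp: 100 e-/s flux, 20 e- RMS read noise, 16 samples taken 1 s apart
flux, sigma_read, n, dt = 100.0, 20.0, 16, 1.0
ramp = flux * np.arange(n) * dt + rng.normal(0.0, sigma_read, n)
rate = fit_ramp_slope(ramp, dt)

# Read-noise-limited slope variance of the OLS fit: 12*sigma^2 / (n*(n^2-1)*dt^2)
sigma_rate = np.sqrt(12.0 * sigma_read**2 / (n * (n**2 - 1) * dt**2))
```

In the read-noise-limited regime, the slope uncertainty falls roughly as n^(-3/2) with the number of reads per ramp, which is the gain the non-destructive modes buy over a single destructive read pair.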

  10. Space processing applications payload equipment study. Volume 2C: Data acquisition and process control

    NASA Technical Reports Server (NTRS)

    Kayton, M.; Smith, A. G.

    1974-01-01

    The services provided by the Spacelab Information Management System are discussed. The majority of the services are provided by the common-support subsystems in the Support Module furnished by the Spacelab manufacturer. The information processing requirements for the space processing applications (SPA) are identified. The requirements and capabilities for electric power, display and control panels, recording and telemetry, intercom, and closed circuit television are analyzed.

  11. A review of breast tomosynthesis. Part I. The image acquisition process

    PubMed Central

    Sechopoulos, Ioannis

    2013-01-01

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have been made possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to replace mammography for breast cancer screening in the future. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper will review the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process. PMID:23298126

  12. A review of breast tomosynthesis. Part I. The image acquisition process

    SciTech Connect

    Sechopoulos, Ioannis

    2013-01-15

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have been made possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to replace mammography for breast cancer screening in the future. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper will review the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process.

  13. Acquisition of material properties in production for sheet metal forming processes

    SciTech Connect

    Heingärtner, Jörg; Hora, Pavel; Neumann, Anja; Hortig, Dirk; Rencki, Yasar

    2013-12-16

    In past work, a measurement system for the in-line acquisition of material properties was developed at IVP. This system is based on the non-destructive eddy-current principle. Using this system, 100% control of the material properties of the processed material is possible. The system can be used for ferromagnetic materials, like standard steels, as well as paramagnetic materials, like aluminum and stainless steel. Used as an in-line measurement system, it can be configured either as a stand-alone system that controls material properties and sorts out unsuitable material, or as part of a control system for the forming process. In both cases, the acquired data can be used as input data for numerical simulations, e.g. stochastic simulations based on real-world data.

  14. Acquisition of material properties in production for sheet metal forming processes

    NASA Astrophysics Data System (ADS)

    Heingärtner, Jörg; Neumann, Anja; Hortig, Dirk; Rencki, Yasar; Hora, Pavel

    2013-12-01

    In past work, a measurement system for the in-line acquisition of material properties was developed at IVP. This system is based on the non-destructive eddy-current principle. Using this system, 100% control of the material properties of the processed material is possible. The system can be used for ferromagnetic materials, like standard steels, as well as paramagnetic materials, like aluminum and stainless steel. Used as an in-line measurement system, it can be configured either as a stand-alone system that controls material properties and sorts out unsuitable material, or as part of a control system for the forming process. In both cases, the acquired data can be used as input data for numerical simulations, e.g. stochastic simulations based on real-world data.

  15. Real-time multi-camera video acquisition and processing platform for ADAS

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio

    2016-04-01

    The paper presents the design of a real-time and low-cost embedded system for image acquisition and processing in Advanced Driver Assistance Systems (ADAS). The system adopts a multi-camera architecture to provide a panoramic view of the objects surrounding the vehicle. Fish-eye lenses are used to achieve a large Field of View (FOV). Since they introduce radial distortion of the images projected on the sensors, a real-time algorithm for their correction is also implemented in a pre-processor. An FPGA-based hardware implementation, re-using IP macrocells for several ADAS algorithms, allows for real-time processing of input streams from VGA automotive CMOS cameras.
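The radial-distortion correction step can be pictured as a precomputed remap table: for every output (undistorted) pixel, look up the source coordinate in the distorted fish-eye frame. The one-parameter polynomial model below is a simplified stand-in for whatever lens model the actual pre-processor uses:

```python
import numpy as np

def build_undistort_map(width, height, k1):
    """Source-coordinate lookup table under a one-parameter radial model
    r_d = r_u * (1 + k1 * r_u**2), radii normalized by the half-diagonal."""
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    norm = np.hypot(cx, cy)
    ys, xs = np.meshgrid(np.arange(height) - cy,
                         np.arange(width) - cx, indexing="ij")
    r_u2 = (xs**2 + ys**2) / norm**2
    scale = 1.0 + k1 * r_u2          # radial gain toward the distorted image
    map_x = cx + xs * scale
    map_y = cy + ys * scale
    return map_x, map_y
```

The maps can be fed to a bilinear resampler on a CPU (e.g. OpenCV's remap); in an FPGA the same tables become a lookup memory feeding the interpolator, which is why precomputing them suits a real-time pipeline.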

  16. Squeezing through the Now-or-Never bottleneck: Reconnecting language processing, acquisition, change, and structure.

    PubMed

    Chater, Nick; Christiansen, Morten H

    2016-01-01

    If human language must be squeezed through a narrow cognitive bottleneck, what are the implications for language processing, acquisition, change, and structure? In our target article, we suggested that the implications are far-reaching and form the basis of an integrated account of many apparently unconnected aspects of language and language processing, as well as suggesting revision of many existing theoretical accounts. With some exceptions, commentators were generally supportive both of the existence of the bottleneck and its potential implications. Many commentators suggested additional theoretical and linguistic nuances and extensions, links with prior work, and relevant computational and neuroscientific considerations; some argued for related but distinct viewpoints; a few, though, felt traditional perspectives were being abandoned too readily. Our response attempts to build on the many suggestions raised by the commentators and to engage constructively with challenges to our approach. PMID:27561252

  17. A review of breast tomosynthesis. Part I. The image acquisition process.

    PubMed

    Sechopoulos, Ioannis

    2013-01-01

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have been made possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to replace mammography for breast cancer screening in the future. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper will review the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process. PMID:23298126

  18. Skills Acquisition in Plantain Flour Processing Enterprises: A Validation of Training Modules for Senior Secondary Schools

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi; Nlebem, Bernard S.

    2013-01-01

    This study sought to validate training modules that can help provide requisite skills for Senior Secondary school students in plantain flour processing enterprises for self-employment and to enable them to pass their examination. The study covered Rivers State. A purposive sampling technique was used to select a sample size of 205. Two sets of structured…

  19. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    SciTech Connect

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.

    2011-12-05

    Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of these opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for the DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time, and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison to the Isolok sampling data. The Isolok sampler is an air-powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1. The image on the left shows the Isolok's spool extended into the process line and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and low location within the mixing tank. Data from the two locations

  20. Apparatus and process for collection of gas and vapor samples

    DOEpatents

    Jackson, Dennis G.; Peterson, Kurt D.; Riha, Brian D.

    2008-04-01

    A gas sampling apparatus and process is provided in which a standard crimping tool is modified by an attached collar. The collar permits operation of the crimping tool while also facilitating the introduction of a supply of gas to be introduced into a storage vial. The introduced gas supply is used to purge ambient air from a collection chamber and an interior of the sample vial. Upon completion of the purging operation, the vial is sealed using the crimping tool.

  1. TestSTORM: Simulator for optimizing sample labeling and image acquisition in localization based super-resolution microscopy.

    PubMed

    Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E; Kaminski, Clemens F; Szabó, Gábor; Erdélyi, Miklós

    2014-03-01

    Localization-based super-resolution microscopy image quality depends on several factors, such as dye choice and labeling strategy, microscope quality, user-defined parameters such as frame rate and number of frames, and the image processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive, so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye, and acquisition parameters. Example results are shown, and the results of the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software developments. PMID:24688813

  2. Relationships among process skills development, knowledge acquisition, and gender in microcomputer-based chemistry laboratories

    NASA Astrophysics Data System (ADS)

    Krieger, Carla Repsher

    This study investigated how instruction in MBL environments can be designed to facilitate process skills development and knowledge acquisition among high school chemistry students. Ninety-eight college preparatory chemistry students in six intact classes were randomly assigned to one of three treatment groups: MBL with enhanced instruction in Macroscopic knowledge, MBL with enhanced instruction in Microscopic knowledge, and MBL with enhanced instruction in Symbolic knowledge. Each treatment group completed a total of four MBL titrations involving acids and bases. After the first and third titrations, the Macroscopic, Microscopic and Symbolic groups received enhanced instruction in the Macroscopic, Microscopic and Symbolic modes, respectively. During each titration, participants used audiotapes to record their verbal interactions. The study also explored the effects of three potential covariates (age, mathematics background, and computer usage) on the relationships among the independent variables (type of enhanced instruction and gender) and the dependent variables (science process skills and knowledge acquisition). Process skills were measured via gain scores on a standardized test. Analysis of Covariance eliminated age, mathematics background, and computer usage as covariates in this study. Analysis of Variance identified no significant effects on process skills attributable to treatment or gender. Knowledge acquisition was assessed via protocol analysis of statements made by the participants during the four titrations. Statements were categorized as procedural, observational, conceptual/analytical, or miscellaneous. Statement category percentages were analyzed for trends across treatments, genders, and experiments. Instruction emphasizing the Macroscopic mode may have increased percentages of observational and miscellaneous statements and decreased percentages of procedural and conceptual/analytical statements. 
Instruction emphasizing the Symbolic mode may have

  3. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection processes... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1436.602-5 Short selection processes... shall be obtained prior to the utilization of either of the short selection processes used for...

  4. EFFECT ON PERFUSION VALUES OF SAMPLING INTERVAL OF CT PERFUSION ACQUISITIONS IN NEUROENDOCRINE LIVER METASTASES AND NORMAL LIVER

    PubMed Central

    Ng, Chaan S.; Hobbs, Brian P.; Wei, Wei; Anderson, Ella F.; Herron, Delise H.; Yao, James C.; Chandler, Adam G.

    2014-01-01

    Objective: To assess the effects of the sampling interval (SI) of CT perfusion acquisitions on CT perfusion values in normal liver and liver metastases from neuroendocrine tumors. Methods: CT perfusion acquisitions in 16 patients with neuroendocrine liver metastases were analyzed by distributed parameter modeling to yield tissue blood flow, blood volume, mean transit time, permeability, and hepatic arterial fraction, for tumor and normal liver. CT perfusion values for the reference sampling interval of 0.5s (SI0.5) were compared with those of SI datasets of 1s, 2s, 3s and 4s, using mixed-effects model analyses. Results: Increases in SI beyond 1s were associated with significant and increasing departures of CT perfusion parameters from reference values at SI0.5 (p≤0.0009). CT perfusion values deviated from reference with increasing uncertainty with increasing SIs. Findings for normal liver were concordant. Conclusion: Increasing SIs beyond 1s yield significantly different CT perfusion parameter values compared to reference values at SI0.5. PMID:25626401

  5. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about sources of supply, trafficking routes, distribution patterns, and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount seized throughout the country was very small. Therefore, finding links between samples is more important there than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column, and four internal standards, and it was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention time shift and response deviation generated by the sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
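The similarity-assessment step reduces to a Pearson correlation between aligned impurity-profile vectors. A minimal sketch, assuming the peak areas have already been retention-time-aligned and normalized against the internal standards (the function name is illustrative, not from the study's VBA modules):

```python
import numpy as np

def profile_similarity(areas_a, areas_b):
    """Pearson correlation coefficient between two impurity-peak-area vectors.

    Each vector holds the normalized areas of the same target impurity peaks,
    in the same order, for two seized samples; values near 1 suggest a common
    origin (the study reported > 0.99 for same-case samples)."""
    a = np.asarray(areas_a, dtype=float)
    b = np.asarray(areas_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])
```

Because Pearson correlation is invariant to overall scaling, two samples diluted differently but sharing the same impurity pattern still score near 1.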

  6. Relative efficiency of Gaussian stochastic process sampling procedures

    NASA Astrophysics Data System (ADS)

    Cameron, Chris

    2003-12-01

    Various methods for sampling stationary, Gaussian stochastic processes are investigated and compared, with an emphasis on applications to processes with power-law energy spectra. Several approaches are considered, including a Riemann summation using left endpoints, the use of random wave numbers to sample the spectrum in proportion to the energy it contains, and a combination of the two. The Fourier-wavelet method of Elliott et al. is investigated and compared with the other methods, all of which are evaluated in terms of their ability to sample the stochastic process over a large number of decades for a given computational cost. The Fourier-wavelet method has accuracy which increases linearly with the computational complexity, while the accuracy of the other methods grows logarithmically. For the Kolmogorov spectrum, a hybrid quadrature method is as efficient as the Fourier-wavelet method if no more than eight decades of accuracy are required. The effectiveness of this hybrid method wanes when one samples fields whose energy spectrum decays more rapidly near the origin. The Fourier-wavelet method has roughly the same behavior independently of the exponent of the power law. The Fourier-wavelet method returns samples which are Gaussian over the range of values where the structure function is well approximated. By contrast, (multi-point) Gaussianity may be lost at the smaller length scales when one uses methods with random wave numbers.
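The "random wave numbers in proportion to energy" idea can be sketched as an importance-sampled sum of cosines: draw wave numbers from a density proportional to the spectrum, so each mode carries equal variance. The exponent, cutoffs, and mode count below are illustrative, and (as the abstract notes) a finite sum of random-phase cosines is only asymptotically Gaussian:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_power_law_process(x, n_modes=2000, alpha=5.0 / 3.0,
                             k_min=1e-2, k_max=1e2):
    """Approximate a stationary Gaussian process with spectrum E(k) ~ k**-alpha
    (Kolmogorov for alpha = 5/3) by a sum of cosines whose wave numbers are
    drawn in proportion to the energy they carry."""
    a = 1.0 - alpha                          # exponent after integrating k**-alpha
    u = rng.random(n_modes)
    k = (k_min**a + u * (k_max**a - k_min**a)) ** (1.0 / a)  # inverse-CDF draw
    total_energy = (k_max**a - k_min**a) / a                 # integral of E(k)
    phases = 2.0 * np.pi * rng.random(n_modes)
    amp = np.sqrt(2.0 * total_energy / n_modes)              # equal-energy modes
    return amp * np.cos(np.outer(x, k) + phases).sum(axis=1)

x = np.linspace(0.0, 1.0e4, 2000)
field = sample_power_law_process(x)
```

The total variance of the field equals the integrated spectral energy, so the sample variance over a span covering many correlation lengths should land near that integral (about 32 for these parameters).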

  7. Automatic optimization of metrology sampling scheme for advanced process control

    NASA Astrophysics Data System (ADS)

    Chue, Chuei-Fu; Huang, Chun-Yen; Shih, Chiang-Lin

    2011-03-01

    In order to ensure long-term profitability, driving operational costs down and improving the yield of a DRAM manufacturing process are continuous efforts. This includes optimal utilization of the capital equipment. The costs of the metrology needed to ensure yield contribute to the overall costs. As the shrinking of device dimensions continues, the costs of metrology are increasing because the associated tightening of on-product specifications requires more metrology effort. The cost-of-ownership reduction is tackled by increasing the throughput and availability of metrology systems. However, this is not the only way to reduce metrology effort. In this paper, we discuss how the costs of metrology can be improved by optimizing the recipes in terms of the sampling layout, thereby eliminating metrology that does not contribute to yield. We discuss results of sampling scheme optimization for on-product overlay control of two DRAM manufacturing processes at Nanya Technology Corporation. For a 6x DRAM production process, we show that the reduction of metrology waste can be as high as 27% and overlay can be improved by 36%, compared with a baseline sampling scheme. For a 4x DRAM process, having tighter overlay specs, a gain of ca. 0.5 nm on-product overlay could be achieved without increasing the metrology effort relative to the original sampling plan.

  8. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from microchannel plate (MCP) based time-of-flight detectors and ionization-based energy detectors such as silicon diodes and gas ionization chambers (GIC) is introduced. The system is based on commercial digitizers and custom software. It forms a part of a ToF-ERDA spectrometer, which has to be able to detect recoil atoms of many different species and energies. Compared to the currently used analogue electronics, the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software, and digital pulse processing algorithms to suit this application are described in detail.
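A core piece of such digital pulse processing is extracting a timestamp from the sampled waveform. The fraction-of-peak timing sketch below is a generic illustration of the idea, not the authors' actual algorithm; it assumes the pulse rises after a flat pre-trigger region:

```python
import numpy as np

def cfd_time(samples, dt, fraction=0.4):
    """Digital constant-fraction-style timing: find where the baseline-subtracted
    pulse first crosses a fixed fraction of its peak, with linear interpolation
    between the two straddling samples."""
    s = np.asarray(samples, dtype=float)
    s = s - np.median(s[:8])            # baseline from pre-trigger samples
    thr = fraction * s.max()
    i = int(np.argmax(s >= thr))        # first sample at/above threshold
    # Linear interpolation between samples i-1 and i
    return (i - 1 + (thr - s[i - 1]) / (s[i] - s[i - 1])) * dt
```

Interpolating between samples is what lets a digitizer running well below the analogue bandwidth still deliver sub-sample timing resolution.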

  9. A feasibility study of a PET/MRI insert detector using strip-line and waveform sampling data acquisition

    NASA Astrophysics Data System (ADS)

    Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Wyrwicz, A. M.; Li, L.; Kao, C.-M.

    2015-06-01

    We are developing a time-of-flight Positron Emission Tomography (PET) detector by using silicon photo-multipliers (SiPM) on a strip-line and high speed waveform sampling data acquisition. In this design, multiple SiPMs are connected on a single strip-line and signal waveforms on the strip-line are sampled at two ends of the strip to reduce readout channels while fully exploiting the fast time response of SiPMs. In addition to the deposited energy and time information, the position of the hit SiPM along the strip-line is determined by the arrival time difference of the waveform. Due to the insensitivity of the SiPMs to magnetic fields and the compact front-end electronics, the detector approach is highly attractive for developing a PET insert system for a magnetic resonance imaging (MRI) scanner to provide simultaneous PET/MR imaging. To investigate the feasibility, experimental tests using prototype detector modules have been conducted inside a 9.4 T small animal MRI scanner (Bruker BioSpec 94/30 imaging spectrometer). On the prototype strip-line board, 16 SiPMs (5.2 mm pitch) are installed on two strip-lines and coupled to 2×8 LYSO scintillators (5.0×5.0×10.0 mm3 with 5.2 mm pitch). The outputs of the strip-line boards are connected to a Domino-Ring-Sampler (DRS4) evaluation board for waveform sampling. Preliminary experimental results show that the effect of interference on the MRI image due to the PET detector is negligible and that PET detector performance is comparable with the results measured outside the MRI scanner.
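The position decoding described above follows directly from the two end arrival times: their difference gives the hit position, their sum the event time. A minimal sketch, where the propagation speed (taken here as 0.5c) is an assumed value and the strip length follows the 16 × 5.2 mm layout in the abstract:

```python
def strip_hit(t_left, t_right, v=1.5e8, length=0.0832):
    """Decode a strip-line hit from the two end arrival times (seconds).

    Position x is relative to the strip centre (negative = nearer the left
    end); the event time t0 comes from the arrival-time sum. v is the assumed
    signal propagation speed in m/s, length the strip length in m."""
    x = 0.5 * v * (t_left - t_right)
    t0 = 0.5 * (t_left + t_right - length / v)
    return x, t0
```

Because the pitch is 5.2 mm, a timing resolution of a few tens of picoseconds on each end is enough to assign the hit to the correct SiPM along the strip.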

  10. A feasibility study of a PET/MRI insert detector using strip-line and waveform sampling data acquisition

    PubMed Central

    Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Wyrwicz, Alice M.; Li, Limin; Kao, C.-M.

    2014-01-01

    We are developing a time-of-flight Positron Emission Tomography (PET) detector by using silicon photo-multipliers (SiPM) on a strip-line and high speed waveform sampling data acquisition. In this design, multiple SiPMs are connected on a single strip-line and signal waveforms on the strip-line are sampled at two ends of the strip to reduce readout channels while fully exploiting the fast time response of SiPMs. In addition to the deposited energy and time information, the position of the hit SiPM along the strip-line is determined by the arrival time difference of the waveform. Due to the insensitivity of the SiPMs to magnetic fields and the compact front-end electronics, the detector approach is highly attractive for developing a PET insert system for a magnetic resonance imaging (MRI) scanner to provide simultaneous PET/MR imaging. To investigate the feasibility, experimental tests using prototype detector modules have been conducted inside a 9.4 Tesla small animal MRI scanner (Bruker BioSpec 94/30 imaging spectrometer). On the prototype strip-line board, 16 SiPMs (5.2 mm pitch) are installed on two strip-lines and coupled to 2 × 8 LYSO scintillators (5.0 × 5.0 × 10.0 mm3 with 5.2 mm pitch). The outputs of the strip-line boards are connected to a Domino-Ring-Sampler (DRS4) evaluation board for waveform sampling. Preliminary experimental results show that the effect of interference on the MRI image due to the PET detector is negligible and that PET detector performance is comparable with the results measured outside the MRI scanner. PMID:25937685

  11. Mars Science Laboratory CHIMRA: A Device for Processing Powdered Martian Samples

    NASA Technical Reports Server (NTRS)

    Sunshine, Daniel

    2010-01-01

    The CHIMRA is an extraterrestrial sample acquisition and processing device for the Mars Science Laboratory that emphasizes robustness and adaptability through design configuration. This work reviews the guidelines utilized to invent the initial CHIMRA and the strategy employed in advancing the design; these principles will be discussed in relation to both the final CHIMRA design and similar future devices. The computational synthesis necessary to mature a boxed-in impact-generating mechanism will be presented alongside a detailed mechanism description. Results from the development testing required to advance the design for a highly-loaded, long-life and high-speed bearing application will be presented. Lessons learned during the assembly and testing of this subsystem as well as results and lessons from the sample-handling development test program will be reviewed.

  12. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    SciTech Connect

    Castillo, S; Castillo, R; Castillo, E; Pan, T; Ibbott, G; Balter, P; Hobbs, B; Dai, J; Guerrero, T

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans to assess intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase-sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over the available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans significantly reduced artifact presence, by 27% and 28% for the first and second cohorts, respectively. In cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross-scan average at the 0.05 significance level. Artifact presence for both the clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p<0.032 clinical; ρ=0.296, p<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase-sorting method.
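    The optimized sorting can be illustrated as a shortest-path problem: one image is chosen per couch position so that a summed artifact metric between adjacent positions is minimal. The layered-graph construction below is an illustrative reading of the abstract, not the authors' exact formulation:

```python
import heapq

def best_sorting(candidates, cost):
    """Choose one image per couch position so the summed artifact
    metric between adjacent selections is minimal: a shortest path
    through a layered graph, found with Dijkstra's algorithm.
    `candidates[i]` lists the image ids available at position i;
    `cost(a, b)` is the artifact metric between adjacent images."""
    n = len(candidates)
    heap = [(0.0, 0, img, None) for img in candidates[0]]
    heapq.heapify(heap)
    settled, parent = set(), {}
    while heap:
        c, i, img, prev = heapq.heappop(heap)
        if (i, img) in settled:
            continue
        settled.add((i, img))
        parent[(i, img)] = prev
        if i == n - 1:
            # First settlement of the last layer gives the optimal path.
            path = [img]
            for layer in range(n - 1, 0, -1):
                path.append(parent[(layer, path[-1])])
            return path[::-1], c
        for nxt in candidates[i + 1]:  # relax edges into the next layer
            heapq.heappush(heap, (c + cost(img, nxt), i + 1, nxt, img))
    raise ValueError("no candidates given")
```

    With non-negative artifact costs, the first time the last layer is reached the accumulated cost is provably minimal, so no full graph search is needed.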

  13. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    SciTech Connect

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modeling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  14. How does the interaction between spelling and motor processes build up during writing acquisition?

    PubMed

    Kandel, Sonia; Perret, Cyril

    2015-03-01

    How do we recall a word's spelling? How do we produce the movements to form the letters of a word? Writing involves several processing levels. Surprisingly, researchers have focused either on spelling or on motor production. However, these processes interact and cannot be studied separately. Spelling processes cascade into movement production. For example, in French, producing the letters PAR in the orthographically irregular word PARFUM (perfume) delays motor production with respect to the same letters in the regular word PARDON (pardon). Orthographic regularity refers to the possibility of spelling a word correctly by applying the most frequent sound-letter conversion rules. The present study examined how the interaction between spelling and motor processing builds up during writing acquisition. French 8-10 year old children participated in the experiment. This is the age at which handwriting skills start to become automatic. The children wrote regular and irregular words that could be frequent or infrequent. They wrote on a digitizer so we could collect data on latency, movement duration and fluency. The results revealed that the interaction between spelling and motor processing was already present at age 8. It became more adult-like at ages 9 and 10. Before starting to write, processing irregular words took longer than regular words. This processing load spread into movement production: it increased writing duration and rendered the movements more dysfluent. Word frequency affected latencies and cascaded into production. It modulated writing duration but not movement fluency. Writing infrequent words took longer than frequent words. The data suggest that orthographic regularity has a stronger impact on writing than word frequency; the two factors do not cascade to the same extent. PMID:25525970

  15. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates acquisition and processing, and displays selected wind profile parameters such as Doppler shift, power distribution, wind directions and velocities. The Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from wind measurements. A general overview of the system and the DAPS as well as the coherent Doppler lidar system is presented in this paper.
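    The real-time removal of aircraft-induced Doppler shift reduces to subtracting 2·v_los/λ from the measured shift, where v_los is the line-of-sight component of the INS/GPS velocity. A minimal sketch, with the exact wavelength value assumed:

```python
WAVELENGTH_M = 2.05e-6  # assumed wavelength for a 2-micron Ho:Tm transmitter

def remove_platform_doppler(f_measured_hz, aircraft_velocity_mps, beam_unit):
    """Subtract the Doppler shift induced by aircraft motion.
    For coherent lidar the platform shift is 2 * v_los / wavelength,
    where v_los is the velocity component along the beam direction."""
    v_los = sum(v * b for v, b in zip(aircraft_velocity_mps, beam_unit))
    return f_measured_hz - 2.0 * v_los / WAVELENGTH_M
```

    At 150 m/s along the beam the platform shift is about 146 MHz, far larger than typical wind-induced shifts, which is why the correction must be applied before wind retrieval.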

  16. 77 FR 9617 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ... WorkFlow to process vouchers. DATES: Comments on the proposed rule published January 19, 2012, at 77 FR... clarifying the proposed rule published on January 19, 2012 (77 FR 2682), which proposes to revise... Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054) AGENCY: Defense Acquisition...

  17. Automation of sample plan creation for process model calibration

    NASA Astrophysics Data System (ADS)

    Oberschmidt, James; Abdo, Amr; Desouky, Tamer; Al-Imam, Mohamed; Krasnoperova, Azalia; Viswanathan, Ramya

    2010-04-01

    The process of preparing a sample plan for optical and resist model calibration has always been tedious, not only because the plan must accurately represent full-chip designs with countless combinations of widths, spaces and environments, but also because of constraints imposed by metrology, which may limit the number of structures that can be measured. There are also limits on the types of structures, mainly due to accuracy variation across different geometries; for instance, pitch measurements are normally more accurate than corner-rounding measurements, so only certain geometrical shapes are typically considered for a sample plan. In addition, the time factor is becoming crucial as we migrate from one technology node to the next, owing to the increase in the number of development and production nodes, and the process becomes more complicated if process-window-aware models are to be developed in a reasonable time frame; there is thus a need for reliable methods of choosing sample plans that also help reduce cycle time. In this context, an automated flow is proposed for sample plan creation. Once the illumination and film stack are defined, all errors in the input data are fixed and sites are centered. Bad sites are then excluded, and the clean data are reduced based on geometrical resemblance. An editable database of measurement-reliable and critical structures is also provided, and their percentage in the final sample plan, as well as the total number of 1D/2D samples, can be predefined. The flow has the advantage of eliminating manual selection or filtering techniques, provides powerful tools for customizing the final plan, and greatly reduces the time needed to generate these plans.
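    The reduction step based on geometrical resemblance can be sketched as a greedy deduplication. The two-parameter (width/space) resemblance rule and the tolerance are assumptions for illustration, not the paper's actual criteria:

```python
def reduce_by_resemblance(sites, tol=2.0):
    """Greedy reduction of a gauge list: keep a site only if no
    already-kept site has both width and space within `tol` nm
    (assumed resemblance rule). Each site is a dict with 'width'
    and 'space' in nm."""
    kept = []
    for s in sites:
        if not any(abs(s["width"] - k["width"]) <= tol
                   and abs(s["space"] - k["space"]) <= tol
                   for k in kept):
            kept.append(s)
    return kept
```

    In a real flow this step would run after bad-site exclusion and before the prescribed 1D/2D percentages are enforced.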

  18. Survival-time statistics for sample space reducing stochastic processes

    NASA Astrophysics Data System (ADS)

    Yadav, Avinash Chand

    2016-04-01

    Stochastic processes wherein the size of the state space is changing as a function of time offer models for the emergence of scale-invariant features observed in complex systems. I consider such a sample-space reducing (SSR) stochastic process that results in a random sequence of strictly decreasing integers {x(t)}, 0 ≤ t ≤ τ, with boundary conditions x(0) = N and x(τ) = 1. This model is shown to be exactly solvable: P_N(τ), the probability that the process survives for time τ, is analytically evaluated. In the limit of large N, the asymptotic form of this probability distribution is Gaussian, with mean and variance both varying logarithmically with system size: ⟨τ⟩ ~ ln N and σ_τ² ~ ln N. Correspondence can be made between survival-time statistics in the SSR process and record statistics of independent and identically distributed random variables.
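    The SSR process is easy to simulate, which makes the logarithmic scaling of the mean survival time directly observable (the exact mean is the harmonic number H_{N-1} ≈ ln N):

```python
import random

def ssr_survival_time(n, rng):
    """One realization of the SSR process: starting from x = n, jump
    to an integer drawn uniformly from {1, ..., x-1} until x = 1;
    return the number of jumps tau."""
    x, tau = n, 0
    while x > 1:
        x = rng.randrange(1, x)  # uniform on {1, ..., x-1}
        tau += 1
    return tau

# Mean survival time for N = 1000 should be close to H_999 ≈ 7.48,
# consistent with <tau> ~ ln N.
rng = random.Random(0)
mean_tau = sum(ssr_survival_time(1000, rng) for _ in range(5000)) / 5000.0
```

    Repeating this for several N and plotting the sample mean against ln N would reproduce the logarithmic trend stated in the abstract.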

  19. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    SciTech Connect

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but cannot be repaired in case of malfunction, as its modules are no longer manufactured. The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  20. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    DOE PAGESBeta

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but cannot be repaired in case of malfunction, as its modules are no longer manufactured. The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  1. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview of the development of a data acquisition and processing system is presented for a pulsed, 2-micron coherent Doppler lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system as well as its control software is reviewed, and its requirements and unique features are discussed.

  2. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
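    The category-based importance sampling can be sketched as a stratified estimator in which the modeler prescribes the number of sample points per category. The toy rate function and category masses below are assumptions for illustration, not SILHS internals:

```python
import random

def grid_box_average(rate_fn, categories, n_per_cat, rng):
    """Stratified importance-sampling estimate of a grid-box-averaged
    process rate. `categories` maps a name to (probability mass of the
    sub-domain, sampler drawing a point from it); `n_per_cat` is the
    modeler-prescribed number of sample points per category."""
    total = 0.0
    for name, (mass, sampler) in categories.items():
        n = n_per_cat.get(name, 0)
        if n == 0 or mass == 0.0:
            continue  # spend no points on this category
        total += mass * sum(rate_fn(sampler(rng)) for _ in range(n)) / n
    return total

rng = random.Random(0)
categories = {
    "rain":  (0.2, lambda r: r.uniform(0.0, 1.0)),  # rainy fraction of the box
    "clear": (0.8, lambda r: 0.0),                  # no rain, rate is zero
}
rate = lambda q: 10.0 * q  # toy evaporation rate; exact average is 0.2*5 = 1.0
estimate = grid_box_average(rate, categories, {"rain": 4000, "clear": 1}, rng)
```

    Spending nearly all points in the "rain" category mirrors the paper's finding that sampling error drops when more points are drawn from the region of rain evaporation.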

  3. A flexible importance sampling method for integrating subgrid processes

    DOE PAGESBeta

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  4. A flexible importance sampling method for integrating subgrid processes

    SciTech Connect

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales.

    The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories.

    The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  5. A flexible importance sampling method for integrating subgrid processes

    DOE PAGESBeta

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). Here, the resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  6. Detailed design of the GOSAT DHF at NIES and data acquisition/processing/distribution strategy

    NASA Astrophysics Data System (ADS)

    Watanabe, Hiroshi; Ishihara, Hironari; Hayashi, Kenji; Kawazoe, Fumie; Kikuchi, Nobuyuki; Eguchi, Nawo; Matsunaga, Tsuneo; Yokota, Tatsuya

    2008-10-01

    GOSAT Project (GOSAT stands for Greenhouse gases Observation SATellite) is a joint project of MOE (Ministry of the Environment), JAXA (Japan Aerospace Exploration Agency), and NIES (National Institute for Environmental Studies). Data acquired by TANSO-FTS (Fourier Transform Spectrometer) and TANSO-CAI (Cloud and Aerosol Imager) on GOSAT (TANSO stands for Thermal And Near infrared Sensor for carbon Observation) will be collected at the Tsukuba Space Center at JAXA. The Level 1A and 1B data of the FTS (interferogram and spectra, respectively) and the Level 1A data of CAI (uncorrected data) will be generated at JAXA and transferred to the GOSAT Data Handling Facility (DHF) at NIES for further processing. Radiometric and geometric corrections will be applied to CAI L1A data to generate CAI L1B data. From CAI L1B data, cloud coverage and aerosol information (CAI Level 2 data) will be estimated. FTS data recognized by CAI as having "low cloud coverage" will be processed to generate column concentrations of carbon dioxide (CO2) and methane (CH4) (FTS Level 2 data). Level 3 data will be global maps of column concentrations of greenhouse gases averaged in time and space. Level 4 data will be the global distribution of carbon sources/sinks estimated by an inverse model, together with the re-calculated forward model. The major data flow will also be described. The Critical Design Review (CDR) of the DHF was completed in early July of 2007 to prepare for the scheduled launch of GOSAT in early 2009. In this manuscript, major changes after the CDR are discussed. In addition, the data acquisition scenario of the FTS is discussed; the data acquisition plan covers lattice-point observation over land areas and sun-glint observation over water areas. The data products can be searched and will be open to the public through the GOSAT DHF after the data validation process. The Principal Investigators who submitted a proposal for the Research Announcement will have a chance to request the

  7. A new approach to the optimisation of non-uniform sampling schedules for use in the rapid acquisition of 2D NMR spectra of small molecules.

    PubMed

    Sidebottom, Philip J

    2016-08-01

    Non-uniform sampling allows the routine, rapid acquisition of 2D NMR data. When the number of points in the NUS schedule is low, the quality of the data obtained is very dependent on the schedule used. A simple procedure for finding optimum schedules has been developed and is demonstrated for the multiplicity-edited HSQC experiment. PMID:27160788
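    A schedule-search procedure of this general kind can be sketched by scoring candidate schedules on the peak sidelobe of their point-spread function (the FFT of the sampling mask). This is a common NUS quality criterion, not necessarily the metric used in the paper:

```python
import numpy as np

def psf_peak_sidelobe(schedule, n_total):
    """Score a NUS schedule by the largest sidelobe of its point-spread
    function, relative to the main peak; lower is better."""
    mask = np.zeros(n_total)
    mask[list(schedule)] = 1.0
    psf = np.abs(np.fft.fft(mask))
    return np.max(psf[1:]) / psf[0]

def best_random_schedule(n_total, n_points, n_trials, seed=0):
    """Draw many random schedules and keep the best-scoring one --
    a brute-force stand-in for the paper's optimisation procedure."""
    rng = np.random.default_rng(seed)
    best, best_score = None, np.inf
    for _ in range(n_trials):
        sched = rng.choice(n_total, size=n_points, replace=False)
        score = psf_peak_sidelobe(sched, n_total)
        if score < best_score:
            best, best_score = sched, score
    return np.sort(best), best_score
```

    With only 32 of 128 increments sampled, the spread between the best and worst random schedules found this way is substantial, illustrating why schedule choice matters most at low point counts.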

  8. Challenging genosensors in food samples: The case of gluten determination in highly processed samples.

    PubMed

    Martín-Fernández, Begoña; de-los-Santos-Álvarez, Noemí; Martín-Clemente, Juan Pedro; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2016-01-01

    Electrochemical genosensors have undergone an enormous development in the last decades, but only very few have achieved quantification of target content in highly processed food samples. The detection of allergens, and particularly gluten, is challenging because legislation establishes a threshold of 20 ppm for labeling as gluten-free, whereas most genosensors express their results as DNA concentration or DNA copies. This paper describes the first attempt to correlate the genosensor response with the wheat content in real samples, even in the case of highly processed food samples. A sandwich-based format was used, comprising a capture probe immobilized onto a screen-printed gold electrode and a signaling probe functionalized with fluorescein isothiocyanate (FITC), both hybridizing with the target. The hybridization event was monitored electrochemically by adding an anti-FITC peroxidase (antiFITC-HRP) and its substrate, tetramethylbenzidine. Binary model mixtures, as a reference material, and real samples were analyzed. DNA was extracted from food and a fragment encoding the immunodominant peptide of α2-gliadin was amplified by a tailored PCR. The sensor was able to selectively distinguish cereals toxic for celiac patients, such as different varieties of wheat, barley, rye and oats, from non-toxic plants. As little as 0.001% (10 mg/kg) of wheat flour in an inert matrix was reliably detected, which competes directly with the current method of choice for DNA detection, real-time PCR. A good correlation with the official immunoassay was found in highly processed food samples. PMID:26695295
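    Converting a genosensor signal into a wheat content comparable with the 20 ppm labeling threshold amounts to reading the signal back through a calibration curve built from the binary model mixtures. A minimal sketch with made-up standards (the log-linear form and all numeric values are assumptions, not data from the paper):

```python
import math

def fit_calibration(standards):
    """Least-squares line signal = a*log10(ppm) + b from reference
    mixtures given as (ppm, signal) pairs."""
    xs = [math.log10(p) for p, _ in standards]
    ys = [s for _, s in standards]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def estimate_ppm(signal, a, b):
    """Invert the calibration line to get a wheat content in mg/kg."""
    return 10 ** ((signal - b) / a)

# Hypothetical binary mixtures spanning 10 mg/kg (the reported limit)
# up to 1% wheat flour:
standards = [(10, 1.2), (100, 2.1), (1000, 3.0), (10000, 3.9)]
a, b = fit_calibration(standards)
ppm = estimate_ppm(2.1, a, b)
gluten_free = ppm < 20  # compare against the 20 ppm labeling threshold
```

    This is the step that distinguishes the work from sensors reporting only DNA copies: the final readout is in mg/kg of wheat, directly comparable with the legal threshold.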

  9. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking-water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, partly due to outdated methodology and a poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. Regarding the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  11. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion-barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards of overthrust exploration. However, the seismic data recorded in many overthrust areas are disappointingly poor. Challenges such as rough topography, complex subsurface structure, the presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results for various swath geometries will be presented. We will also show recent examples of processing methods that have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  12. A computational model associating learning process, word attributes, and age of acquisition.

    PubMed

    Hidaka, Shohei

    2013-01-01

    We propose a new model-based approach linking word learning to the age of acquisition (AoA) of words: a new computational tool for understanding the relationships among word learning processes, psychological attributes, and word AoAs as measures of vocabulary growth. The computational model describes the distinct statistical relationships between three theoretical factors underpinning word learning and AoA distributions. Simply put, the model formulates how different learning processes, characterized by change in learning rate over time and/or by the number of exposures required to acquire a word, likely result in different AoA distributions depending on word type. We tested the model in three respects. The first analysis showed that the proposed model accounts for empirical AoA distributions better than a standard alternative. The second analysis demonstrated that the estimated learning parameters predicted psychological attributes of words, such as frequency and imageability, well. The third analysis illustrated that the developmental trend predicted by our estimated learning parameters was consistent with relevant findings in the developmental literature on word learning in children. We further discuss the theoretical implications of our model-based approach. PMID:24223699
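    One illustrative reading of the model's factors (exposure rate and number of exposures required) is a Poisson-arrival sketch in which a word is acquired on its k-th exposure, so AoA follows a Gamma distribution. This is an assumption for illustration, not the paper's exact formulation:

```python
import random

def simulate_aoa(rate, k, rng):
    """Age at which a word is acquired, assuming exposures arrive as a
    Poisson process with the given yearly rate and the word is learned
    on its k-th exposure; the resulting AoA is Gamma(k, 1/rate)."""
    return sum(rng.expovariate(rate) for _ in range(k))

rng = random.Random(1)
# A frequent word (20 exposures/year) vs a rare word (2 exposures/year),
# both requiring 10 exposures to be acquired:
frequent = [simulate_aoa(rate=20.0, k=10, rng=rng) for _ in range(2000)]
rare = [simulate_aoa(rate=2.0, k=10, rng=rng) for _ in range(2000)]
mean_frequent = sum(frequent) / len(frequent)
mean_rare = sum(rare) / len(rare)
```

    Under this reading, higher-frequency words get earlier and more tightly clustered AoAs, which is the qualitative link between learning parameters and AoA distributions that the model formalizes.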

  13. A generic model for data acquisition: Connectionist methods of information processing

    NASA Astrophysics Data System (ADS)

    Ehrlich, Jacques

    1993-06-01

    EDDAKS (Event Driven Data Acquisition Kernel System), for the quality control of products created in industrial production processes, is proposed. It is capable of acquiring information about discrete event systems by synchronizing to them via the events. EDDAKS consists of EdObjects, forming a hierarchy, which react to EdEvents and perform processing operations on messages. The hierarchy of EdObjects consists (from bottom up) of the Sensor, the Phase, the Extracter, the Dynamic Spreadsheet, and EDDAKS itself. The first three levels contribute to building the internal representation: a state vector characterizing a product in the course of production. The Dynamic Spreadsheet is a parameterizable processing structure used to perform calculations on a set of internal representations in order to deliver the external representation to the user. A system intended for quality control of the products delivered by a concrete production plant was generated by EDDAKS and used to validate the approach. Processing methods using the multilayer perceptron model were considered. Two contributions aimed at improving the performance of this network are proposed. One consists of implementing a conjugate gradient method; the effectiveness of this method depends on the determination of an optimum gradient step, which is efficiently calculated by a linear search using a secant algorithm. The other is intended to reduce the connectivity of the network by adapting it to the problem to be solved: links having little or no activity are identified and destroyed, where activity is determined by evaluating the covariance between each of the inputs of a cell and its output. An experiment in which nonlinear prediction is applied to a civil engineering problem is described.
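
The covariance-based link pruning described above can be sketched as follows; the function names, threshold, and data are illustrative assumptions, not EDDAKS code.

```python
def covariance(xs, ys):
    """Sample covariance (population normalization) of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def prune_links(inputs, output, weights, threshold):
    """Zero out links whose input shows little covariance with the cell's output."""
    kept = []
    for i, xs in enumerate(inputs):
        if abs(covariance(xs, output)) >= threshold:
            kept.append(i)        # active link: keep it
        else:
            weights[i] = 0.0      # low-activity link: destroy it
    return kept, weights

output = [0.0, 1.0, 2.0, 3.0]     # observed cell output over four samples
inputs = [
    [0.0, 2.0, 4.0, 6.0],         # strongly co-varies with the output
    [5.0, 5.0, 5.0, 5.0],         # constant input: zero covariance
]
kept, weights = prune_links(inputs, output, [0.8, 0.3], threshold=0.1)
print(kept, weights)
```

The constant input carries no information about the output, so its link weight is zeroed while the co-varying link survives.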

  14. Real-time multilevel process monitoring and control of CR image acquisition and preprocessing for PACS and ICU

    NASA Astrophysics Data System (ADS)

    Zhang, Jianguo; Wong, Stephen T. C.; Andriole, Katherine P.; Wong, Albert W. K.; Huang, H. K.

    1996-05-01

    The purpose of this paper is to present a control theory and a fault tolerance algorithm developed for real-time monitoring and control of the acquisition and preprocessing of computed radiographs for PACS and Intensive Care Unit operations. This monitoring and control system uses an event-driven, multilevel processing approach to remove computational bottlenecks and to improve system reliability. Its computational performance and processing reliability are evaluated and compared with those of the traditional, single-level processing approach.

  15. Developmental trends in auditory processing can provide early predictions of language acquisition in young infants.

    PubMed

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R; Shao, Jie; Lozoff, Betsy

    2013-03-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with both Auditory Brainstem Response (ABR) and language assessments. At 6 weeks and/or 9 months of age, the infants underwent ABR testing using both a standard hearing screening protocol with 30 dB clicks and a second protocol using click pairs separated by 8, 16, and 64-ms intervals presented at 80 dB. We evaluated the effects of interval duration on ABR latency and amplitude elicited by the second click. At 9 months, language development was assessed via parent report on the Chinese Communicative Development Inventory - Putonghua version (CCDI-P). Wave V latency z-scores of the 64-ms condition at 6 weeks showed strong direct relationships with Wave V latency in the same condition at 9 months. More importantly, shorter Wave V latencies at 9 months showed strong relationships with the CCDI-P composite consisting of phrases understood, gestures, and words produced. Likewise, infants who had greater decreases in Wave V latencies from 6 weeks to 9 months had higher CCDI-P composite scores. Females had higher language development scores and shorter Wave V latencies at both ages than males. Interestingly, when the ABR Wave V latencies at both ages were taken into account, the direct effects of gender on language disappeared. In conclusion, these results support the importance of low-level auditory processing capabilities for early language acquisition in a population of typically developing young infants. Moreover, the auditory brainstem response in this paradigm shows promise as an electrophysiological marker to predict individual differences in language development in young children. PMID:23432827

  16. Developmental Trends in Auditory Processing Can Provide Early Predictions of Language Acquisition in Young Infants

    PubMed Central

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R.; Shao, Jie; Lozoff, Betsy

    2012-01-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with both Auditory Brainstem Response (ABR) and language assessments. At 6 weeks and/or 9 months of age, the infants underwent ABR testing using both a standard hearing screening protocol with 30 dB clicks and a second protocol using click pairs separated by 8, 16, and 64-ms intervals presented at 80 dB. We evaluated the effects of interval duration on ABR latency and amplitude elicited by the second click. At 9 months, language development was assessed via parent report on the Chinese Communicative Development Inventory – Putonghua version (CCDI-P). Wave V latency z-scores of the 64-ms condition at 6 weeks showed strong direct relationships with Wave V latency in the same condition at 9 months. More importantly, shorter Wave V latencies at 9 months showed strong relationships with the CCDI-P composite consisting of phrases understood, gestures, and words produced. Likewise, infants who had greater decreases in Wave V latencies from 6 weeks to 9 months had higher CCDI-P composite scores. Females had higher language development scores and shorter Wave V latencies at both ages than males. Interestingly, when the ABR Wave V latencies at both ages were taken into account, the direct effects of gender on language disappeared. In conclusion, these results support the importance of low-level auditory processing capabilities for early language acquisition in a population of typically developing young infants. Moreover, the auditory brainstem response in this paradigm shows promise as an electrophysiological marker to predict individual differences in language development in young children. PMID:23432827

  17. A study of the influence of the data acquisition system sampling rate on the accuracy of measured acceleration loads for transport aircraft

    NASA Technical Reports Server (NTRS)

    Whitehead, Julia H.

    1992-01-01

    A research effort was initiated at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) to describe the relationship between the sampling rate and the accuracy of acceleration loads obtained from the data acquisition system of a transport aircraft. An accelerometer was sampled and digitized at a rate of 100 samples per second onboard a NASA Boeing 737 (B-737) flight research aircraft. Numerical techniques were used to reconstruct 2.5 hours of flight data into the original input waveform and then re-sample the waveform at rates of 4, 8, 16, and 32 samples per second. A peak-between-means counting technique and power spectral analysis were used to evaluate each sampling rate, using the 32-samples-per-second data as the baseline for comparison. This paper presents the results from these methods and includes, in appendix A, the peak-between-means counting results used in a general fatigue analysis for each of the sampling rates.
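
The sensitivity of a peak count to the sampling rate can be sketched with a toy signal. The simplified peak counter below (local maxima above the signal mean) is an illustrative assumption, not the study's actual peak-between-means algorithm, and the decimation stands in for the study's waveform reconstruction and re-sampling.

```python
import math

def peaks_between_means(signal):
    """Count local maxima that exceed the signal mean -- a simplified stand-in
    for the peak-between-means counting used in fatigue analysis."""
    mean = sum(signal) / len(signal)
    return sum(
        1 for i in range(1, len(signal) - 1)
        if signal[i] > mean and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]
    )

# 32 samples/s reference signal: a 2 Hz oscillation over 4 seconds (8 peaks)
t32 = [i / 32 for i in range(128)]
ref = [math.sin(2 * math.pi * 2 * t) for t in t32]

# crude down-sampling by decimation to 8 samples/s (still above Nyquist for 2 Hz)
low = ref[::4]
print(peaks_between_means(ref), peaks_between_means(low))
```

At 8 samples per second the 2 Hz content is still adequately sampled, so the peak count is preserved; pushing the rate below the Nyquist limit of the signal's highest frequency would alias the waveform and lose peaks, which is the accuracy trade-off the study quantifies on real flight data.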

  18. An underground tale: contribution of microbial activity to plant iron acquisition via ecological processes

    PubMed Central

    Jin, Chong Wei; Ye, Yi Quan; Zheng, Shao Jian

    2014-01-01

    Background Iron (Fe) deficiency in crops is a worldwide agricultural problem. Plants have evolved several strategies to enhance Fe acquisition, but increasing evidence has shown that the intrinsic plant-based strategies alone are insufficient to avoid Fe deficiency in Fe-limited soils. Soil micro-organisms also play a critical role in plant Fe acquisition; however, the mechanisms behind their promotion of Fe acquisition remain largely unknown. Scope This review focuses on the possible mechanisms underlying the promotion of plant Fe acquisition by soil micro-organisms. Conclusions Fe-deficiency-induced root exudates alter the microbial community in the rhizosphere by modifying the physicochemical properties of soil, and/or by their antimicrobial and/or growth-promoting effects. The altered microbial community may in turn benefit plant Fe acquisition via production of siderophores and protons, both of which improve Fe bioavailability in soil, and via hormone generation that triggers the enhancement of Fe uptake capacity in plants. In addition, symbiotic interactions between micro-organisms and host plants could also enhance plant Fe acquisition, possibly including: rhizobium nodulation enhancing plant Fe uptake capacity and mycorrhizal fungal infection enhancing root length and the nutrient acquisition area of the root system, as well as increasing the production of Fe3+ chelators and protons. PMID:24265348

  19. Experiment kits for processing biological samples inflight on SLS-2

    NASA Technical Reports Server (NTRS)

    Savage, P. D.; Hinds, W. E.; Jaquez, R.; Evans, J.; Dubrovin, L.

    1995-01-01

    This paper describes development of an innovative, modular approach to packaging the instruments used to obtain and preserve the inflight rodent tissue and blood samples associated with hematology experiments on the Spacelab Life Sciences-2 (SLS-2) mission. The design approach organized the multitude of instruments into twelve 5- x 6- x 1-in. kits which were each used for a particular experiment. Each kit contained the syringes, vials, microscope slides, etc., necessary for processing and storing blood and tissue samples for one rat on a particular day. A total of 1245 components, packaged into 128 kits and stowed in 17 Zero® boxes, were required. Crewmembers found the design easy to use and laid out in a logical, simple configuration that minimized the chance of error during the complex procedures in flight. This paper also summarizes inflight performance of the kits on SLS-2.

  20. Toxic School Sites in Los Angeles: Weaknesses in the Site Acquisition Process. Special Report of the Joint Legislative Audit Committee.

    ERIC Educational Resources Information Center

    California State Legislature, Sacramento. Joint Legislative Audit Committee.

    A special report of the California Legislature's Joint Legislative Audit Committee addresses the school site acquisition process to attempt to discern how the system has allowed a minimum of nine Los Angeles public schools to be built on toxic lands. The report examines two such sites, the Jefferson Middle School (JMS) and the combined elementary…

  1. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  2. Combining Contextual and Morphemic Cues Is Beneficial during Incidental Vocabulary Acquisition: Semantic Transparency in Novel Compound Word Processing

    ERIC Educational Resources Information Center

    Brusnighan, Stephen M.; Folk, Jocelyn R.

    2012-01-01

    In two studies, we investigated how skilled readers use contextual and morphemic information in the process of incidental vocabulary acquisition during reading. In Experiment 1, we monitored skilled readers' eye movements while they silently read sentence pairs containing novel and known English compound words that were either semantically…

  3. The Comparative Effects of Processing Instruction and Dictogloss on the Acquisition of the English Passive by Speakers of Turkish

    ERIC Educational Resources Information Center

    Uludag, Onur; Vanpatten, Bill

    2012-01-01

    The current study presents the results of an experiment investigating the effects of processing instruction (PI) and dictogloss (DG) on the acquisition of the English passive voice. Sixty speakers of Turkish studying English at university level were assigned to three groups: one receiving PI, the other receiving DG and the third serving as a…

  4. The Effects of Word Exposure Frequency and Elaboration of Word Processing on Incidental L2 Vocabulary Acquisition through Reading

    ERIC Educational Resources Information Center

    Eckerth, Johannes; Tavakoli, Parveneh

    2012-01-01

    Research on incidental second language (L2) vocabulary acquisition through reading has claimed that repeated encounters with unfamiliar words and the relative elaboration of processing these words facilitate word learning. However, so far both variables have been investigated in isolation. To help close this research gap, the current study…

  5. Professional identity acquisition process model in interprofessional education using structural equation modelling: 10-year initiative survey.

    PubMed

    Kururi, Nana; Tozato, Fusae; Lee, Bumsuk; Kazama, Hiroko; Katsuyama, Shiori; Takahashi, Maiko; Abe, Yumiko; Matsui, Hiroki; Tokita, Yoshiharu; Saitoh, Takayuki; Kanaizumi, Shiomi; Makino, Takatoshi; Shinozaki, Hiromitsu; Yamaji, Takehiko; Watanabe, Hideomi

    2016-01-01

    The mandatory interprofessional education (IPE) programme at Gunma University, Japan, was initiated in 1999. A questionnaire of 10 items to assess the students' understanding of the IPE training programme has been distributed since then, and factor analysis of the responses revealed that it was categorised into four subscales, i.e. "professional identity", "structure and function of training facilities", "teamwork and collaboration", and "role and responsibilities", and suggested that these subscales may reflect the development of an IPE programme that includes clinical training. The purpose of this study was to examine the professional identity acquisition process (PIAP) model in IPE using structural equation modelling (SEM). Overall, 1,581 respondents of a possible 1,809 students from the departments of nursing, laboratory sciences, physical therapy, and occupational therapy completed the questionnaire. The SEM technique was utilised to construct a PIAP model of the relationships among the four factors. The original PIAP model showed that "professional identity" was predicted by two factors, namely "role and responsibilities" and "teamwork and collaboration". These two factors were in turn predicted by the factor "structure and function of training facilities". The same structure was observed in nursing and physical therapy students' PIAP models, but not completely in laboratory sciences and occupational therapy students' PIAP models. A parallel, rather than isolated, curriculum on expertise unique to each profession, which may help students understand their professional identity while learning collaboration, may be necessary. PMID:26930464
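
The path structure of the PIAP model (structure/function predicting role/responsibilities and teamwork/collaboration, which in turn predict professional identity) can be illustrated with a toy regression sketch on synthetic data. This is a simple stand-in for full SEM (no latent variables or fit indices), and every coefficient below is invented.

```python
import random

def ols2(x1, x2, y):
    """OLS coefficients for y ~ b1*x1 + b2*x2 on (approximately) zero-mean data,
    via the 2x2 normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(b * b for b in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    s1y = sum(a * c for a, c in zip(x1, y))
    s2y = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

rng = random.Random(1)
n = 2000
sf = [rng.gauss(0, 1) for _ in range(n)]                    # structure & function
rr = [0.6 * v + rng.gauss(0, 0.5) for v in sf]              # role & responsibilities
tc = [0.5 * v + rng.gauss(0, 0.5) for v in sf]              # teamwork & collaboration
pi_ = [0.4 * a + 0.3 * b + rng.gauss(0, 0.5)                # professional identity
       for a, b in zip(rr, tc)]

b_rr, b_tc = ols2(rr, tc, pi_)   # recover the two paths into professional identity
print(b_rr, b_tc)
```

With enough synthetic respondents the fitted path coefficients land near the generating values (0.4 and 0.3), illustrating how the hypothesized causal paths are recovered from observed covariation.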

  6. Uav Photogrammetry with Oblique Images: First Analysis on Data Acquisition and Processing

    NASA Astrophysics Data System (ADS)

    Aicardi, I.; Chiabrando, F.; Grasso, N.; Lingua, A. M.; Noardo, F.; Spanò, A.

    2016-06-01

    In recent years, many studies have revealed the advantages of using airborne oblique images for obtaining improved 3D city models (e.g. including façades and building footprints). The data have usually been acquired by expensive airborne cameras installed on traditional aerial platforms. The purpose of this paper is to evaluate the possibility of acquiring and using oblique images, obtained by a UAV (Unmanned Aerial Vehicle) carrying traditional COTS (Commercial Off-the-Shelf) digital cameras (more compact and lighter than the devices generally used), for the 3D reconstruction of a historical building and the realization of a high-level-of-detail architectural survey. The critical issues of acquisition from a common UAV (flight planning strategies, ground control point and check point distribution and measurement, etc.) are described. Another important aspect considered was the possibility of using such systems as low-cost methods for obtaining complete information from an aerial point of view in emergency situations or, as in the present paper, in the cultural heritage application field. The data processing was realized using an SfM-based approach for point cloud generation: different dense image-matching algorithms implemented in commercial and open-source software were tested. The achieved results are analysed, and the discrepancies from reference LiDAR data are computed for a final evaluation. The system was tested on the S. Maria Chapel, a part of the Novalesa Abbey (Italy).

  7. Sensor Acquisition for Water Utilities: Survey, Down Selection Process, and Technology List

    SciTech Connect

    Alai, M; Glascoe, L; Love, A; Johnson, M; Einfeld, W

    2005-06-29

    The early detection of the biological and chemical contamination of water distribution systems is a necessary capability for securing the nation's water supply. Current and emerging early-detection technology capabilities and shortcomings need to be identified and assessed to provide government agencies and water utilities with an improved methodology for assessing the value of installing these technologies. The Department of Homeland Security (DHS) has tasked a multi-laboratory team to evaluate current and future needs to protect the nation's water distribution infrastructure by supporting an objective evaluation of current and new technologies. The LLNL deliverable from this Operational Technology Demonstration (OTD) was to assist in the development of a technology acquisition process for a water distribution early warning system. The technology survey includes a review of previous sensor surveys and current test programs and a compiled database of relevant technologies. In the survey paper we discuss previous efforts by governmental agencies, research organizations, and private companies. We provide a survey of previous sensor studies with regard to the use of Early Warning Systems (EWS) that includes earlier surveys, testing programs, and response studies. The list of sensor technologies was ultimately developed to assist in the recommendation of candidate technologies for laboratory and field testing. A set of recommendations for future sensor selection efforts has been appended to this document, as has a down selection example for a hypothetical water utility.

  8. Parameter identification of process simulation models as a means for knowledge acquisition and technology transfer

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Ifanti, Konstantina

    2012-12-01

    Process simulation models are usually empirical, so there is an inherent difficulty in their serving as carriers for knowledge acquisition and technology transfer, since their parameters have no physical meaning to facilitate verification of the dependence on production conditions; in such a case, a 'black box' regression model or a neural network might be used to simply connect input-output characteristics. In several cases, scientific/mechanistic models may prove valid, in which case parameter identification is required to find the independent/explanatory variables on which each parameter depends. This is a difficult task, since the phenomenological level at which each parameter is defined differs. In this paper, we have developed a methodological framework in the form of an algorithmic procedure to solve this problem. The main parts of this procedure are: (i) stratification of relevant knowledge in discrete layers immediately adjacent to the layer to which the initial model under investigation belongs, (ii) design of the ontology corresponding to these layers, (iii) elimination of the less relevant parts of the ontology by thinning, (iv) retrieval of the stronger interrelations between the remaining nodes within the revised ontological network, and (v) parameter identification taking into account the most influential interrelations revealed in (iv). The functionality of this methodology is demonstrated with two representative case examples on wastewater treatment.

  9. Micro-MRI-based image acquisition and processing system for assessing the response to therapeutic intervention

    NASA Astrophysics Data System (ADS)

    Vasilić, B.; Ladinsky, G. A.; Saha, P. K.; Wehrli, F. W.

    2006-03-01

    Osteoporosis is the cause of over 1.5 million bone fractures annually. Most of these fractures occur in sites rich in trabecular bone, a complex network of bony struts and plates found throughout the skeleton. The three-dimensional structure of the trabecular bone network significantly determines mechanical strength and thus fracture resistance. Here we present a data acquisition and processing system that allows efficient noninvasive assessment of trabecular bone structure through a "virtual bone biopsy". High-resolution MR images are acquired, from which the trabecular bone network is extracted by estimating the partial bone occupancy of each voxel. A heuristic voxel subdivision increases the effective resolution of the bone volume fraction map and serves as a basis for subsequent analysis of topological and orientational parameters. Semi-automated registration and segmentation ensure selection of the same anatomical location in subjects imaged at different time points during treatment. It is shown, with excerpts from an ongoing clinical study of early post-menopausal women, that significant reduction in network connectivity occurs in the control group while structural integrity is maintained in the hormone replacement group. The system described should be well suited for large-scale studies designed to evaluate the efficacy of therapeutic intervention in subjects with metabolic bone disease.

  10. Meteoceanographic premises for structural design purposes in the Adriatic Sea: Acquisition and processing of data

    SciTech Connect

    Rampolli, M.; Biancardi, A.; Filippi, G. De

    1996-12-31

    In 1993 the leading international standards (ISO, API RP2A) for the design of offshore structures drastically changed the procedure for the definition of hydrodynamic forces. In particular, oil companies are required to have a detailed knowledge of the weather in the areas where they operate if they want to maintain the previous results. Alternatively, more conservative hydrodynamic forces must be considered in the design phase. Such an increase, estimated at 20-30% of the total hydrodynamic force, means heavier platform structures in new projects and more critical elements to be inspected in existing platforms. In 1992, in order to have more reliable and safe transport to and from the platforms, Agip installed a meteo-marine sensor network in the Adriatic Sea, on 13 of the over 80 producing platforms. Data collected are sent to shore via radio, and operators can use real-time data or a 12-hour wave forecast obtained from a statistical forecasting model. Taking advantage of these existing instruments, a project was undertaken in 1993 with the purpose of determining the extreme environmental parameters to be used by structural engineers. The network has been upgraded in order to obtain directional information on the waves and to permit short-term analysis. This paper describes the data acquisition system, the data processing, and the achieved results.

  11. Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.

    PubMed

    Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar

    2016-02-01

    In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [(18)F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable, with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real-time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5× higher for the LS- and ML-based CPEEA.
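
A COG-based position and energy estimate for a single scintillation event can be sketched as below. The 3×3 pixel grid, the photon counts, and the use of summed counts as an energy proxy are illustrative assumptions, not the DAPA implementation.

```python
def cog_position(pixel_counts):
    """Center-of-gravity estimate of a scintillation event's position on a
    photodetector pixel grid (rows x cols of photon counts)."""
    total = sum(sum(row) for row in pixel_counts)
    if total == 0:
        raise ValueError("empty event")
    # weight each row/column index by the photon counts it collected
    y = sum(r * sum(row) for r, row in enumerate(pixel_counts)) / total
    x = sum(c * v for row in pixel_counts for c, v in enumerate(row)) / total
    energy = total  # summed counts as a simple energy proxy
    return x, y, energy

# hypothetical event: light shared among a bright pixel and its neighbors
event = [
    [0, 1, 0],
    [1, 6, 2],
    [0, 2, 0],
]
print(cog_position(event))
```

The estimate is pulled slightly toward the right and bottom neighbors that collected more light, which is exactly the sub-pixel interpolation the COG approach provides; LS- and ML-based CPEEAs replace this weighting with model fits against calibrated light distributions.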

  12. Adaptive Sampling for Learning Gaussian Processes Using Mobile Sensor Networks

    PubMed Central

    Xu, Yunfei; Choi, Jongeun

    2011-01-01

    This paper presents a novel class of self-organizing sensing agents that adaptively learn an anisotropic, spatio-temporal Gaussian process using noisy measurements and move in order to improve the quality of the estimated covariance function. This approach is based on a class of anisotropic covariance functions of Gaussian processes introduced to model a broad range of spatio-temporal physical phenomena. The covariance function is assumed to be unknown a priori. Hence, it is estimated by the maximum a posteriori probability (MAP) estimator. The prediction of the field of interest is then obtained based on the MAP estimate of the covariance function. An optimal sampling strategy is proposed to minimize the information-theoretic cost function of the Fisher Information Matrix. Simulation results demonstrate the effectiveness and the adaptability of the proposed scheme. PMID:22163785
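
The prediction step described above can be illustrated with a minimal GP posterior-mean computation using a squared-exponential covariance, the isotropic special case of the anisotropic family the paper discusses. The hyperparameters here are fixed by hand rather than MAP-estimated, and the two-point training set is invented for illustration.

```python
import math

def sq_exp(x1, x2, ell=1.0, sigma=1.0):
    """Squared-exponential covariance between two scalar inputs."""
    return sigma ** 2 * math.exp(-((x1 - x2) ** 2) / (2 * ell ** 2))

def gp_predict(xs, ys, xstar, noise=1e-6, ell=1.0):
    """GP posterior mean at xstar for exactly two training points,
    using an explicit 2x2 inverse of the (noise-regularized) Gram matrix."""
    k11 = sq_exp(xs[0], xs[0], ell) + noise
    k22 = sq_exp(xs[1], xs[1], ell) + noise
    k12 = sq_exp(xs[0], xs[1], ell)
    det = k11 * k22 - k12 * k12
    # alpha = K^{-1} y
    a0 = (k22 * ys[0] - k12 * ys[1]) / det
    a1 = (-k12 * ys[0] + k11 * ys[1]) / det
    # posterior mean: k(xstar, X) . alpha
    ks = [sq_exp(xstar, xs[0], ell), sq_exp(xstar, xs[1], ell)]
    return ks[0] * a0 + ks[1] * a1

m = gp_predict([0.0, 1.0], [0.0, 1.0], 0.5)
print(m)
```

The prediction at the midpoint falls between the two observed values, and with near-zero noise the posterior mean interpolates the training points; in the paper's scheme the covariance hyperparameters would instead be MAP-estimated from the agents' noisy measurements before this prediction step.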

  13. System design, development, and production process modeling: A versatile and powerful acquisition management decision support tool

    SciTech Connect

    Rafuse, H.E.

    1996-12-31

    A series of studies have been completed on the manufacturing operations of light, medium, and heavy tactical vehicle system producers to facilitate critical system acquisition resource decisions by the United States Army Program Executive Officer, Tactical Wheeled Vehicles. The principal programs were the Family of Medium Tactical Vehicles (FMTV) production programs at Stewart & Stevenson Services, Inc.; the heavy TWV production programs at the Oshkosh Truck Corporation in Oshkosh, Wisconsin; and the light TWV and 2.5 ton remanufacturing production programs at the AM General Corporation in South Bend, Indiana. Each contractor's production scenarios were analyzed and modeled to accurately quantify the relationship between production rates and unit costs. Specific objectives included identifying (1) Minimum Sustaining Rates to support current and future budgetary requirements and resource programming for potential follow-on procurements, (2) thresholds where production rate changes significantly affect unit costs, and (3) critical production program factors and their impacts on production rate versus unit cost relationships. Two different techniques were utilized initially in conducting the analyses. One technique principally focused on collecting and analyzing applicable historical production program information, where available, to develop a statistical predictive model. A second and much more exhaustive technique focused on a detailed modeling of each contractor's production processes, flows, and operations. A standard architecture of multiple linked functional modules was used for each process model. Using the standard architecture, the individual modules were tailored to specific contractor operations. Each model contains detailed information on manpower, burden rates, material, material price/quantity relationships, capital, manufacturing support, program management, and all related direct and indirect costs applicable to the production programs.

  14. The OSIRIS-REx Mission Sample Site Selection Process

    NASA Astrophysics Data System (ADS)

    Beshore, Edward C.; Lauretta, Dante

    2014-11-01

    In September of 2016, the OSIRIS-REx (Origins, Spectral Interpretation, Resource Identification, Security, REgolith eXplorer) spacecraft will depart for asteroid (101955) Bennu, and in doing so, will turn an important corner in the exploration of the solar system. After arriving at Bennu in the fall of 2018, OSIRIS-REx will undertake a program of observations designed to select a site suitable for retrieving a sample that will be returned to the Earth in 2023. The third mission in NASA's New Frontiers program, OSIRIS-REx will return over 60 grams of material from Bennu's surface. OSIRIS-REx is unique because the science team will have an operational role to play in preparing data products needed to select a sample site. These include products used to ensure flight system safety — topographic maps and shape models, temperature measurements, maps of hazards — as well as assessments of sampleability and science value. The timing and production of these products will be presented, as will the high-level decision-making tools and processes for interim and final site selection.

  15. Proceedings of the XIIIth IAGA Workshop on Geomagnetic Observatory Instruments, Data Acquisition, and Processing

    USGS Publications Warehouse

    Love, Jeffrey J.

    2009-01-01

    The thirteenth biennial International Association of Geomagnetism and Aeronomy (IAGA) Workshop on Geomagnetic Observatory Instruments, Data Acquisition and Processing was held in the United States for the first time on June 9-18, 2008. Hosted by the U.S. Geological Survey's (USGS) Geomagnetism Program, the workshop's measurement session was held at the Boulder Observatory and the scientific session was held on the campus of the Colorado School of Mines in Golden, Colorado. More than 100 participants came from 36 countries and 6 continents. Preparation for the workshop began when the USGS Geomagnetism Program agreed, at the close of the twelfth workshop in Belsk Poland in 2006, to host the next workshop. Working under the leadership of Alan Berarducci, who served as the chairman of the local organizing committee, and Tim White, who served as co-chairman, preparations began in 2007. The Boulder Observatory was extensively renovated and additional observation piers were installed. Meeting space on the Colorado School of Mines campus was arranged, and considerable planning was devoted to managing the many large and small issues that accompany an international meeting. Without the devoted efforts of both Alan and Tim, other Geomagnetism Program staff, and our partners at the Colorado School of Mines, the workshop simply would not have occurred. We express our thanks to Jill McCarthy, the USGS Central Region Geologic Hazards Team Chief Scientist; Carol A. Finn, the Group Leader of the USGS Geomagnetism Program; the USGS International Office; and Melody Francisco of the Office of Special Programs and Continuing Education of the Colorado School of Mines. We also thank the student employees that the Geomagnetism Program has had over the years and leading up to the time of the workshop. For preparation of the proceedings, thanks go to Eddie and Tim. And, finally, we thank our sponsors, the USGS, IAGA, and the Colorado School of Mines.

  16. A real time dynamic data acquisition and processing system for velocity, density, and total temperature fluctuation measurements

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1991-01-01

The real-time Dynamic Data Acquisition and Processing System (DDAPS), which provides the capability for simultaneous measurement of velocity, density, and total temperature fluctuations, is described. The system of hardware and software is described in the context of the wind tunnel environment. The DDAPS replaces both a recording mechanism and a separate data processing system. DDAPS receives input from hot wire anemometers. Amplifiers and filters condition the signals with computer-controlled modules. The analog signals are simultaneously digitized and digitally recorded on disk. Automatic acquisition collects the necessary calibration and environment data. Hot wire sensitivities are generated and applied to the hot wire data to compute fluctuations. The presentation of the raw and processed data is accomplished on demand. The interface to DDAPS is described along with its internal mechanisms. A summary of operations relevant to the use of the DDAPS is also provided.
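The sensitivity step described above, in which hot-wire sensitivities are applied to the conditioned signals to recover velocity, density, and total-temperature fluctuations, amounts to solving a small linear system per sample: each wire's voltage fluctuation is a weighted sum of the three flow fluctuations. A minimal sketch of that inversion (the sensitivity values, variable names, and `solve3` helper are illustrative assumptions, not taken from the DDAPS report):

```python
def solve3(a, b):
    """Cramer's rule for a 3x3 linear system a @ x = b."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(a)
    # Replace column j with b, take the determinant ratio for each unknown
    return [det([[b[i] if k == j else a[i][k] for k in range(3)]
                 for i in range(3)]) / d for j in range(3)]

# Illustrative sensitivity matrix: rows = wires, columns = (u', rho', Tt')
S = [[2.0, 1.0, 0.5],
     [1.0, 3.0, 0.2],
     [0.5, 0.4, 4.0]]
true_fluct = [0.01, 0.002, 0.5]   # assumed velocity, density, temperature fluctuations
# Simulated wire voltage fluctuations: e' = S @ fluctuations
voltages = [sum(S[i][j] * true_fluct[j] for j in range(3)) for i in range(3)]
recovered = solve3(S, voltages)    # recovers the three flow fluctuations
```

In a real system the sensitivities vary with operating point, so the matrix would be rebuilt from calibration data for each run.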

  17. SAMPLING DEVICE FOR pH MEASUREMENT IN PROCESS STREAMS

    DOEpatents

    Michelson, C.E.; Carson, W.N. Jr.

    1958-11-01

A pH cell is presented for monitoring the hydrogen ion concentration of a fluid in a process stream. The cell is made of glass with a side entry arm just above a reservoir in which the ends of a glass electrode and a reference electrode are situated. The glass electrode contains the usual internal solution which is connected to a lead. The reference electrode is formed of saturated calomel having a salt bridge in its bottom portion fabricated of a porous glass to insure low electrolyte flow. A flush tube leads into the cell through which buffer and flush solutions are introduced. A ground wire twists about both electrode ends to insure constant electrical grounding of the sample. The electrode leads are electrically connected to a pH meter of any standard type.

  18. Sampling Designs in Qualitative Research: Making the Sampling Process More Public

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Leech, Nancy L.

    2007-01-01

    The purpose of this paper is to provide a typology of sampling designs for qualitative researchers. We introduce the following sampling strategies: (a) parallel sampling designs, which represent a body of sampling strategies that facilitate credible comparisons of two or more different subgroups that are extracted from the same levels of study;…

  19. MoonRise: A US Robotic Sample-Return Mission to Address Solar System Wide Processes

    NASA Astrophysics Data System (ADS)

    Jolliff, Bradley; Warren, P. H.; Shearer, C. K.; Alkalai, L.; Papanastassiou, D. A.; Huertas, A.; MoonRise Team

    2010-10-01

    The MoonRise lunar sample-return mission is currently funded to perform a Phase A Concept Study as part of NASA's New Frontiers Program. Exploration of the great (d = 2500 km) South Pole-Aitken basin has been assigned high priority in several NRC reports. MoonRise would be the first US robotic sample-return mission from another planetary surface. Key strengths of the MoonRise mission include: 1. Most importantly, MoonRise will sample the SPA basin's interior on the Moon's southern far side, instead of the same small region near the center of the near side as all previous (Apollo and Luna) sampling missions. Science objectives for the SPA sample-return mission fall into three main categories: (1) testing the impact cataclysm hypothesis, with its profound implications for the evolution of the Solar System and for life on the Earth at 3.9 Ga; (2) constraining basin-scale impact processes; and (3) constraining how the Moon's interior varies laterally on a global scale, and with depth on a scale of many tens of kilometers; and thus how the lunar crust formed and evolved. 2. MoonRise will greatly enhance scientific return by using a sieving mechanism to concentrate small rock fragments. As an example, for rocks ɳ mm in size (minimum dimension) and a target regolith of approximately average grain-size distribution, the acquisition yield will be improved by a factor of 50. 3. MoonRise will obtain a total of at least one kilogram of lunar material, including 100 g of bulk, unsieved soil for comparison with remote sensing data. 4. MoonRise will exploit data from LRO, Kaguya, Chandrayaan-1, and other recent remote-sensing missions, in particular LRO's Narrow Angle Camera (NAC), to ensure a safe landing by avoidance of areas with abundant boulders, potentially hazardous craters, and/or high slopes mapped from high resolution stereo images.

  20. Models, Processes, Principles, and Strategies: Second Language Acquisition in and out of the Classroom.

    ERIC Educational Resources Information Center

    Andersen, Roger W.

    1988-01-01

    A discussion of research on naturalistic second language acquisition (SLA) focuses on its relationship to the foreign language classroom context. It is argued that to attempt to relate natural SLA to classroom foreign language learning (FLL), a coherent and consistent theoretical framework is needed. The Cognitive-Interactionist Model is developed…

  1. The Representation and Processing of Familiar Faces in Dyslexia: Differences in Age of Acquisition Effects

    ERIC Educational Resources Information Center

    Smith-Spark, James H.; Moore, Viv

    2009-01-01

    Two under-explored areas of developmental dyslexia research, face naming and age of acquisition (AoA), were investigated. Eighteen dyslexic and 18 non-dyslexic university students named the faces of 50 well-known celebrities, matched for facial distinctiveness and familiarity. Twenty-five of the famous people were learned early in life, while the…

  2. Directed Blogging with Community College ESL Students: Its Effects on Awareness of Language Acquisition Processes

    ERIC Educational Resources Information Center

    Johnson, Cathy

    2012-01-01

    English as a Second Language (ESL) students often have problems progressing in their acquisition of the language and frequently do not know how to solve this dilemma. Many of them think of their second language studies as just another school subject that they must pass in order to move on to the next level, so few of them realize the metacognitive…

  3. Optionality in Second Language Acquisition: A Generative, Processing-Oriented Account

    ERIC Educational Resources Information Center

    Truscott, John

    2006-01-01

    The simultaneous presence in a learner's grammar of two features that should be mutually exclusive (optionality) typifies second language acquisition. But generative approaches have no good means of accommodating the phenomenon. The paper proposes one approach, based on Truscott and Sharwood Smith's (2004) MOGUL framework. In this framework,…

  4. Laser velocimeter data acquisition and real time processing using a microcomputer

    NASA Technical Reports Server (NTRS)

    Meyers, James F.

    1988-01-01

    An evolutionary data acquisition system for laser velocimeter applications is presented. The system uses a laser velocimeter (autocovariance) buffer interface to acquire the data, a WORM optical disk for storage, and a high-speed microcomputer for real time statistical computations.
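Real-time statistical computation of the kind mentioned above, keeping a running mean and variance of velocity samples as they arrive rather than storing the full record, can be sketched with Welford's online algorithm (a generic illustration; the report does not specify its statistics code):

```python
class OnlineStats:
    """Welford's online algorithm: numerically stable running mean/variance."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # running sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        """Sample variance (n-1 denominator)."""
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

# Example: velocity samples (m/s) processed one at a time as they arrive
stats = OnlineStats()
for v in [10.1, 9.8, 10.3, 10.0, 9.9]:
    stats.update(v)
# stats.mean and stats.variance are available after every update
```

Unlike the two-pass textbook formula, this update never revisits old samples, which is what makes statistics feasible while data are still streaming in.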

  5. Input-Based Tasks and the Acquisition of Vocabulary and Grammar: A Process-Product Study

    ERIC Educational Resources Information Center

    Shintani, Natsuko

    2012-01-01

    The study reported in this article investigated the use of input-based tasks with young, beginner learners of English as a second language by examining both learning outcomes (i.e. acquisition) and the interactions that resulted from implementing the tasks. The participants were 15 learners, aged six, with no experience of second language (L2)…

  6. Transition of NOAA's GPS-Met Data Acquisition and Processing System to the Commercial Sector

    NASA Astrophysics Data System (ADS)

    Jackson, M. E.; Holub, K.; Callahan, W.; Blatt, S.

    2014-12-01

In April of 2014, NOAA/OAR/ESRL Global Systems Division (GSD) and Trimble, in collaboration with Earth Networks, Inc. (ENI), signed a Cooperative Research and Development Agreement (CRADA) to transfer the existing NOAA GPS-Met Data Acquisition and Processing System (GPS-Met DAPS) technology to a commercial Trimble/ENI partnership. NOAA's GPS-Met DAPS is currently operated in a pseudo-operational mode but has proven highly reliable, running at over 95% uptime. The DAPS uses the GAMIT software to ingest dual-frequency carrier phase GPS/GNSS observations and ancillary information such as real-time satellite orbits to estimate the zenith-scaled tropospheric signal delays (ZTD) and, where surface MET data are available, retrieve integrated precipitable water vapor (PWV). The NOAA data and products are made available to end users in near real-time. The Trimble/ENI partnership will use the Trimble Pivot™ software with the Atmosphere App to calculate zenith tropospheric delay (ZTD), tropospheric slant delay, and integrated precipitable water vapor (PWV). Evaluation of the Trimble software is underway, starting with a comparison of ZTD and PWV values determined from GPS stations located near NOAA Radiosonde Observation (Upper-Air Observation) launch sites. A success metric was established that requires Trimble's PWV estimates to match ESRL/GSD's to within 1.5 mm 95% of the time, which corresponds to a ZTD uncertainty of less than 10 mm 95% of the time. Initial results indicate that Trimble/ENI data meet and exceed the ZTD metric, but for some stations PWV estimates are out of specification. These discrepancies are primarily due to how offsets between MET and GPS stations are handled and are easily resolved. Additional test networks are proposed that include low terrain/high moisture variability stations, high terrain/low moisture variability stations, as well as high terrain/high moisture variability stations. We will present results from further testing along with a timeline
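The ZTD-to-PWV retrieval mentioned above follows a standard two-step recipe: subtract a modeled zenith hydrostatic delay (e.g., Saastamoinen) from the ZTD to get the zenith wet delay, then scale the wet delay by a dimensionless conversion factor built from a mean-temperature approximation (Bevis et al.). A hedged sketch with approximate textbook constants; the function names and the example station values are illustrative, not GSD's or Trimble's implementation:

```python
import math

def zhd_saastamoinen(p_hpa, lat_deg, h_m):
    """Zenith hydrostatic delay in metres (Saastamoinen model)."""
    return 0.0022768 * p_hpa / (
        1 - 0.00266 * math.cos(2 * math.radians(lat_deg)) - 2.8e-7 * h_m)

def pwv_from_ztd(ztd_m, p_hpa, ts_k, lat_deg, h_m):
    """Convert zenith total delay to precipitable water vapor (metres)."""
    zwd = ztd_m - zhd_saastamoinen(p_hpa, lat_deg, h_m)   # zenith wet delay
    tm = 70.2 + 0.72 * ts_k          # Bevis mean-temperature approximation (K)
    k2p, k3 = 22.1, 3.739e5          # refractivity constants (K/hPa, K^2/hPa)
    rv, rho_w = 461.5, 1000.0        # J/(kg K), kg/m^3
    pi_factor = 1e8 / (rho_w * rv * (k3 / tm + k2p))      # ~0.15, dimensionless
    return pi_factor * zwd

# Hypothetical high-altitude station: ZTD 2.10 m, 840 hPa, 290 K, 40 deg N, 1600 m
pwv_mm = 1000.0 * pwv_from_ztd(2.10, 840.0, 290.0, 40.0, 1600.0)
```

The ~0.15 scale factor is why the 1.5 mm PWV success metric in the abstract corresponds to roughly a 10 mm ZTD uncertainty.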

  7. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
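The "familiar Laplacian pyramid" referred to above decomposes a signal into band-pass residuals by repeatedly blurring, downsampling, upsampling, and subtracting; the decomposition is exactly invertible. A minimal one-dimensional sketch (illustrative only, not the focal-plane implementation the authors describe):

```python
def blur(x):
    """Simple [1,2,1]/4 smoothing with edge replication."""
    padded = [x[0]] + x + [x[-1]]
    return [(padded[i - 1] + 2 * padded[i] + padded[i + 1]) / 4
            for i in range(1, len(x) + 1)]

def down(x):      # keep every other sample
    return x[::2]

def up(x, n):     # nearest-neighbour upsample back to length n
    out = []
    for v in x:
        out += [v, v]
    return out[:n]

def laplacian_pyramid(x, levels):
    """Band-pass residuals per level, plus the coarsest low-pass signal."""
    pyr = []
    for _ in range(levels):
        g = down(blur(x))
        pyr.append([a - b for a, b in zip(x, up(g, len(x)))])
        x = g
    pyr.append(x)  # coarsest level
    return pyr

def reconstruct(pyr):
    """Invert the pyramid: upsample and add residuals, coarsest first."""
    x = pyr[-1]
    for residual in reversed(pyr[:-1]):
        x = [r + u for r, u in zip(residual, up(x, len(residual)))]
    return x

signal = [3.0, 5.0, 4.0, 8.0, 2.0, 6.0, 7.0, 1.0]
pyr = laplacian_pyramid(signal, 2)
# reconstruct(pyr) returns the original signal exactly
```

The residual levels carry only local detail, which is what makes the pyramid attractive for the low-level, retina-like coding the abstract alludes to.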

  8. Recent Results of the Investigation of a Microfluidic Sampling Chip and Sampling System for Hot Cell Aqueous Processing Streams

    SciTech Connect

    Julia Tripp; Jack Law; Tara Smith

    2013-10-01

A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next-generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially, sampling technologies were evaluated, and microfluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse, thus increasing sampling reproducibility/accuracy. The microfluidic-based robotic sampling system’s mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of microfluidic sampling chips.

  9. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

Introduction Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely operator’s (acquisition) impact on the results obtained from image analysis and processing, has been shown on a few examples. Material and method The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200,000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. Results For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient’s back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects – error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18

  10. Integrating data acquisition and offline processing systems for small experiments at Fermilab

    SciTech Connect

    Streets, J.; Corbin, B.; Taylor, C.

    1995-10-01

Two small experiments at Fermilab are using the large UNIX central computing facility at Fermilab (FNALU) to analyze data. The data acquisition systems are based on "off the shelf" software packages utilizing VAX/VMS computers and CAMAC readout. As the disk space available on FNALU approaches the size of the raw data sets taken by the experiments (50 Gbytes), we have used the Andrew File System (AFS) to serve the data to experimenters for analysis.

  11. DIADEM--a system for the interactive data acquisition and processing in an analytical laboratory.

    PubMed

    Peters, F; Teschner, W

    1979-09-01

    A conversational program for the acquisition of experimental data in a multi-user, multi-instrument computer system is described. It assists the researcher when recording on-time data. Due to the simple structure of the dialogue, no special knowledge of computer handling is required by the experimenter. Whereas the experimental methods are versatile, a uniform concept of the dialogue and the file structure is realized. PMID:487779

  12. Hippocampal Context Processing during Acquisition of a Predictive Learning Task Is Associated with Renewal in Extinction Recall.

    PubMed

    Lissek, Silke; Glaubitz, Benjamin; Schmidt-Wilcke, Tobias; Tegenthoff, Martin

    2016-05-01

    Renewal is defined as the recovery of an extinguished response if extinction and retrieval contexts differ. The context dependency of extinction, as demonstrated by renewal, has important implications for extinction-based therapies. Persons showing renewal (REN) exhibit higher hippocampal activation during extinction in associative learning than those without renewal (NOREN), demonstrating hippocampal context processing, and recruit ventromedial pFC in retrieval. Apart from these findings, brain processes generating renewal remain largely unknown. Conceivably, processing differences in task-relevant brain regions that ultimately lead to renewal may occur already in initial acquisition of associations. Therefore, in two fMRI studies, we investigated overall brain activation and hippocampal activation in REN and NOREN during acquisition of an associative learning task in response to presentation of a context alone or combined with a cue. Results of two studies demonstrated significant activation differences between the groups: In Study 1, a support vector machine classifier correctly assigned participants' brain activation patterns to REN and NOREN groups, respectively. In Study 2, REN and NOREN showed similar hippocampal involvement during context-only presentation, suggesting processing of novelty, whereas overall hippocampal activation to the context-cue compound, suggesting compound encoding, was higher in REN. Positive correlations between hippocampal activation and renewal level indicated more prominent hippocampal processing in REN. Results suggest that hippocampal processing of the context-cue compound rather than of context only during initial learning is related to a subsequent renewal effect. Presumably, REN participants use distinct encoding strategies during acquisition of context-related tasks, which reflect in their brain activation patterns and contribute to a renewal effect. PMID:26807840

  13. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 4 2013-04-01 2013-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... appropriate samples of in-process materials of each batch. Such control procedures shall be established...

  14. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... appropriate samples of in-process materials of each batch. Such control procedures shall be established...

  15. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 4 2012-04-01 2012-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... appropriate samples of in-process materials of each batch. Such control procedures shall be established...

  16. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 4 2014-04-01 2014-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... appropriate samples of in-process materials of each batch. Such control procedures shall be established...

  17. Automated ground data acquisition and processing system for calibration and performance assessment of the EO-1 Advanced Land Imager

    NASA Astrophysics Data System (ADS)

    Viggh, Herbert E. M.; Mendenhall, Jeffrey A.; Sayer, Ronald W.; Stuart, J. S.; Gibbs, Margaret D.

    1999-09-01

The calibration and performance assessment of the Earth Observing-1 (EO-1) Advanced Land Imager (ALI) required a ground data system for acquiring and processing ALI data. In order to meet tight schedule and budget requirements, an automated system was developed that could be run by a single operator. This paper describes the overall system and the individual Electrical Ground Support Equipment (EGSE) and computer components used. The ALI Calibration Control Node (ACCN) serves as a test executive with a single graphical user interface to the system, controlling calibration equipment and issuing data acquisition and processing requests to the other EGSE and computers. EGSE1, a custom data acquisition system, collects ALI science data and also passes ALI commanding and housekeeping telemetry collection requests to EGSE2 and EGSE3, which are implemented on an ASIST workstation. The performance assessment machine stores and processes collected ALI data, automatically displaying quick-look processing results. The custom communications protocol developed to interface these various machines and to automate their interactions is described, including the various modes of operation needed to support spatial, radiometric, spectral, and functional calibration and performance assessment of the ALI.

  18. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    NASA Astrophysics Data System (ADS)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post-conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  19. Fast nearly ML estimation of Doppler frequency in GNSS signal acquisition process.

    PubMed

    Tang, Xinhua; Falletti, Emanuela; Lo Presti, Letizia

    2013-01-01

    It is known that signal acquisition in Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on the simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario where there is no noise and other nuisances, the frequency estimation error can be theoretically reduced to zero. On the other hand, in the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB) which is derived as benchmark. For comparison, a least-square (LS) method is proposed. It is shown that the proposed solution achieves the same performance of LS, but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both time and frequency domains. PMID:23628761

  20. Fast Nearly ML Estimation of Doppler Frequency in GNSS Signal Acquisition Process

    PubMed Central

    Tang, Xinhua; Falletti, Emanuela; Presti, Letizia Lo

    2013-01-01

    It is known that signal acquisition in Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on the simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario where there is no noise and other nuisances, the frequency estimation error can be theoretically reduced to zero. On the other hand, in the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB) which is derived as benchmark. For comparison, a least-square (LS) method is proposed. It is shown that the proposed solution achieves the same performance of LS, but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both time and frequency domains. PMID:23628761
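The refinement idea in this paper, exploiting CAF values around the grid peak rather than taking the peak bin alone, can be illustrated with a generic three-point parabolic interpolation over the frequency search grid. This is a standard stand-in, not the authors' CAF-ratio algorithm, and the sinc envelope, bin spacing, and Doppler value below are assumptions for the demonstration:

```python
import math

def caf_magnitude(f_err_hz, t_coh=1e-3):
    """|sinc| envelope of the CAF along frequency for coherent time t_coh."""
    x = math.pi * f_err_hz * t_coh
    return abs(math.sin(x) / x) if x else 1.0

def refine_doppler(freqs, mags):
    """Parabolic refinement of the peak of a frequency search grid."""
    k = max(range(len(mags)), key=mags.__getitem__)
    if k == 0 or k == len(mags) - 1:
        return freqs[k]               # peak on a grid edge: no refinement
    y0, y1, y2 = mags[k - 1], mags[k], mags[k + 1]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)   # vertex offset in bins
    return freqs[k] + delta * (freqs[1] - freqs[0])

# Simulated noiseless acquisition: true Doppler 1230 Hz, 500 Hz bin spacing
true_fd = 1230.0
grid = [i * 500.0 for i in range(-10, 11)]          # -5000 .. +5000 Hz
mags = [caf_magnitude(f - true_fd) for f in grid]
coarse = grid[max(range(len(mags)), key=mags.__getitem__)]
fine = refine_doppler(grid, mags)
# fine is far closer to 1230 Hz than the coarse 500 Hz-grid peak
```

As in the paper's noiseless case, the residual error comes only from the mismatch between the interpolating model and the true CAF shape; with noise, averaging over multiple estimates (as the authors propose) would mitigate the scatter.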

  1. Mechanical Abrasion as a Low Cost Technique for Contamination-Free Sample Acquisition from a Category IVA Clean Platform

    NASA Technical Reports Server (NTRS)

    Dolgin, B.; Yarbrough, C.; Carson, J.; Troy, R.

    2000-01-01

The proposed Mars Sample Transfer Chain Architecture provides Planetary Protection Officers with the clean samples that are required for the eventual release from confinement of the returned Martian samples. At the same time, no absolute cleanliness and sterility requirement is placed on any part of the Lander (including the deep drill), Mars Ascent Vehicle (MAV), any part of the Orbiting Sample container (OS), Rover mobility platform, any part of the Minicorer, Robotic arm (including instrument sensors), or most of the caching equipment on the Rover. The removal of strict requirements in excess of Category IVa cleanliness (Pathfinder clean) is expected to lead to significant cost savings. The proposed architecture assumes that cross-contamination renders all surfaces in the vicinity of the rover(s) and the lander(s) contaminated. Thus, no accessible surface of Martian rocks or soil is free of Earth contamination. As a result, only subsurface samples (either rock or soil) can and will be collected for eventual return to Earth. Uncontaminated samples can be collected from a Category IVa clean platform. Both subsurface soil and rock samples can be kept clean if they are collected by devices that are self-contained and clean and sterile inside only. The top layer of the sample is removed in a manner that does not contaminate the collection tools. A biobarrier (e.g., aluminum foil) covering the moving parts of these devices may be used as the only self-removing bio-blanket required. The samples never leave the collection tools. The lids are placed on these tools inside the collection device. These single-use tools, with the lid and the sample inside, are brought to Earth in the OS. The lids have to be designed to be impenetrable to Earth organisms; the latter is a well-established art.

  2. ACQUISITION AND ANALYSIS OF GROUNDWATER/AQUIFER SAMPLES: CURRENT TECHNOLOGY AND THE TRADE OFF BETWEEN QUALITY ASSURANCE AND PRACTICAL CONSIDERATIONS

    EPA Science Inventory

In the migration of a high-organic-carbon-content landfill leachate through the subsurface environment, the mobility of inorganic contaminants can be seriously influenced by oxidation-reduction, complexation, precipitation and adsorption processes. These processes, in turn, depend...

  3. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

A pulsed 2-micron coherent Doppler lidar system at NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the Doppler Aerosol Wind Lidar (DAWN) Air project. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software employed during the GRIP campaign is introduced with its unique features.

  4. A strong-motion network in Northern Italy (RAIS): data acquisition and processing

    NASA Astrophysics Data System (ADS)

Augliera, Paolo; D'Alema, Ezio; Marzorati, Simone; Massa, Marco

    2010-05-01

    The need for a dense network in Northern Italy arose from the lack of available data after the occurrence of the 24 November 2004, Ml 5.2, Salò earthquake. Since 2006 many efforts have been made by the INGV (Italian National Institute for Geophysics and Volcanology), department of Milano-Pavia (hereinafter INGV MI-PV), to improve the strong-motion monitoring of the Northern Italy regions. At the end of 2007, the RAIS (Strong-Motion Network in Northern Italy) included 19 stations equipped with Kinemetrics Episensor FBA ES-T sensors coupled with five 20-bit Lennartz Mars88/MC and 14 24-bit Reftek 130-01 seismic recorders. In this phase, we reduced the average inter-station distance in the study area from about 40 km to 15 km. During this period a GSM-modem connection between the INGV MI-PV acquisition center and the remote stations was used. Starting in 2008, in order to assure real-time recordings and to integrate RAIS data into the calculation of the Italian ground-shaking maps, the main activity was devoted to updating the data acquisition of the RAIS strong-motion network. Moreover, a phase has begun that will replace the original recorders with 24-bit GAIA2 systems (produced directly by the INGV-CNT laboratory, Rome). Today 11 out of the 22 stations are already equipped with GAIA2, and their original GSM-modem acquisition systems have been replaced with real-time connections based on TCP/IP or Wi-Fi links. All real-time stations store data in the MiniSEED format. Management and data exchange are assured by the SeedLink and Earthworm packages. Metadata dissemination is achieved through the website, where the computed strong-motion parameters, together with the amplification functions, are available for each recording station and each recorded event. The waveforms for earthquakes with local magnitude higher than 3.0 are now collected in the ITalian ACcelerometric Archive (http://itaca.mi.ingv.it).
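
    The strong-motion parameters published per station typically include quantities such as peak ground acceleration (PGA) and Arias intensity. A minimal sketch of such computations follows; the synthetic record and sample rate are invented for illustration, and this is not the actual RAIS processing chain:

```python
import math

def pga(acc):
    """Peak ground acceleration: largest absolute sample (same units as input)."""
    return max(abs(a) for a in acc)

def arias_intensity(acc, dt):
    """Arias intensity Ia = (pi / 2g) * integral of a(t)^2 dt, trapezoidal rule."""
    g = 9.81
    integral = sum((acc[i] ** 2 + acc[i + 1] ** 2) / 2.0 * dt
                   for i in range(len(acc) - 1))
    return math.pi / (2.0 * g) * integral

# Synthetic "record": a 1 Hz sine of 0.5 m/s^2 amplitude sampled at 100 Hz for ~2 s.
dt = 0.01
acc = [0.5 * math.sin(2 * math.pi * 1.0 * i * dt) for i in range(200)]
print(pga(acc))  # → 0.5
print(arias_intensity(acc, dt))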

  5. Early Syntactic Acquisition.

    ERIC Educational Resources Information Center

    Kelley, K. L.

    This paper is a study of a child's earliest pretransformational language acquisition processes. A model is constructed based on the assumptions (1) that syntactic acquisition occurs through the testing of hypotheses reflecting the initial structure of the acquisition mechanism and the language data to which the child is exposed, and (2) that…

  6. Programme evaluation training for health professionals in francophone Africa: process, competence acquisition and use

    PubMed Central

    Ridde, Valéry; Fournier, Pierre; Banza, Baya; Tourigny, Caroline; Ouédraogo, Dieudonné

    2009-01-01

    Background While evaluation is, in theory, a component of training programmes in health planning, training needs in this area remain significant. Improving health systems necessarily calls for having more professionals who are skilled in evaluation. Thus, the Université de Ouagadougou (Burkina Faso) and the Université de Montréal (Canada) have partnered to establish, in Burkina Faso, a master's-degree programme in population and health with a course in programme evaluation. This article describes the four-week (150-hour) course taken by two cohorts (2005–2006/2006–2007) of health professionals from 11 francophone African countries. We discuss how the course came to be, its content, its teaching processes and the master's programme results for students. Methods The conceptual framework was adapted from Kirkpatrick's (1996) four-level evaluation model: reaction, learning, behaviour, results. Reaction was evaluated based on a standardized questionnaire for all the master's courses and lessons. Learning and behaviour competences were assessed by means of a questionnaire (pretest/post-test, one year after) adapted from the work of Stevahn L, King JA, Ghere G, Minnema J: Establishing Essential Competencies for Program Evaluators. Am J Eval 2005, 26(1):43–59. Master's programme effects were tested by comparing the difference in mean scores between times (before, after, one year after) using pretest/post-test designs. Paired sample tests were used to compare mean scores. Results The teaching is skills-based, interactive and participative. Students of the first cohort gave the evaluation course the highest score (4.4/5) for overall satisfaction among the 16 courses (3.4–4.4) in the master's programme. What they most appreciated was that the forms of evaluation were well adapted to the content and format of the learning activities. By the end of the master's programme, both cohorts of students considered that they had greatly improved their mastery of the 60

  7. The Blanco Cosmology Survey: Data Acquisition, Processing, Calibration, Quality Diagnostics and Data Release

    SciTech Connect

    Desai, S.; Armstrong, R.; Mohr, J.J.; Semler, D.R.; Liu, J.; Bertin, E.; Allam, S.S.; Barkhouse, W.A.; Bazin, G.; Buckley-Geer, E.J.; Cooper, M.C.

    2012-04-01

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, −55°) and (23 hr, −55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out PSF-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6) and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 milli-arcsec. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from 2MASS, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematics floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7% and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1+z) = 0.054 with an outlier fraction η < 5% to z ≈ 1. We highlight some selected science results to date and provide a full description of the released data products.

  8. THE BLANCO COSMOLOGY SURVEY: DATA ACQUISITION, PROCESSING, CALIBRATION, QUALITY DIAGNOSTICS, AND DATA RELEASE

    SciTech Connect

    Desai, S.; Mohr, J. J.; Semler, D. R.; Liu, J.; Bazin, G.; Zenteno, A.; Armstrong, R.; Bertin, E.; Allam, S. S.; Buckley-Geer, E. J.; Lin, H.; Tucker, D.; Barkhouse, W. A.; Cooper, M. C.; Hansen, S. M.; High, F. W.; Lin, Y.-T.; Ngeow, C.-C.; Rest, A.; Song, J.

    2012-09-20

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, −55°) and (23 hr, −55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out point-spread-function-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 mas. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from the Two Micron All Sky Survey, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematic floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier spread_model produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1 + z) = 0.054 with an outlier fraction η < 5% to z ≈ 1. We highlight some selected science results to date and provide a full description of the released data products.
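
    The scatter δz/(1+z) and outlier fraction η quoted above can be computed from matched photometric/spectroscopic redshift pairs. A minimal sketch with synthetic data follows; the Gaussian error model and the 0.15 outlier cut are illustrative assumptions, not details taken from the survey:

```python
import math
import random

def photoz_metrics(z_phot, z_spec, outlier_cut=0.15):
    """Return (rms scatter of dz/(1+z), outlier fraction |dz|/(1+z) > cut)."""
    resid = [(zp - zs) / (1.0 + zs) for zp, zs in zip(z_phot, z_spec)]
    rms = math.sqrt(sum(r * r for r in resid) / len(resid))
    eta = sum(1 for r in resid if abs(r) > outlier_cut) / len(resid)
    return rms, eta

# Synthetic catalog: photo-z errors drawn as Gaussian with sigma = 0.05 * (1 + z).
rng = random.Random(42)
z_spec = [rng.uniform(0.1, 1.0) for _ in range(5000)]
z_phot = [zs + rng.gauss(0.0, 0.05) * (1.0 + zs) for zs in z_spec]
rms, eta = photoz_metrics(z_phot, z_spec)
print(rms, eta)  # rms close to the input 0.05; eta small
```
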

  9. The Use of Work Samples in the Vocational Evaluation Process.

    ERIC Educational Resources Information Center

    Nadolsky, Julian M.

    1981-01-01

    The author suggests that much of the current vocational evaluation technology does not fit within the definition of a work sample, and warns that if evaluators do not integrate their measurement tools with the discipline's philosophy, system publishers will gain control over vocational evaluation. (CL)

  10. 27 CFR 555.184 - Statements of process and samples.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... regard to any plastic explosive or to any detection agent that is to be introduced into a plastic explosive or formulated in such plastic explosive shall be submitted by a licensed manufacturer or licensed importer, upon request, to the Director. (b) Samples of any plastic explosive or detection agent shall...

  11. The effect of signal acquisition and processing choices on ApEn values: towards a "gold standard" for distinguishing effort levels from isometric force records.

    PubMed

    Forrest, Sarah M; Challis, John H; Winter, Samantha L

    2014-06-01

    Approximate entropy (ApEn) is frequently used to identify changes in the complexity of isometric force records with ageing and disease. Different signal acquisition and processing parameters have been used, making comparison or confirmation of results difficult. This study determined the effect of sampling and parameter choices by examining changes in ApEn values across a range of submaximal isometric contractions of the first dorsal interosseus. Reducing the sample rate by decimation changed both the value and pattern of ApEn values dramatically. The pattern of ApEn values across the range of effort levels was not sensitive to the filter cut-off frequency, or to the criterion used to extract the section of data for analysis. The complexity increased with increasing effort level using a fixed 'r' value (which accounts for measurement noise) but decreased with increasing effort level when 'r' was set to 0.1 of the standard deviation of the force. It is recommended that isometric force records be sampled at frequencies >200 Hz, that the template length ('m') be set to 2, and that 'r' be set to the measurement system noise or to 0.1 SD, depending on the physiological process to be distinguished. It is demonstrated that changes in ApEn across effort levels are related to changes in force gradation strategy. PMID:24725708
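
    Approximate entropy as used in this record follows Pincus's algorithm: ApEn = φ(m) − φ(m+1), with template matches counted under the Chebyshev distance and self-matches included. A minimal sketch with the recommended m = 2; the fixed tolerance r = 0.2 and the toy signals are illustrative choices, not the study's data:

```python
import math
import random

def approx_entropy(x, m=2, r=0.2):
    """Pincus approximate entropy: phi(m) - phi(m+1), self-matches included."""
    def phi(mm):
        n = len(x) - mm + 1
        templates = [x[i:i + mm] for i in range(n)]
        total = 0.0
        for t1 in templates:
            # Count templates within Chebyshev distance r (includes self-match).
            c = sum(1 for t2 in templates
                    if max(abs(a - b) for a, b in zip(t1, t2)) <= r)
            total += math.log(c / n)
        return total / n
    return phi(m) - phi(m + 1)

# A strictly periodic signal is highly regular: ApEn near 0.
periodic = [0.0, 1.0] * 50
# A random signal is irregular: ApEn substantially larger.
rng = random.Random(1)
noise = [rng.random() for _ in range(300)]
print(approx_entropy(periodic), approx_entropy(noise))  # periodic ~0, noise larger
```
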

  12. The origins of age of acquisition and typicality effects: Semantic processing in aphasia and the ageing brain.

    PubMed

    Räling, Romy; Schröder, Astrid; Wartenburger, Isabell

    2016-06-01

    Age of acquisition (AOA) has frequently been shown to influence response times and accuracy rates in word processing and constitutes a meaningful variable in aphasic language processing, while its origin in the language processing system is still under debate. To find out where AOA originates and whether and how it is related to another important psycholinguistic variable, namely semantic typicality (TYP), we studied healthy, elderly controls and semantically impaired individuals using semantic priming. For this purpose, we collected reaction times and accuracy rates as well as event-related potential data in an auditory category-member-verification task. The present results confirm a semantic origin of TYP, but question the same for AOA while favouring its origin at the phonology-semantics interface. The data are further interpreted in consideration of recent theories of ageing. PMID:27106392

  13. Data acquisition techniques for exploiting the uniqueness of the time-of-flight mass spectrometer: Application to sampling pulsed gas systems

    NASA Technical Reports Server (NTRS)

    Lincoln, K. A.

    1980-01-01

    Mass spectra are produced in most mass spectrometers by sweeping some parameter within the instrument as the sampled gases flow into the ion source. Any fluctuation in the gas during the sweep (mass scan) of the instrument therefore causes the output spectrum to be skewed in its mass peak intensities. The time-of-flight mass spectrometer (TOFMS), with its fast, repetitive mode of operation, produces spectra without sweeping or varying instrument parameters; because all ion species are ejected from the ion source simultaneously, the spectra are inherently not skewed despite rapidly changing gas pressure or composition in the source. Methods of exploiting this feature with fast digital data acquisition systems, such as commercially available transient recorders and signal averagers, are described. Applications of this technique are presented, including TOFMS sampling of vapors produced by both pulsed and continuous laser heating of materials.
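
    Transient recorders capture each complete TOF spectrum; a signal averager then sums repetitive transients so that random noise falls roughly as 1/√N. A toy sketch of that averaging step; the synthetic single-peak spectrum and noise level are invented:

```python
import math
import random

def average_transients(records):
    """Point-by-point mean of repetitive transient records."""
    n = len(records)
    return [sum(col) / n for col in zip(*records)]

# Synthetic "spectrum": one Gaussian mass peak on 100 flight-time channels.
true_peak = [math.exp(-((i - 50) ** 2) / 20.0) for i in range(100)]
rng = random.Random(7)

def noisy_record():
    return [s + rng.gauss(0.0, 0.3) for s in true_peak]

single = noisy_record()
avg = average_transients([noisy_record() for _ in range(100)])

def rms_error(trace):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(trace, true_peak)) / len(trace))

print(rms_error(single), rms_error(avg))  # averaging 100 shots cuts noise roughly 10-fold
```
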

  14. GUIDELINES FOR PARTICULATE SAMPLING IN GASEOUS EFFLUENTS FROM INDUSTRIAL PROCESSES

    EPA Science Inventory

    The report lists and briefly describes many instruments and techniques used to measure the concentration or size distribution of particles suspended in process streams. Standard (well established) methods are described, as well as some experimental methods and prototype instrumen...

  15. Impact of low intensity summer rainfall on E. coli-discharge event dynamics with reference to sample acquisition and storage.

    PubMed

    Oliver, David M; Porter, Kenneth D H; Heathwaite, A Louise; Zhang, Ting; Quilliam, Richard S

    2015-07-01

    Understanding the role of different rainfall scenarios on faecal indicator organism (FIO) dynamics under variable field conditions is important to strengthen the evidence base on which regulators and land managers can base informed decisions regarding diffuse microbial pollution risks. We sought to investigate the impact of low intensity summer rainfall on Escherichia coli-discharge (Q) patterns at the headwater catchment scale in order to provide new empirical data on FIO concentrations observed during baseflow conditions. In addition, we evaluated the potential impact of using automatic samplers to collect and store freshwater samples for subsequent microbial analysis during summer storm sampling campaigns. The temporal variation of E. coli concentrations with Q was captured during six events throughout a relatively dry summer in central Scotland. The relationship between E. coli concentration and Q was complex with no discernible patterns of cell emergence with Q that were repeated across all events. On several occasions, an order of magnitude increase in E. coli concentrations occurred even with slight increases in Q, but responses were not consistent and highlighted the challenges of attempting to characterise temporal responses of E. coli concentrations relative to Q during low intensity rainfall. Cross-comparison of E. coli concentrations determined in water samples using simultaneous manual grab and automated sample collection was undertaken with no difference in concentrations observed between methods. However, the duration of sample storage within the autosampler unit was found to be more problematic in terms of impacting on the representativeness of microbial water quality, with unrefrigerated autosamplers exhibiting significantly different concentrations of E. coli relative to initial samples after 12-h storage. The findings from this study provide important empirical contributions to the growing evidence base in the field of catchment microbial

  16. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. Here, the statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  17. The Role of Unconscious Information Processing in the Acquisition and Learning of Instructional Messages

    ERIC Educational Resources Information Center

    Kuldas, Seffetullah; Bakar, Zainudin Abu; Ismail, Hairul Nizam

    2012-01-01

    This review investigates how the unconscious information processing can create satisfactory learning outcomes, and can be used to ameliorate the challenges of teaching students to regulate their learning processes. The search for the ideal model of human information processing as regards achievement of teaching and learning objectives is a…

  18. Communication Barriers in Quality Process: Sakarya University Sample

    ERIC Educational Resources Information Center

    Yalcin, Mehmet Ali

    2012-01-01

    Communication has an important role in life, and especially in education. Nowadays, many people use technology for communication. When technology is used in education and other activities, there may be some communication barriers. Quality processes also have an important role in higher education institutions. If a higher education…

  19. Acquisition process of typing skill using hierarchical materials in the Japanese language.

    PubMed

    Ashitaka, Yuki; Shimada, Hiroyuki

    2014-08-01

    In the present study, using a new keyboard layout with only eight keys, we conducted typing training for unskilled typists. In this task, Japanese college students received training in typing words consisting of a pair of hiragana characters with four keystrokes, using the alphabetic input method, while keeping the association between the keys and typists' finger movements; the task was constructed so that chunking was readily available. We manipulated the association between the hiragana characters and alphabet letters (hierarchical materials: overlapped and nonoverlapped mappings). Our alphabet letter materials corresponded to the regular order within each hiragana word (within the four letters, the first and third referred to consonants, and the second and fourth referred to vowels). Only the interkeystroke intervals involved in the initiation of typing vowel letters showed an overlapping effect, which revealed that the effect was markedly large only during the early period of skill development (the effect for the overlapped mapping being larger than that for the nonoverlapped mapping), but that it had diminished by the time of late training. Conversely, the response time and the third interkeystroke interval, which are both involved in the latency of typing a consonant letter, did not reveal an overlapped effect, suggesting that chunking might be useful with hiragana characters rather than hiragana words. These results are discussed in terms of the fan effect and skill acquisition. Furthermore, we discuss whether there is a need for further research on unskilled and skilled Japanese typists. PMID:24874261

  20. Lexical processing and organization in bilingual first language acquisition: Guiding future research.

    PubMed

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-06-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between 2 languages in early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. (PsycINFO Database Record PMID:26866430

  1. Lexical Processing and Organization in Bilingual First Language Acquisition: Guiding Future Research

    PubMed Central

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-01-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between two languages in the early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. PMID:26866430

  2. Optimization of the development process for air sampling filter standards

    NASA Astrophysics Data System (ADS)

    Mena, RaJah Marie

    Air monitoring is an important analysis technique in health physics. However, creating standards which can be used to calibrate detectors used in the analysis of the filters deployed for air monitoring can be challenging. The activity of a standard should be well understood; this includes understanding how the location within the filter affects the final surface emission rate. The purpose of this research is to determine the parameters which most affect uncertainty in an air filter standard and to optimize these parameters such that calibrations made with them most accurately reflect the true activity contained inside. A deposition pattern was chosen from the literature to provide the best approximation of uniform deposition of material across the filter. Sample sets were created varying the type of radionuclide, the amount of activity (high activity at 6.4–306 Bq/filter and low activity at 0.05–6.2 Bq/filter), and the filter type. For samples analyzed for gamma or beta contaminants, the standards created with this procedure were deemed sufficient. Additional work is needed to reduce errors to ensure this is a viable procedure, especially for alpha contaminants.

  3. Determination of pesticides and their metabolites in processed cereal samples.

    PubMed

    González-Curbelo, M Á; Hernández-Borges, J; Borges-Miquel, T M; Rodríguez-Delgado, M Á

    2012-01-01

    Fifteen pesticides including some of their metabolites (disulfoton sulfoxide, ethoprophos, cadusafos, dimethoate, terbufos, disulfoton, chlorpyrifos-methyl, malaoxon, fenitrothion, pirimiphos-methyl, malathion, chlorpyrifos, terbufos sulfone, disulfoton sulfone and fensulfothion) were analysed in milled toasted wheat and maize as well as in wheat flour and baby cereals. The QuEChERS (quick, easy, cheap, effective, rugged and safe) methodology was used, and its dispersive solid-phase extraction procedure was optimised by means of an experimental design with the aim of reducing the amount of co-extracted lipids and obtaining a clean extract. Gas chromatography and nitrogen-phosphorus detection were used as the separation and detection techniques, respectively. The method was validated in terms of selectivity, recoveries, calibration, precision and accuracy as well as matrix effects. Limits of detection were between 0.07 and 34.8 µg kg(-1), with recoveries in the range of 71-110% (relative standard deviations were below 9%). A total of 40 samples of different origin were analysed. Residues of pirimiphos-methyl were found in six of the samples at concentrations in the range 0.08-0.47 mg kg(-1), which were below the MRLs established for this pesticide in cereal grains. Tandem mass spectrometry confirmation was also carried out in order to identify unequivocally the presence of this pesticide. PMID:22043870
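
    Validation figures like the recoveries and relative standard deviations reported above are simple ratios over replicate spiked samples. An illustrative sketch; the spike level and measured values are invented, not taken from the study:

```python
def recovery_pct(measured, spiked):
    """Analyte recovery as a percentage of the spiked concentration."""
    return 100.0 * measured / spiked

def rsd_pct(values):
    """Relative standard deviation: sample SD / mean * 100."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100.0 * var ** 0.5 / mean

# Hypothetical replicate results for a 0.10 mg/kg pirimiphos-methyl spike.
measured = [0.086, 0.091, 0.089, 0.095, 0.088]
recoveries = [recovery_pct(m, 0.10) for m in measured]
print(round(sum(recoveries) / len(recoveries), 1), round(rsd_pct(measured), 1))  # → 89.8 3.8
```
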

  4. Robotic Arm Manipulator Using Active Control for Sample Acquisition and Transfer, and Passive Mode for Surface Compliance

    NASA Technical Reports Server (NTRS)

    Liu, Jun; Underhill, Michael L.; Trease, Brian P.; Lindemann, Randel A.

    2010-01-01

    A robotic arm that consists of three joints with four degrees of freedom (DOF) has been developed. It can carry an end-effector to acquire and transfer samples using active control, and can comply with surface topology in a passive mode during brief surface contact. The three joints are arranged such that a two-DOF joint is located at the shoulder, a one-DOF joint at the elbow, and a one-DOF joint at the wrist. Operationally, three DOFs move in the same plane, while the remaining shoulder DOF moves perpendicular to the other three for better compliance with the ground surface and more flexibility in sample handling. Three of the four joints are backdrivable, making the mechanism less complex and more cost-effective.

  5. The approach to sample acquisition and its impact on the derived human fecal microbiome and VOC metabolome.

    PubMed

    Couch, Robin D; Navarro, Karl; Sikaroodi, Masoumeh; Gillevet, Pat; Forsyth, Christopher B; Mutlu, Ece; Engen, Phillip A; Keshavarzian, Ali

    2013-01-01

    Recent studies have illustrated the importance of the microbiota in maintaining a healthy state, as well as promoting disease states. The intestinal microbiota exerts its effects primarily through its metabolites, and metabolomics investigations have begun to evaluate the diagnostic and health implications of volatile organic compounds (VOCs) isolated from human feces, enabled by specialized sampling methods such as headspace solid-phase microextraction (hSPME). The approach to stool sample collection is an important consideration that could potentially introduce bias and affect the outcome of a fecal metagenomic and metabolomic investigation. To address this concern, a comparison of endoscopically collected (in vivo) and home collected (ex vivo) fecal samples was performed, revealing slight variability in the derived microbiomes. In contrast, the VOC metabolomes differ widely between the home collected and endoscopy collected samples. Additionally, as the VOC extraction profile is hyperbolic, with short extraction durations more vulnerable to variation than extractions continued to equilibrium, a second goal of our investigation was to ascertain if hSPME-based fecal metabolomics studies might be biased by the extraction duration employed. As anticipated, prolonged extraction (18 hours) results in the identification of considerably more metabolites than short (20 minute) extractions. A comparison of the metabolomes reveals several analytes deemed unique to a cohort with the 20 minute extraction, but found common to both cohorts when the VOC extraction was performed for 18 hours. Moreover, numerous analytes perceived to have significant fold change with a 20 minute extraction were found insignificant in fold change with the prolonged extraction, underscoring the potential for bias associated with a 20 minute hSPME. PMID:24260553
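
    The "hyperbolic" extraction profile described above can be illustrated with a one-site uptake model M(t) = M_eq·t/(K + t). Both the model form as applied here and the constants are illustrative assumptions, not fitted hSPME data:

```python
def extracted_mass(t_min, m_eq=100.0, k_min=30.0):
    """Hyperbolic (one-site) uptake: mass extracted after t_min minutes."""
    return m_eq * t_min / (k_min + t_min)

short = extracted_mass(20)        # 20-minute extraction, steep part of the curve
long_ = extracted_mass(18 * 60)   # 18-hour extraction, near equilibrium
print(round(short, 1), round(long_, 1))  # → 40.0 97.3
```

    Near t = 20 min the curve is steep, so small variations in timing or conditions shift the recovered mass appreciably; by 18 h the curve has flattened near equilibrium, which is consistent with the reported robustness of prolonged extractions.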

  6. The Approach to Sample Acquisition and Its Impact on the Derived Human Fecal Microbiome and VOC Metabolome

    PubMed Central

    Couch, Robin D.; Navarro, Karl; Sikaroodi, Masoumeh; Gillevet, Pat; Forsyth, Christopher B.; Mutlu, Ece; Engen, Phillip A.; Keshavarzian, Ali

    2013-01-01

    Recent studies have illustrated the importance of the microbiota in maintaining a healthy state, as well as promoting disease states. The intestinal microbiota exerts its effects primarily through its metabolites, and metabolomics investigations have begun to evaluate the diagnostic and health implications of volatile organic compounds (VOCs) isolated from human feces, enabled by specialized sampling methods such as headspace solid-phase microextraction (hSPME). The approach to stool sample collection is an important consideration that could potentially introduce bias and affect the outcome of a fecal metagenomic and metabolomic investigation. To address this concern, a comparison of endoscopically collected (in vivo) and home collected (ex vivo) fecal samples was performed, revealing slight variability in the derived microbiomes. In contrast, the VOC metabolomes differ widely between the home collected and endoscopy collected samples. Additionally, as the VOC extraction profile is hyperbolic, with short extraction durations more vulnerable to variation than extractions continued to equilibrium, a second goal of our investigation was to ascertain if hSPME-based fecal metabolomics studies might be biased by the extraction duration employed. As anticipated, prolonged extraction (18 hours) results in the identification of considerably more metabolites than short (20 minute) extractions. A comparison of the metabolomes reveals several analytes deemed unique to a cohort with the 20 minute extraction, but found common to both cohorts when the VOC extraction was performed for 18 hours. Moreover, numerous analytes perceived to have significant fold change with a 20 minute extraction were found insignificant in fold change with the prolonged extraction, underscoring the potential for bias associated with a 20 minute hSPME. PMID:24260553

  7. The World of Hidden Biases: From Collection to Sample Processing

    NASA Astrophysics Data System (ADS)

    Maurette, Michel

    Any study of micrometeorites involves a variety of biases, which start right away during their collection, and which have not been sufficiently publicized. This section deals with the astonishing folklore of these biases. We shall question whether major differences observed between Antarctic micrometeorites and stratospheric micrometeorites could reflect complementary biases between the two collections of micrometeorites. Astonishingly, some of these biases would converge to enrich the SMM collection in the most fine-grained fluffy dust particles accreted by the Earth. These may be the most primitive material accreted by the Earth, but they would not give a representative sampling of the bulk micrometeorite flux, which is best obtained with the new Concordia micrometeorites collected in central Antarctica. For a change, biases developing around a small metallic plate flying at ~200 m/s in the stratosphere turned out to be quite helpful!

  8. The Relationship between Previous Training in Computer Science and the Acquisition of Word Processing Skills.

    ERIC Educational Resources Information Center

    Monahan, Brian D.

    1986-01-01

    This study investigated whether computer science educational background makes secondary students more adept at using word processing capabilities, and compared computer science and non-computer science students' writing improvement with word processing use. Computer science students used more sophisticated program features but student writing did…

  9. Developmental Trends in Auditory Processing Can Provide Early Predictions of Language Acquisition in Young Infants

    ERIC Educational Resources Information Center

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R.; Shao, Jie; Lozoff, Betsy

    2013-01-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with…

  10. Analysis of protein biomarkers in human clinical tumor samples: critical aspects to success from tissue acquisition to analysis.

    PubMed

    Warren, Madhuri V; Chan, W Y Iris; Ridley, John M

    2011-04-01

    There has been increased interest in the analysis of protein biomarkers in clinical tumor tissues in recent years. Tissue-based biomarker assays can add value and aid decision-making at all stages of drug development, as well as being developed for use as predictive biomarkers and for patient stratification and prognostication in the clinic. However, there must be an awareness of the legal and ethical issues related to the sourcing of human tissue samples. This article also discusses the scope, limitations, and critical success factors of the following tissue-based methods: immunohistochemistry, tissue microarrays and automated image analysis. Future advances in standardization of tissue biobanking methods, immunohistochemistry and quantitative image analysis techniques are also discussed. PMID:21473728

  11. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  12. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence-inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  13. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence-inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  14. A Multi-Threshold Sampling Method for TOF PET Signal Processing.

    PubMed

    Kim, H; Kao, C M; Xie, Q; Chen, C T; Zhou, L; Tang, F; Frisch, H; Moses, W W; Choong, W S

    2009-04-21

    As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multi-threshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to 8 threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25×6.25×25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and an ∼18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with an ∼9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules we obtain ∼300 ps coincidence timing resolution, ∼14% energy resolution at 511 keV, and ∼5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions by using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition. PMID:19690623
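    The multi-threshold idea in the abstract above can be illustrated with a toy simulation: a hypothetical scintillator pulse is crossed by four fixed thresholds, and the interpolated crossing times on the leading and falling edges yield both an arrival-time estimate and a crude energy proxy. This is an illustrative sketch only — the pulse model, threshold values, and all names are invented here, not the authors' electronics.

    ```python
    import numpy as np

    def pulse(t, amp=1.0, tau_r=0.5, tau_d=44.0):
        """Idealized scintillation pulse: fast rise, slow (~44 ns) decay."""
        return np.where(t >= 0, amp * (np.exp(-t / tau_d) - np.exp(-t / tau_r)), 0.0)

    def threshold_crossings(t, v, thresholds):
        """Linearly interpolated times at which v crosses each threshold,
        on both the leading (rising) and falling edges."""
        samples = []
        for th in thresholds:
            above = (v >= th).astype(int)
            for i in np.flatnonzero(np.diff(above)):
                frac = (th - v[i]) / (v[i + 1] - v[i])   # linear interpolation
                samples.append((t[i] + frac * (t[i + 1] - t[i]), th))
        return sorted(samples)

    t = np.arange(0.0, 200.0, 0.01)          # ns
    v = pulse(t)
    thresholds = [0.1, 0.3, 0.5, 0.7]        # four user-defined levels
    pts = threshold_crossings(t, v, thresholds)

    arrival = pts[0][0]                      # earliest crossing ~ event timing
    ts, vs = zip(*pts)                       # (time, amplitude) samples
    # crude energy proxy: trapezoidal area under the sampled waveform
    energy = sum(0.5 * (vs[i] + vs[i + 1]) * (ts[i + 1] - ts[i])
                 for i in range(len(ts) - 1))
    ```

    With four thresholds each crossed once on the rise and once on the fall, eight (time, amplitude) samples describe the whole event — the compression that makes all-digital acquisition attractive.
    
    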

  15. A multi-threshold sampling method for TOF PET signal processing

    SciTech Connect

    Kim, Heejong; Kao, Chien-Min; Xie, Q.; Chen, Chin-Tu; Zhou, L.; Tang, F.; Frisch, Henry; Moses, William W.; Choong, Woon-Seng

    2009-02-02

    As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multithreshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to 8 threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25 x 6.25 x 25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and an ~18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with an ~9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules we obtain ~300 ps coincidence timing resolution, ~14% energy resolution at 511 keV, and ~5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions by using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition.

  16. Hardware acceleration of lucky-region fusion (LRF) algorithm for image acquisition and processing

    NASA Astrophysics Data System (ADS)

    Maignan, William; Koeplinger, David; Carhart, Gary W.; Aubailly, Mathieu; Kiamilev, Fouad; Liu, J. Jiang

    2013-05-01

    "Lucky-region fusion" (LRF) is an image processing technique that has proven successful in enhancing the quality of images distorted by atmospheric turbulence. The LRF algorithm extracts sharp regions of an image obtained from a series of short exposure frames, and "fuses" them into a final image with improved quality. In previous research, the LRF algorithm had been implemented on a PC using a compiled programming language. However, the PC usually does not have sufficient processing power to handle the real-time extraction, processing, and reduction required when the LRF algorithm is applied not to single still images but to real-time video from fast, high-resolution image sensors. This paper describes a hardware implementation of the LRF algorithm on a Virtex 6 field programmable gate array (FPGA) to achieve real-time video processing. The novelty in our approach is the creation of a "black box" LRF video processing system with a standard camera link input, a user control interface, and a standard camera link output.
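    The fusion step described above can be sketched in software. The following is a simplified hard-selection variant (published LRF implementations typically fuse with smooth kernel weights rather than a per-pixel winner): each frame gets a local sharpness map, a box-filtered gradient magnitude, and the fused output keeps, per pixel, the value from the sharpest frame. All function names, parameters, and the synthetic test frames are invented for illustration.

    ```python
    import numpy as np

    def sharpness_map(img, win=9):
        """Local sharpness: squared gradient magnitude, box-filtered over a
        win x win window using an integral image."""
        gy, gx = np.gradient(img.astype(float))
        g = np.pad(gx * gx + gy * gy, win // 2, mode="edge")
        c = np.pad(g.cumsum(0).cumsum(1), ((1, 0), (1, 0)))  # integral image
        h, w = img.shape
        return (c[win:win + h, win:win + w] - c[:h, win:win + w]
                - c[win:win + h, :w] + c[:h, :w])

    def lrf_fuse(frames, win=9):
        """Per pixel, keep the value from the frame with the highest local
        sharpness (hard selection)."""
        stack = np.stack([f.astype(float) for f in frames])
        sharp = np.stack([sharpness_map(f, win) for f in frames])
        best = sharp.argmax(axis=0)
        return np.take_along_axis(stack, best[None], axis=0)[0]

    # Demo: each synthetic frame is sharp (random texture) in one half and
    # flat in the other; fusion should keep the sharp half of each frame.
    noise = np.random.default_rng(0).random((32, 32))
    left = np.full((32, 32), 0.5)
    left[:, :16] = noise[:, :16]
    right = np.full((32, 32), 0.5)
    right[:, 16:] = noise[:, 16:]
    fused = lrf_fuse([left, right])
    ```

    The integral-image box filter is the kind of streaming-friendly operation that maps well to the FPGA pipeline the paper targets.
    
    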

  17. Second-first language acquisition: analysis of expressive language skills in a sample of girls adopted from China.

    PubMed

    Tan, Tony Xing; Loker, Troy; Dedrick, Robert F; Marfo, Kofi

    2012-03-01

    In this study we investigated adopted Chinese girls' expressive English language outcomes in relation to their age at adoption, chronological age, length of exposure to English and developmental risk status at the time of adoption. Vocabulary and phrase utterance data on 318 girls were collected from the adoptive mothers using the Language Development Survey (LDS) (Achenbach & Rescorla, 2000). The girls, aged 18-35 months (M=26.2 months, SD=4.9 months), were adopted at ages ranging from 6.8 to 24 months (M=12.6 months, SD=3.1 months), and had been exposed to English for periods ranging from 1.6 to 27.6 months (M=13.7, SD=5.7). Findings suggest that vocabulary and mean length of phrase scores were negatively correlated with age at adoption but positively correlated with chronological age and length of exposure to English. Developmental risk status at the time of adoption was not correlated with language outcomes. The gap between their expressive language and that of same-age girls from the US normative sample was wider for children aged 18-23 months but was closed for children aged 30-35 months. About 16% of the children met the LDS criteria for delays in vocabulary and 17% met the LDS criteria for delays in mean length of phrase. Speech/language interventions were received by 33.3% of the children with delays in vocabulary and 25% with delays in phrase. PMID:21781372

  18. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  19. Fast multi-dimensional NMR acquisition and processing using the sparse FFT.

    PubMed

    Hassanieh, Haitham; Mayzel, Maxim; Shi, Lixin; Katabi, Dina; Orekhov, Vladislav Yu

    2015-09-01

    Increasing the dimensionality of NMR experiments strongly enhances the spectral resolution and provides invaluable direct information about atomic interactions. However, the price tag is high: long measurement times and heavy requirements on the computation power and data storage. We introduce sparse fast Fourier transform as a new method of NMR signal collection and processing, which is capable of reconstructing high quality spectra of large size and dimensionality with short measurement times, faster computations than the fast Fourier transform, and minimal storage for processing and handling of sparse spectra. The new algorithm is described and demonstrated for a 4D BEST-HNCOCA spectrum. PMID:26123316
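    The bucketization idea underlying the sparse FFT can be demonstrated in a few lines: subsampling a signal aliases its N-point spectrum into N/B buckets, and for a sparse spectrum a one-sample shift disambiguates which of the B aliased candidates a peak came from. This toy demo shows only that core aliasing trick with a single tone, not the full algorithm of the paper; all variable names are invented.

    ```python
    import numpy as np

    N, B = 1024, 4                  # full length, subsampling factor
    k0 = 317                        # the single (sparse) spectral peak
    n = np.arange(N)
    x = np.exp(2j * np.pi * k0 * n / N)

    # Aliasing-based bucketization: the FFT of a B-fold subsampled signal
    # folds the N-point spectrum into N/B buckets. A sparse spectrum rarely
    # collides, so each peak still lands in its own bucket.
    Y0 = np.fft.fft(x[0::B])
    Y1 = np.fft.fft(x[1::B])        # same subsampling, shifted by one sample

    bucket = int(np.argmax(np.abs(Y0)))        # equals k0 mod (N/B)
    # The one-sample shift multiplies the peak by exp(2*pi*1j*k0/N); its
    # phase identifies the true bin among the B aliased candidates.
    phase = np.angle(Y1[bucket] / Y0[bucket])
    k_est = round(phase / (2 * np.pi) * N) % N
    ```

    The payoff is the one claimed in the abstract: two length-N/B FFTs locate the peak instead of one length-N FFT, and the saving grows with dimensionality.
    
    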

  20. How human resource organization can enhance space information acquisition and processing: the experience of the VENESAT-1 ground segment

    NASA Astrophysics Data System (ADS)

    Acevedo, Romina; Orihuela, Nuris; Blanco, Rafael; Varela, Francisco; Camacho, Enrique; Urbina, Marianela; Aponte, Luis Gabriel; Vallenilla, Leopoldo; Acuña, Liana; Becerra, Roberto; Tabare, Terepaima; Recaredo, Erica

    2009-12-01

    Built in cooperation with the P.R. of China and launched on October 29th, 2008, VENESAT-1 (Simón Bolívar Satellite) is the first telecommunication satellite of the Bolivarian Republic of Venezuela. It operates in C band (covering Central America, the Caribbean region and most of South America), Ku band (Bolivia, Cuba, Dominican Republic, Haiti, Paraguay, Uruguay, Venezuela) and Ka band (Venezuela). The launch of VENESAT-1 represents the starting point for Venezuela as an active player in the field of space science and technology. In order to fulfill mission requirements and to guarantee the satellite's health, local professionals must provide continuous monitoring, orbit calculation, maneuver preparation and execution, data preparation and processing, as well as database management at the VENESAT-1 Ground Segment, which includes both a primary and a backup site. In summary, data processing and real-time data management are part of the daily activities performed by the personnel at the ground segment. Using published and unpublished information, this paper presents how human resource organization can enhance space information acquisition and processing, by analyzing the proposed organizational structure for the VENESAT-1 Ground Segment. We have found that the proposed units within the organizational structure reflect 3 key issues for mission management: satellite operations, ground operations, and site maintenance. The proposed organization is simple (3 hierarchical levels and 7 units), and communication channels seem efficient in terms of facilitating information acquisition, processing, storage, flow and exchange. Furthermore, the proposal includes a manual containing the full description of personnel responsibilities and profiles, which efficiently allocates the management and operation of key software for satellite operation such as the Real-time Data Transaction Software (RDTS), Data Management Software (DMS), and Carrier Spectrum Monitoring Software (CSM

  1. The effect of age of acquisition, socioeducational status, and proficiency on the neural processing of second language speech sounds.

    PubMed

    Archila-Suerte, Pilar; Zevin, Jason; Hernandez, Arturo E

    2015-02-01

    This study investigates the role of age of acquisition (AoA), socioeducational status (SES), and second language (L2) proficiency on the neural processing of L2 speech sounds. In a task of pre-attentive listening and passive viewing, Spanish-English bilinguals and a control group of English monolinguals listened to English syllables while watching a film of natural scenery. Eight regions of interest were selected from brain areas involved in speech perception and executive processes. The regions of interest were examined in 2 separate two-way ANOVAs (AoA×SES; AoA×L2 proficiency). The results showed that AoA was the main variable affecting the neural response in L2 speech processing. Direct comparisons between AoA groups of equivalent SES and proficiency level enhanced the intensity and magnitude of the results. These results suggest that AoA, more than SES and proficiency level, determines which brain regions are recruited for the processing of second language speech sounds. PMID:25528287

  2. The Effectiveness of Processing Instruction in L2 Grammar Acquisition: A Narrative Review

    ERIC Educational Resources Information Center

    DeKeyser, Robert; Botana, Goretti Prieto

    2015-01-01

    The past two decades have seen ample debate about processing instruction (PI) and its various components. In this article, we first describe what PI consists of and then address three questions: about the role of explicit information (EI) in PI, the difference between PI and teaching that incorporates production-based (PB) practice, and various…

  3. Analyzing Preschoolers' Overgeneralizations of Object Labeling in the Process of Mother-Tongue Acquisition in Turkey

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2006-01-01

    Language, as is known, is acquired under certain conditions: rapid and sequential brain maturation and cognitive development, the need to exchange information and to control others' actions, and an exposure to appropriate speech input. This research aims at analyzing preschoolers' overgeneralizations of the object labeling process in different…

  4. Using Processing Instruction for the Acquisition of English Present Perfect of Filipinos

    ERIC Educational Resources Information Center

    Erfe, Jonathan P.; Lintao, Rachelle B.

    2012-01-01

    This is an experimental study on the relative effects of Van Patten's Processing Instruction (PI) (1996, 2002), a "psycholinguistically-motivated" intervention in teaching second-language (L2) grammar, on young-adult Filipino learners of English. A growing body of research on this methodological alternative, which establishes…

  5. How Explicit Knowledge Affects Online L2 Processing: Evidence from Differential Object Marking Acquisition

    ERIC Educational Resources Information Center

    Andringa, Sible; Curcic, Maja

    2015-01-01

    Form-focused instruction studies generally report larger gains for explicit types of instruction over implicit types on measures of controlled production. Studies that used online processing measures--which do not readily allow for the application of explicit knowledge--however, suggest that this advantage occurs primarily when the target…

  6. The RFP Process: Effective Management of the Acquisition of Library Materials.

    ERIC Educational Resources Information Center

    Wilkinson, Frances C.; Thorson, Connie Capers

    Many librarians view procurement, with its myriad forms, procedures, and other organizational requirements, as a tedious or daunting challenge. This book simplifies the process, showing librarians how to successfully prepare a Request for Proposal (RFP) and make informed decisions when determining which vendors to use for purchasing library…

  7. Production and Processing Asymmetries in the Acquisition of Tense Morphology by Sequential Bilingual Children

    ERIC Educational Resources Information Center

    Chondrogianni, Vasiliki; Marinis, Theodoros

    2012-01-01

    This study investigates the production and online processing of English tense morphemes by sequential bilingual (L2) Turkish-speaking children with more than three years of exposure to English. Thirty-nine six- to nine-year-old L2 children and twenty-eight typically developing age-matched monolingual (L1) children were administered the production…

  8. Using Eye-Tracking to Investigate Topics in L2 Acquisition and L2 Processing

    ERIC Educational Resources Information Center

    Roberts, Leah; Siyanova-Chanturia, Anna

    2013-01-01

    Second language (L2) researchers are becoming more interested in both L2 learners' knowledge of the target language and how that knowledge is put to use during real-time language processing. Researchers are therefore beginning to see the importance of combining traditional L2 research methods with those that capture the moment-by-moment…

  9. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... homogeneity; (4) Dissolution time and rate; (5) Clarity, completeness, or pH of solutions. (6)...

  10. An architecture for real time data acquisition and online signal processing for high throughput tandem mass spectrometry

    SciTech Connect

    Shah, Anuj R.; Jaitly, Navdeep; Zuljevic, Nino; Monroe, Matthew E.; Liyu, Andrei V.; Polpitiya, Ashoka D.; Adkins, Joshua N.; Belov, Mikhail E.; Anderson, Gordon A.; Smith, Richard D.; Gorton, Ian

    2010-12-09

    Independent, greedy collection of data events using simple heuristics results in massive over-sampling of the prominent data features in large-scale studies, relative to what should be achievable through “intelligent,” online acquisition of such data. As a result, data generated are more aptly described as a collection of a large number of small experiments rather than a true large-scale experiment. Nevertheless, achieving “intelligent,” online control requires tight interplay between state-of-the-art, data-intensive computing infrastructure developments and analytical algorithms. In this paper, we propose a Software Architecture for Mass spectrometry-based Proteomics coupled with Liquid chromatography Experiments (SAMPLE) to develop an “intelligent” online control and analysis system to significantly enhance the information content from each sensor (in this case, a mass spectrometer). Using online analysis of data events as they are collected and decision theory to optimize the collection of events during an experiment, we aim to maximize the information content generated during an experiment by the use of pre-existing knowledge to optimize the dynamic collection of events.
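    One simple, widely used instance of steering acquisition away from already-sampled features is dynamic exclusion: once a precursor has been selected, suppress its m/z for a while so later scans pick new features. The sketch below is a generic, hypothetical heuristic for illustration — it is not the decision-theoretic controller proposed in the SAMPLE architecture, and all names and numbers are invented.

    ```python
    import heapq

    def select_precursors(peaks, exclusion, now, top_n=3, exclude_for=30.0,
                          tol=0.01):
        """Pick the top-N most intense peaks whose m/z is not currently on
        the exclusion list, then exclude them so later scans sample new
        features instead of re-measuring the same prominent ones."""
        for mz in [m for m, until in exclusion.items() if until <= now]:
            del exclusion[mz]                      # drop expired entries
        chosen = []
        for inten, mz in heapq.nlargest(len(peaks), ((i, m) for m, i in peaks)):
            if any(abs(mz - ex) < tol for ex in exclusion):
                continue                           # recently sampled; skip
            chosen.append(mz)
            exclusion[mz] = now + exclude_for      # suppress for a while
            if len(chosen) == top_n:
                break
        return chosen

    exclusion = {}
    scan = [(500.3, 1e6), (622.1, 8e5), (701.9, 5e5), (433.2, 1e5)]
    first = select_precursors(scan, exclusion, now=0.0)    # three biggest
    second = select_precursors(scan, exclusion, now=5.0)   # only the new one
    third = select_precursors(scan, exclusion, now=40.0)   # exclusions expired
    ```

    Even this crude rule spreads sampling across more features per unit time; the paper's contribution is replacing such fixed heuristics with online, decision-theoretic control.
    
    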

  11. Human resource processes and the role of the human resources function during mergers and acquisitions in the electricity industry

    NASA Astrophysics Data System (ADS)

    Dass, Ted K.

    Mergers and acquisitions (M&A) have been a popular strategy for organizations to consolidate and grow for more than a century. However, research in this field indicates that M&A are more likely to fail than succeed, with failure rates estimated to be as high as 75%. People-related issues have been identified as important causes for the high failure rate, but these issues are largely neglected until after the deal is closed. One explanation for this neglect is the low involvement of human resource (HR) professionals and the HR function during the M&A process. The strategic HR management literature suggests that a larger role for HR professionals in the M&A process would enable organizations to identify potential problems early and devise appropriate solutions. However, empirical research from an HR perspective has been scarce in this area. This dissertation examines the role of the HR function and the HR processes followed in organizations during M&A. Employing a case-study research design, this study examines M&A undertaken by two large organizations in the electricity industry through the lens of a "process" perspective. Based on converging evidence, the case studies address three sets of related issues: (1) how do organizations undertake and manage M&A; (2) what is the extent of HR involvement in M&A and what role does it play in the M&A process; and (3) what factors explain HR involvement in the M&A process and, more generally, in the formulation of corporate goals and strategies. Results reveal the complexity of issues faced by organizations in undertaking M&A, the variety of roles played by HR professionals, and the importance of several key contextual factors---internal and external to the organization---that influence HR involvement in the M&A process. Further, several implications for practice and future research are explored.

  12. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

    A real time collection system was developed for the Urbana coherent scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented with a minicomputer. The software used to collect the data is described as well as the processing software used to analyze the data. In addition a magnetic tape format for coherent scatter data exchange is given.

  13. Apollo experience report: Processing of lunar samples in a sterile nitrogen atmosphere

    NASA Technical Reports Server (NTRS)

    Mcpherson, T. M.

    1972-01-01

    A sterile nitrogen atmosphere processing cabinet line was installed in the Lunar Receiving Laboratory to process returned lunar samples with minimum organic contamination. Design and operation of the cabinet line were complicated by the requirement for biological sterilization and isolation, which necessitated extensive filtration, leak-checking, and system sterilization before use. Industrial techniques were applied to lunar sample processing to meet requirements for time-critical experiments while handling a large flow of samples.

  14. Streamlined acquisition handbook

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA has always placed great emphasis on the acquisition process, recognizing it as among its most important activities. This handbook is intended to facilitate the application of streamlined acquisition procedures. The development of these procedures reflects the efforts of an action group composed of NASA Headquarters and center acquisition professionals. The intent is to accomplish real change in the acquisition process as a result of this effort. An important part of streamlining the acquisition process is a commitment by the people involved in the process to accomplishing acquisition activities quickly and with high quality. Too often we continue to accomplish work in 'the same old way' without considering available alternatives that would require no changes to regulations, approvals from Headquarters, or waivers of required practice. Similarly, we must be sensitive to schedule opportunities throughout the acquisition cycle, not just once the purchase request arrives at the procurement office. Techniques that have been identified as ways of reducing acquisition lead time while maintaining high quality in our acquisition process are presented.

  15. Processing Temporal Constraints and Some Implications for the Investigation of Second Language Sentence Processing and Acquisition. Commentary on Baggio

    ERIC Educational Resources Information Center

    Roberts, Leah

    2008-01-01

    Baggio presents the results of an event-related potential (ERP) study in which he examines the processing consequences of reading tense violations such as *"Afgelopen zondag lakt Vincent de kozijnen van zijn landhuis" (*"Last Sunday Vincent paints the window-frames of his country house"). The violation is arguably caused by a mismatch between the…

  16. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    NASA Technical Reports Server (NTRS)

    Li, Y. T.; Wittenberg, L. J.

    1992-01-01

    In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the utilization of multiple miners is recommended rather than increasing their size. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  17. Age of second language acquisition affects nonverbal conflict processing in children: an fMRI study

    PubMed Central

    Mohades, Seyede Ghazal; Struys, Esli; Van Schuerbeek, Peter; Baeken, Chris; Van De Craen, Piet; Luypaert, Robert

    2014-01-01

    Background In their daily communication, bilinguals switch between two languages, a process that involves the selection of a target language and minimization of interference from a nontarget language. Previous studies have uncovered the neural structure in bilinguals and the activation patterns associated with performing verbal conflict tasks. One question that remains, however, is whether this extra verbal switching affects brain function during nonverbal conflict tasks. Methods In this study, we have used fMRI to investigate the impact of bilingualism in children performing two nonverbal tasks involving stimulus–stimulus and stimulus–response conflicts. Three groups of 8–11-year-old children – bilinguals from birth (2L1), second language learners (L2L), and a control group of monolinguals (1L1) – were scanned while performing a color Simon and a numerical Stroop task. Reaction times and accuracy were logged. Results Compared to monolingual controls, bilingual children showed a higher behavioral congruency effect in these tasks, matched by the recruitment of brain regions generally used in cognitive control, language processing, or the resolution of language conflict in bilinguals (caudate nucleus, posterior cingulate gyrus, STG, precuneus). Further, the activation of these areas was found to be higher in 2L1 compared to L2L. Conclusion The coupling of longer reaction times to the recruitment of extra language-related brain areas supports the hypothesis that, when dealing with language conflicts, the specialization of bilinguals hampers their processing of nonverbal conflicts, at least at early stages in life. PMID:25328840

  18. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    NASA Astrophysics Data System (ADS)

    Li, Y. T.; Wittenberg, L. J.

    1992-09-01

    In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner, and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the utilization of multiple miners is recommended rather than increasing their size. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  19. An overview of AmeriFlux data products and methods for data acquisition, processing, and publication

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Poindexter, C.; Agarwal, D.; Papale, D.; van Ingen, C.; Torn, M. S.

    2014-12-01

    The AmeriFlux network encompasses independently managed field sites measuring ecosystem carbon, water, and energy fluxes across the Americas. In close coordination with ICOS in Europe, a new set of flux data and metadata products is being produced and released at the FLUXNET level, including all AmeriFlux sites. This will enable continued releases of a globally standardized set of flux data products. In this release, new formats, structures, and ancillary information are being proposed and adopted. This presentation discusses these aspects, detailing current and future solutions. One of the major revisions was to the BADM (Biological, Ancillary, and Disturbance Metadata) protocols. The updates include structure and variable changes to address new developments in data collection related to flux towers and to facilitate two-way data sharing. In particular, a new organization of templates is now in place, including changes in templates for biomass, disturbances, instrumentation, soils, and others. New variables and an extensive addition to the vocabularies used to describe BADM templates allow for a more flexible and comprehensive coverage of field sites and of the data collection methods and results. Another extensive revision is in the data formats, levels, and versions for flux and micrometeorological data. A new selection and revision of data variables and an integrated new definition of data processing levels allow for a more intuitive and flexible notation for the variety of data products. For instance, all variables now include positional information that is tied to BADM instrumentation descriptions. This allows for a better characterization of the spatial representativeness of data points, e.g., individual sensors or the tower footprint. Additionally, a new definition of data levels better characterizes the types of processing and transformations applied to the data across different dimensions (e.g., spatial representativeness of a data point, data quality checks

  20. Digital image processing: a primer for JVIR authors and readers: part 2: digital image acquisition.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-11-01

    This is the second installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first article of the series, we reviewed the fundamentals of digital image architecture. In this article, we describe the ways that an author can import digital images to the computer desktop. We explore the modern imaging network and explain how to import picture archiving and communications systems (PACS) images to the desktop. Options and techniques for producing digital hard copy film are also presented. PMID:14605101

  1. Technical drilling data acquisition and processing with an integrated computer system

    SciTech Connect

    Chevallier, J.J.; Quetier, F.P.; Marshall, D.W.

    1986-04-01

    Sedco Forex has developed an integrated computer system to enhance the technical performance of the company at various operational levels and to increase the understanding and knowledge of the drill crews. This paper describes the system and how it is used for recording and processing drilling data at the rig site, for associated technical analyses, and for well design, planning, and drilling performance studies at the operational centers. Some capabilities related to the statistical analysis of the company's operational records are also described, and future development of rig computing systems for drilling applications and management tasks is discussed.

  2. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon.

    PubMed

    Lieberman, Amy M; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I

    2015-07-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age of onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1) and in deaf adults who were late learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sublexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received. PMID:25528091

  3. Data acquisition and processing using noncontact/contact digitizing systems for reverse engineering

    NASA Astrophysics Data System (ADS)

    Motavalli, Saeid; Suharitdamrong, V.

    1994-03-01

    Reverse engineering is the process of creating an engineering design model for existing parts or prototypes. We have developed a reverse engineering system where data is acquired with a scanning system that combines noncontact and contact digitizing methods. The noncontact sensor is a PC-based vision system that views the part from orthogonal orientations and captures the boundary points of the object. The images are then vectorized and a 2D CAD drawing of the part is created. The contact probe is mounted on a CNC machine, which is then guided by the NC code based on the 2D drawings of the part and captures the 3D coordinates of the points inside the boundaries of the object. The 3D coordinates are then used by the surface-modeling module of the system to create a 3D CAD drawing of the part, which is presented in a commercial CAD system. By combining vision sensing with contact probing we achieved speed and accuracy in the data extraction process. This paper describes the elements of the system and the CAD modeling procedure.

  4. Optoelectronic/image processing module for enhanced fringe pattern acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Dymny, Grzegorz; Kujawinska, Malgorzata

    1996-08-01

    The paper introduces an optoelectronic/image processing module, OIMP, which enables more convenient implementation of full-field optical testing methods in industry. OIMP consists of two miniature CCD cameras and an optical wavefront modification system that recombines the beams produced by an opto-mechanical measurement system and images fringe patterns on the CCD matrices. The module makes possible the simultaneous registration of three monochromatic images as the R, G, B components of a color video signal by means of a single frame grabber or by VCR on video tape. This enables convenient and inexpensive storage of large quantities of data, which may be analyzed by the spatial carrier phase shifting method of automatic fringe pattern analysis. The usefulness of OIMP is shown by two examples: simultaneous analysis of u and v in-plane displacements in a grating interferometry system, and complex shape determination by fringe projection systems.

  5. Identification of potential biases in the characterization sampling and analysis process

    SciTech Connect

    Winkelman, W.D.; Eberlein, S.J.

    1995-12-01

    The Tank Waste Remediation System (TWRS) Characterization Project is responsible for providing quality characterization data to TWRS. Documentation of sampling and analysis process errors and biases can be used to improve the process that provides those data. The sampling and analysis process consists of removing a sample from a specified waste tank, transporting it to the laboratory, and analyzing it to provide the data identified in the Tank Characterization Plan (TCP) and Sampling and Analysis Plan (SAP). To understand the data fully, an understanding of the errors and biases that can be generated during the process is necessary. Most measurement systems have the ability to detect errors and biases statistically by using standards and alternate measurement techniques. Only the laboratory analysis part of the tank sampling and analysis process at TWRS has this ability. Therefore, it is necessary to use other methods to identify and prioritize the biases involved in the process.

  6. Acquisition and Analysis of Dynamic Responses of a Historic Pedestrian Bridge using Video Image Processing

    NASA Astrophysics Data System (ADS)

    O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; O'Donnell, Deirdre; Wright, Robert; Pakrashi, Vikram

    2015-07-01

    Video based tracking is capable of analysing bridge vibrations that are characterised by large amplitudes and low frequencies. This paper presents the use of video images and associated image processing techniques to obtain the dynamic response of a pedestrian suspension bridge in Cork, Ireland. This historic structure is one of the four suspension bridges in Ireland and is notable for its dynamic nature. A video camera is mounted on the river-bank and the dynamic responses of the bridge have been measured from the video images. The dynamic response is assessed without the need of a reflector on the bridge and in the presence of various forms of luminous complexities in the video image scenes. Vertical deformations of the bridge were measured in this regard. The video image tracking for the measurement of dynamic responses of the bridge was based on correlating patches in time-lagged scenes in video images and utilising a zero mean normalised cross-correlation (ZNCC) metric. The bridge was excited by designed pedestrian movement and by individual cyclists traversing the bridge. The time series data of dynamic displacement responses of the bridge were analysed to obtain the frequency domain response. Frequencies obtained from video analysis were checked against accelerometer data from the bridge obtained while carrying out the same set of experiments used for video image based recognition.
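    The ZNCC metric used to correlate patches across time-lagged frames can be sketched in a few lines of NumPy (an illustrative implementation, not the authors' code; the tracking loop that scans candidate offsets and keeps the best-scoring displacement is omitted):

    ```python
    import numpy as np

    def zncc(patch_a, patch_b):
        """Zero mean normalised cross-correlation between two equal-size
        image patches. Returns a score in [-1, 1]; 1 means the patches
        match up to an affine change in brightness and contrast, which
        makes the metric robust to varying illumination."""
        a = patch_a.astype(float) - patch_a.mean()
        b = patch_b.astype(float) - patch_b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        if denom == 0.0:  # a constant patch carries no texture to match
            return 0.0
        return float((a * b).sum() / denom)
    ```

    In tracking, the patch from the earlier frame is compared against shifted candidate windows in the later frame; the offset maximising the ZNCC score gives the displacement of the bridge between the two frames.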

  8. Acquisition and processing pitfall with clipped traces in surface-wave analysis

    NASA Astrophysics Data System (ADS)

    Gao, Lingli; Pan, Yudi

    2016-02-01

    Multichannel analysis of surface waves (MASW) is widely used in estimating near-surface shear (S)-wave velocity. In the MASW method, generating a reliable dispersion image in the frequency-velocity (f-v) domain is an important processing step. A locus along peaks of dispersion energy at different frequencies allows the dispersion curves to be constructed for inversion. When the offsets are short, the output seismic data may exceed the dynamic range of the geophones/seismograph, as a result of which peaks and (or) troughs of traces are squared off in the recorded shot gathers. Dispersion images generated from raw shot gathers with clipped traces are contaminated by artifacts, which might be misidentified as Rayleigh-wave phase velocities or body-wave velocities and potentially lead to incorrect results. We simulated several synthetic models containing clipped traces and analyzed the amplitude spectra of unclipped and clipped waves. The results indicate that artifacts in the dispersion image depend on the level of clipping. A real-world example also shows how clipped traces affect the dispersion image. All the results suggest that clipped traces should be removed from shot gathers before generating dispersion images, in order to pick accurate phase velocities and set reasonable initial inversion models.
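    The recommended preprocessing, removing clipped traces before computing the dispersion image, can be sketched as follows (function names and thresholds are illustrative; real data may need a tolerance matched to the seismograph's quantisation):

    ```python
    import numpy as np

    def clipped_fraction(trace, full_scale, tol=1e-6):
        """Fraction of samples sitting at the recorder's positive or
        negative full-scale value, i.e. squared-off peaks or troughs."""
        t = np.asarray(trace, dtype=float)
        return float(np.mean(np.abs(t) >= full_scale - tol))

    def find_clipped_traces(gather, full_scale, max_fraction=0.0):
        """Indices of traces in an (n_traces, n_samples) shot gather
        that should be dropped before generating a dispersion image."""
        return [i for i, trace in enumerate(gather)
                if clipped_fraction(trace, full_scale) > max_fraction]
    ```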

  9. Acquisition and processing of data for isotope-ratio-monitoring mass spectrometry

    NASA Technical Reports Server (NTRS)

    Ricci, M. P.; Merritt, D. A.; Freeman, K. H.; Hayes, J. M.

    1994-01-01

    Methods are described for continuous monitoring of signals required for precise analyses of 13C, 18O, and 15N in gas streams containing varying quantities of CO2 and N2. The quantitative resolution (i.e. maximum performance in the absence of random errors) of these methods is adequate for determination of isotope ratios with an uncertainty of one part in 10(5); the precision actually obtained is often better than one part in 10(4). This report describes data-processing operations including definition of beginning and ending points of chromatographic peaks and quantitation of background levels, allowance for effects of chromatographic separation of isotopically substituted species, integration of signals related to specific masses, correction for effects of mass discrimination, recognition of drifts in mass spectrometer performance, and calculation of isotopic delta values. Characteristics of a system allowing off-line revision of parameters used in data reduction are described and an algorithm for identification of background levels in complex chromatograms is outlined. Effects of imperfect chromatographic resolution are demonstrated and discussed, and an approach to deconvolution of signals from coeluting substances is described.
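    The last processing step listed, calculation of isotopic delta values, follows the standard definition: the per mil deviation of the sample's isotope ratio from a reference standard. A one-function sketch (the VPDB 13C/12C ratio below is used only as an illustrative reference value):

    ```python
    def delta_permil(ratio_sample, ratio_standard):
        """Isotopic delta value in per mil (parts per thousand): the
        relative deviation of the sample's isotope ratio (e.g. 13C/12C)
        from the ratio of an agreed standard."""
        return (ratio_sample / ratio_standard - 1.0) * 1000.0

    R_VPDB = 0.0112372  # 13C/12C of the VPDB standard, shown for illustration
    delta_13c = delta_permil(R_VPDB * 1.001, R_VPDB)  # sample enriched by ~1 per mil
    ```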

  10. eL-Chem Viewer: A Freeware Package for the Analysis of Electroanalytical Data and Their Post-Acquisition Processing

    PubMed Central

    Hrbac, Jan; Halouzka, Vladimir; Trnkova, Libuse; Vacek, Jan

    2014-01-01

    In electrochemical sensing, a number of voltammetric or amperometric curves are obtained which are subsequently processed, typically by evaluating peak currents and peak potentials or wave heights and half-wave potentials, frequently after background correction. Transformations of voltammetric data can help to extract specific information, e.g., the number of transferred electrons, and can reveal aspects of the studied electrochemical system, e.g., the contribution of adsorption phenomena. In this communication, we introduce a LabView-based software package, ‘eL-Chem Viewer’, for the analysis of voltammetric and amperometric data, which enables their post-acquisition processing using semiderivative, semiintegral, derivative, integral and elimination procedures. The software supports the single-click transfer of peak/wave current and potential data to spreadsheet software, a feature that greatly improves productivity when constructing calibration curves, trumpet plots and performing similar tasks. eL-Chem Viewer is freeware and can be downloaded from www.lchem.cz/elchemviewer.htm. PMID:25090415
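    As an illustration of the workflow the abstract describes (peak evaluation after background correction), a linear baseline drawn through two user-chosen points can be subtracted before reading off the peak current and potential. This sketch is not eL-Chem Viewer's code; names and the choice of a linear baseline are ours:

    ```python
    import numpy as np

    def subtract_linear_background(potential, current, i0, i1):
        """Subtract a straight-line background fitted through two
        baseline points (sample indices i0, i1) of a voltammogram; a
        common pre-step before evaluating the peak current."""
        slope = (current[i1] - current[i0]) / (potential[i1] - potential[i0])
        baseline = current[i0] + slope * (potential - potential[i0])
        return current - baseline

    def peak_current_and_potential(potential, corrected_current):
        """Peak height and peak potential of the corrected curve."""
        k = int(np.argmax(corrected_current))
        return float(corrected_current[k]), float(potential[k])
    ```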

  11. NREL Develops Accelerated Sample Activation Process for Hydrogen Storage Materials (Fact Sheet)

    SciTech Connect

    Not Available

    2010-12-01

    This fact sheet describes NREL's accomplishments in developing a new sample activation process that reduces the time to prepare samples for measurement of hydrogen storage from several days to five minutes and provides more uniform samples. Work was performed by NREL's Chemical and Materials Science Center.

  12. SAMPLING SYSTEM EVALUATION FOR HIGH-TEMPERATURE, HIGH-PRESSURE PROCESSES

    EPA Science Inventory

    The report describes a sampling system designed for the high temperatures and high pressures found in pressurized fluidized-bed combustors (PFBC). The system uses an extractive sampling approach, withdrawing samples from the process stream for complete analysis of particulate siz...

  13. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA processing documents. (a) This section...

  14. IECON '87: Signal acquisition and processing; Proceedings of the 1987 International Conference on Industrial Electronics, Control, and Instrumentation, Cambridge, MA, Nov. 3, 4, 1987

    NASA Astrophysics Data System (ADS)

    Niederjohn, Russell J.

    1987-01-01

    Theoretical and applications aspects of signal processing are examined in reviews and reports. Topics discussed include speech processing methods, algorithms, and architectures; signal-processing applications in motor and power control; digital signal processing; signal acquisition and analysis; and processing algorithms and applications. Consideration is given to digital coding of speech algorithms, an algorithm for continuous-time processes in discrete-time measurement, quantization noise and filtering schemes for digital control systems, distributed data acquisition for biomechanics research, a microcomputer-based differential distance and velocity measurement system, velocity observations from discrete position encoders, a real-time hardware image preprocessor, and recognition of partially occluded objects by a knowledge-based system.

  15. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  16. Perfluoroalkyl Acid Concentrations in Blood Samples Subjected to Transportation and Processing Delay

    PubMed Central

    Bach, Cathrine Carlsen; Henriksen, Tine Brink; Bossi, Rossana; Bech, Bodil Hammer; Fuglsang, Jens; Olsen, Jørn; Nohr, Ellen Aagaard

    2015-01-01

    Background In studies of perfluoroalkyl acids, the validity and comparability of measured concentrations may be affected by differences in the handling of biospecimens. We aimed to investigate whether measured plasma levels of perfluoroalkyl acids differed between blood samples subjected to delay and transportation prior to processing and samples with immediate processing and freezing. Methods Pregnant women recruited at Aarhus University Hospital, Denmark, (n = 88) provided paired blood samples. For each pair of samples, one was immediately processed and plasma was frozen, and the other was delayed and transported as whole blood before processing and freezing of plasma (similar to the Danish National Birth Cohort). We measured 12 perfluoroalkyl acids and present results for compounds with more than 50% of samples above the lower limit of quantification. Results For samples taken in the winter, relative differences between the paired samples ranged between -77 and +38% for individual perfluoroalkyl acids. In most cases concentrations were lower in the delayed and transported samples, e.g. the relative difference was -29% (95% confidence interval -30; -27) for perfluorooctane sulfonate. For perfluorooctanoate there was no difference between the two setups [corresponding estimate 1% (0, 3)]. Differences were negligible in the summer for all compounds. Conclusions Transport of blood samples and processing delay, similar to conditions applied in some large, population-based studies, may affect measured perfluoroalkyl acid concentrations, mainly when outdoor temperatures are low. Attention to processing conditions is needed in studies of perfluoroalkyl acid exposure in humans. PMID:26356420
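    The relative differences quoted in the Results (e.g. -29% for perfluorooctane sulfonate) follow the usual paired-sample definition, with the immediately processed sample as the reference. A one-line sketch with illustrative concentrations (not study data):

    ```python
    def relative_difference_pct(delayed, immediate):
        """Percent relative difference of a delayed/transported sample
        versus its immediately processed pair; negative values mean the
        delayed sample measured lower."""
        return 100.0 * (delayed - immediate) / immediate

    # Illustrative pair (ng/mL): immediate 10.0, delayed 7.1 -> -29%
    diff = relative_difference_pct(7.1, 10.0)
    ```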

  17. Since When or How Often? Dissociating the Roles of Age of Acquisition (AoA) and Lexical Frequency in Early Visual Word Processing

    ERIC Educational Resources Information Center

    Adorni, Roberta; Manfredi, Mirella; Proverbio, Alice Mado

    2013-01-01

    The aim of the study was to investigate the effect of both word age of acquisition (AoA) and frequency of occurrence on the timing and topographical distribution of ERP components. The processing of early- versus late-acquired words was compared with that of high-frequency versus low-frequency words. Participants were asked to perform an…

  18. NMDA Receptor-Dependent Processes in the Medial Prefrontal Cortex Are Important for Acquisition and the Early Stage of Consolidation during Trace, but Not Delay Eyeblink Conditioning

    ERIC Educational Resources Information Center

    Takehara-Nishiuchi, Kaori; Kawahara, Shigenori; Kirino, Yutaka

    2005-01-01

    Permanent lesions in the medial prefrontal cortex (mPFC) affect acquisition of conditioned responses (CRs) during trace eyeblink conditioning and retention of remotely acquired CRs. To clarify further roles of the mPFC in this type of learning, we investigated the participation of the mPFC in mnemonic processes both during and after daily…

  19. Floodnet: a telenetwork for acquisition, processing and dissemination of earth observation data for monitoring and emergency management of floods

    NASA Astrophysics Data System (ADS)

    Blyth, Ken

    1997-08-01

    The aim of FLOODNET is to provide a communications and data distribution facility specifically designed to meet the demanding temporal requirements of flood monitoring within the European Union (EU). Currently, remotely sensed data are not fully utilized for flood applications because potential users are not familiar with the procedure for acquiring the data and do not have a defined route for obtaining help in processing and interpreting the data. FLOODNET will identify the potential user groups within the EU and will, by demonstration, education and the use of telematics, increase the awareness of users to the capabilities of earth observation (EO) and the means by which they can acquire EO data. FLOODNET will act as a filter between users and satellite operation planners to help assign priorities for data acquisition against previously agreed criteria. The network will encourage a user community and will facilitate cross-sector information transfer, particularly between flood experts and administrative decision makers. The requirement for two levels of flood mapping is identified: (1) a rapid, broad-brush approach to assess the general flood situation and identify areas at greatest risk and in need of immediate assistance; (2) a detailed mapping approach, less critical in time, suitable for input to hydrological models or for flood risk evaluation. A likely networking technology is outlined, the basic functionality of a FLOODNET demonstrator is described and some of the economic benefits of the network are identified.

  20. Integrated Processing of High Resolution Topographic Data for Soil Erosion Assessment Considering Data Acquisition Schemes and Surface Properties

    NASA Astrophysics Data System (ADS)

    Eltner, A.; Schneider, D.; Maas, H.-G.

    2016-06-01

    Soil erosion is a decisive earth surface process strongly influencing the fertility of arable land. Several options exist to detect soil erosion at the scale of large field plots (here 600 m²), each with different advantages and disadvantages. In this study, the benefits of unmanned aerial vehicle (UAV) photogrammetry and terrestrial laser scanning (TLS) are exploited to quantify soil surface changes. Before data combination, TLS data are co-registered to the DEMs generated with UAV photogrammetry. TLS data are used to detect global as well as local errors in the DEMs calculated from UAV images. Additionally, TLS data are considered for vegetation filtering. Complementarily, DEMs from UAV photogrammetry are utilised to detect systematic TLS errors and to further filter TLS point clouds with regard to unfavourable scan geometry (i.e. incidence angle and footprint) on gentle hillslopes. In addition, surface roughness is integrated as an important parameter for evaluating TLS point reliability, because the footprint, and thus the area of signal reflection, increases with distance from the scanning device. The developed fusion tool allows for the estimation of reliable data points from each data source, considering the data acquisition geometry and surface properties, to finally merge both data sets into a single soil surface model. Data fusion is performed for three field campaigns at a Mediterranean field plot. Successive DEM evaluation reveals a continuous decrease in soil surface roughness, the reappearance of former wheel tracks, and local soil particle relocation patterns.
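    The scan-geometry quantities used for point filtering, footprint growth with range and incidence angle, can be approximated with a simple model. This is a sketch under simplifying assumptions (linear beam divergence, locally planar surface), not the authors' fusion tool; the weighting scheme is ours and purely illustrative:

    ```python
    import math

    def tls_footprint_m(range_m, divergence_rad, incidence_deg, aperture_m=0.0):
        """Approximate long axis of the laser footprint on the surface.
        The beam diameter grows linearly with range and is stretched by
        1/cos(incidence) on tilted ground (0 deg = perpendicular hit)."""
        beam_diameter = aperture_m + range_m * divergence_rad
        return beam_diameter / math.cos(math.radians(incidence_deg))

    def point_weight(range_m, divergence_rad, incidence_deg, ref_footprint_m):
        """Illustrative reliability weight in (0, 1]: shrinks as the
        footprint grows beyond a reference footprint (e.g. the one at
        short range and near-perpendicular incidence)."""
        footprint = tls_footprint_m(range_m, divergence_rad, incidence_deg)
        return min(1.0, ref_footprint_m / footprint)
    ```

    With a 0.3 mrad beam, for example, the footprint is a few millimetres at 10 m but grows rapidly on gentle hillslopes, where grazing incidence angles dominate at long range.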

  1. Variation in the application of natural processes: language-dependent constraints in the phonological acquisition of bilingual children.

    PubMed

    Faingold, E D

    1996-09-01

    This paper studies phonological processes and constraints on early phonological and lexical development, as well as the strategies employed by a young Spanish-, Portuguese-, and Hebrew-speaking child, Nurit (the author's niece), in the construction of her early lexicon. Nurit's linguistic development is compared to that of another Spanish-, Portuguese-, and Hebrew-speaking child, Noam (the author's son). Noam's and Nurit's linguistic development is contrasted with that of Berman's (1977) English- and Hebrew-speaking daughter (Shelli). The simultaneous acquisition of similar (closely related) languages such as Spanish and Portuguese versus that of nonrelated languages such as English and Hebrew yields different results: children acquiring similar languages seem to prefer maintenance as a strategy for the construction of their early lexicon, while children exposed to nonrelated languages appear to prefer reduction to a large extent (Faingold, 1990). The Spanish- and Portuguese-speaking children's high accuracy stems from a wider choice of target words, where the diachronic development of two closely related languages provides a simplified model lexicon to the child. PMID:8865623

  2. Evaluation of process errors in bed load sampling using a dune model

    USGS Publications Warehouse

    Gomez, B.; Troutman, B.M.

    1997-01-01

    Reliable estimates of the streamwide bed load discharge obtained using sampling devices are dependent upon good at-a-point knowledge across the full width of the channel. Using field data and information derived from a model that describes the geometric features of a dune train in terms of a spatial process observed at a fixed point in time, we show that sampling errors decrease as the number of samples collected increases, and the number of traverses of the channel over which the samples are collected increases. It also is preferable that bed load sampling be conducted at a pace which allows a number of bed forms to pass through the sampling cross section. The situations we analyze and simulate pertain to moderate transport conditions in small rivers. In such circumstances, bed load sampling schemes typically should involve four or five traverses of a river, and the collection of 20-40 samples at a rate of five or six samples per hour. By ensuring that spatial and temporal variability in the transport process is accounted for, such a sampling design reduces both random and systematic errors and hence minimizes the total error involved in the sampling process.
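    The qualitative conclusion, sampling error shrinking as more samples are spread over passing bed forms, can be illustrated with a toy Monte Carlo experiment. The sinusoid-plus-noise transport signal below is a stand-in for the authors' dune model, so all numbers are illustrative only:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def mean_rate_error_sd(n_samples, n_trials=2000):
        """Standard deviation of the error in the estimated mean
        transport rate when at-a-point bed load fluctuates as dunes
        pass. Samples are spread over ~4 dune periods, so temporal
        variability averages out as n_samples grows."""
        errors = np.empty(n_trials)
        for k in range(n_trials):
            phase = rng.uniform(0.0, 2.0 * np.pi)
            t = rng.uniform(0.0, 8.0 * np.pi, n_samples)  # ~4 dune periods
            rates = 1.0 + 0.8 * np.sin(t + phase) + rng.normal(0.0, 0.2, n_samples)
            errors[k] = rates.mean() - 1.0                # true mean rate is 1.0
        return float(errors.std())
    ```

    Comparing `mean_rate_error_sd(5)` with `mean_rate_error_sd(40)` reproduces the qualitative result: the error in the estimated mean falls as the number of samples collected over several bed forms increases.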

  3. Simultaneous Acquisition of 2D and 3D Solid-State NMR Experiments for Sequential Assignment of Oriented Membrane Protein Samples

    PubMed Central

    Gopinath, T.; Mote, Kaustubh R; Veglia, Gianluigi

    2016-01-01

    We present a new method called DAISY (Dual Acquisition orIented ssNMR spectroScopY) for the simultaneous acquisition of 2D and 3D oriented solid-state NMR experiments for membrane proteins reconstituted in mechanically or magnetically aligned lipid bilayers. DAISY utilizes dual acquisition of sine and cosine dipolar or chemical shift coherences and long living 15N longitudinal polarization to obtain two multi-dimensional spectra, simultaneously. In these new experiments, the first acquisition gives the polarization inversion spin exchange at the magic angle (PISEMA) or heteronuclear correlation (HETCOR) spectra, the second acquisition gives PISEMA-mixing or HETCOR-mixing spectra, where the mixing element enables inter-residue correlations through 15N-15N homonuclear polarization transfer. The analysis of the two 2D spectra (first and second acquisitions) enables one to distinguish 15N-15N inter-residue correlations for sequential assignment of membrane proteins. DAISY can be implemented in 3D experiments that include the polarization inversion spin exchange at magic angle via I spin coherence (PISEMAI) sequence, as we show for the simultaneous acquisition of 3D PISEMAI-HETCOR and 3D PISEMAI-HETCOR-mixing experiments. PMID:25749871

  4. Simultaneous acquisition of 2D and 3D solid-state NMR experiments for sequential assignment of oriented membrane protein samples.

    PubMed

    Gopinath, T; Mote, Kaustubh R; Veglia, Gianluigi

    2015-05-01

    We present a new method called DAISY (Dual Acquisition orIented ssNMR spectroScopY) for the simultaneous acquisition of 2D and 3D oriented solid-state NMR experiments for membrane proteins reconstituted in mechanically or magnetically aligned lipid bilayers. DAISY utilizes dual acquisition of sine and cosine dipolar or chemical shift coherences and long living (15)N longitudinal polarization to obtain two multi-dimensional spectra, simultaneously. In these new experiments, the first acquisition gives the polarization inversion spin exchange at the magic angle (PISEMA) or heteronuclear correlation (HETCOR) spectra, the second acquisition gives PISEMA-mixing or HETCOR-mixing spectra, where the mixing element enables inter-residue correlations through (15)N-(15)N homonuclear polarization transfer. The analysis of the two 2D spectra (first and second acquisitions) enables one to distinguish (15)N-(15)N inter-residue correlations for sequential assignment of membrane proteins. DAISY can be implemented in 3D experiments that include the polarization inversion spin exchange at magic angle via I spin coherence (PISEMAI) sequence, as we show for the simultaneous acquisition of 3D PISEMAI-HETCOR and 3D PISEMAI-HETCOR-mixing experiments. PMID:25749871

  5. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput remains a major issue and automation is essential. Throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis, to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented, and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. The efforts described in this paper are intended to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards critically needed increases in throughput. PMID:26520383

  6. Understanding scaling through history-dependent processes with collapsing sample space.

    PubMed

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-04-28

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ~ x(-λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks, or aging processes such as in fragmentation processes. SSR processes provide a new alternative to understand the origin of scaling in complex systems without the recourse to multiplicative, preferential, or self-organized critical processes. PMID:25870294
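
    The core SSR mechanism is easy to simulate. The sketch below is not the authors' code (the state count and run count are arbitrary): each run starts in the highest of N states and repeatedly jumps to a uniformly chosen lower state, so the sample space collapses until state 1 is reached. The fraction of runs visiting state i then approaches the Zipf form 1/i.

```python
import random
from collections import Counter

def ssr_visit_counts(n_states=1000, n_runs=20000, seed=42):
    """Visit counts for repeated sample-space-reducing (SSR) runs.

    Each run starts in the highest state and jumps to a state drawn
    uniformly from those strictly below the current one, so the sample
    space collapses until state 1 is reached.
    """
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n_runs):
        state = n_states
        while state > 1:
            state = rng.randint(1, state - 1)  # sample space shrinks
            counts[state] += 1
    return counts

counts = ssr_visit_counts()
# The fraction of runs visiting state i approaches 1/i (Zipf's law):
for i in (1, 2, 4, 8):
    print(i, counts[i] / 20000)
```

    State 1 is reached in every run, state 2 in about half the runs, state 4 in about a quarter, and so on, reproducing the rank distribution the paper derives.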

  7. Understanding scaling through history-dependent processes with collapsing sample space

    PubMed Central

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-01-01

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf’s law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x)∼x−λ, where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α=2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf’s law in word frequencies, and how they are related to diffusion processes in directed networks, or aging processes such as in fragmentation processes. SSR processes provide a new alternative to understand the origin of scaling in complex systems without the recourse to multiplicative, preferential, or self-organized critical processes. PMID:25870294

  8. SAMPLING AND ANALYSIS OF REDUCED AND OXIDIZED SPECIES IN PROCESS STREAMS

    EPA Science Inventory

    The report gives results of a program of over 60 tasks involving the evaluation, development, testing, and field adaptation of measurement techniques for elemental analysis and inorganic compound identification in process and effluent streams. Procedures for particulate sampling ...

  9. Modular tube/plate-based sample management: a business model optimized for scalable storage and processing.

    PubMed

    Fillers, W Steven

    2004-12-01

    Modular approaches to sample management allow staged implementation and progressive expansion of libraries within existing laboratory space. A completely integrated, inert atmosphere system for the storage and processing of a variety of microplate and microtube formats is currently available as an integrated series of individual modules. Liquid handling for reformatting and replication into microplates, plus high-capacity cherry picking, can be performed within the inert environmental envelope to maximize compound integrity. Complete process automation provides on-demand access to samples and improved process control. Expansion of such a system provides a low-risk tactic for implementing a large-scale storage and processing system. PMID:15674027

  10. Mars sample return - Science

    NASA Technical Reports Server (NTRS)

    Blanchard, Douglas P.

    1988-01-01

    The possible scientific goals of a Mars sample return mission are reviewed, including the value of samples and the selection of sampling sites. The fundamental questions about Mars which could be studied using samples are examined, including planetary formation, differentiation, volcanism and petrogenesis, weathering, and erosion. Scenarios are presented for sample acquisition and analysis. Possible sampling methods and tools are discussed, including drilling techniques, types of rovers, and processing instruments. In addition, the possibility of aerocapture out of elliptical or circular orbit is considered.

  11. COMPARISON OF METHODS FOR SAMPLING BACTERIA AT SOLID WASTE PROCESSING FACILITIES

    EPA Science Inventory

    The report is an assessment of the field sampling methodologies used to measure concentrations of airborne bacteria and viruses in and around waste handling and processing facilities. The sampling methods are discussed as well as the problems encountered and subsequent changes ma...

  12. EFFECT OF SAMPLE PROCESSING PROCEDURES ON MEASUREMENT OF STARCH IN CORN SILAGE AND CORN GRAIN

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methods for processing feedstuffs before analysis can affect analytical results. The effects of drying temperature and grinding method on starch analysis of corn silage and of grinding method on corn grain were evaluated. Corn silage samples dried at 55°C or 105°C and grain samples dried at 55°C w...

  13. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA...

  14. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA...

  15. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA...

  16. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA...

  17. DigiFract: A software and data model implementation for flexible acquisition and processing of fracture data from outcrops

    NASA Astrophysics Data System (ADS)

    Hardebol, N. J.; Bertotti, G.

    2013-04-01

    This paper presents the development and use of our new DigiFract software, designed for acquiring fracture data from outcrops more efficiently and more completely than is done with other methods. Fracture surveys often aim at measuring spatial information (such as spacing) directly in the field. Instead, DigiFract focuses on collecting geometries and attributes and derives spatial information through subsequent analyses. Our primary development goal was to support field acquisition in a systematic digital format optimized for a varied range of (spatial) analyses. DigiFract is developed using the programming interface of the Quantum Geographic Information System (GIS), with versatile functionality for spatial raster and vector data handling. Among other features, this includes spatial referencing of outcrop photos, and tools for digitizing geometries and assigning attribute information through a graphical user interface. While a GIS typically operates in map-view, DigiFract collects features on a surface of arbitrary orientation in 3D space. This surface is overlain with an outcrop photo and serves as the reference frame for digitizing geologic features. Data is managed through a data model and stored in shapefiles or in a spatial database system. Fracture attributes, such as spacing or length, are intrinsic information of the digitized geometry and become explicit through follow-up data processing. Orientation statistics, scan-line or scan-window analyses can be performed from the graphical user interface or can be obtained through flexible Python scripts that directly access the fractdatamodel and analysisLib core modules of DigiFract. This workflow has been applied in various studies and enabled a faster collection of larger and more accurate fracture datasets. The studies delivered a better characterization of fractured reservoir analogues in terms of fracture orientation and intensity distributions. Furthermore, the data organisation and analyses provided more
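
    As an illustration of deriving spatial statistics from digitized geometries rather than measuring them in the field, here is a generic scan-line sketch. It does not use DigiFract's actual fractdatamodel or analysisLib APIs (which are not documented here); the fracture traces are hypothetical 2D segments, and the scan-line yields spacings plus a P10 intensity (fractures per unit length).

```python
def scanline_spacing(fractures, y0=0.0, x_range=(0.0, 10.0)):
    """Derive fracture spacing and P10 intensity from digitized traces.

    fractures: list of 2D trace segments ((x1, y1), (x2, y2)).
    A horizontal scan-line at y=y0 is intersected with each trace;
    spacing is the distance between successive intersections.
    """
    xs = []
    for (x1, y1), (x2, y2) in fractures:
        if (y1 - y0) * (y2 - y0) <= 0 and y1 != y2:  # trace crosses y0
            t = (y0 - y1) / (y2 - y1)
            x = x1 + t * (x2 - x1)                   # intersection point
            if x_range[0] <= x <= x_range[1]:
                xs.append(x)
    xs.sort()
    spacings = [b - a for a, b in zip(xs, xs[1:])]
    p10 = len(xs) / (x_range[1] - x_range[0])  # fractures per unit length
    return spacings, p10

# Three hypothetical sub-vertical traces crossing the scan-line:
traces = [((1.0, -1.0), (1.0, 1.0)),
          ((4.0, -2.0), (4.2, 2.0)),
          ((9.0, -1.0), (8.8, 1.0))]
spacings, p10 = scanline_spacing(traces)
print(spacings, p10)
```

    The same digitized geometries could be re-analyzed along any number of scan-lines, which is the advantage over recording a single spacing value in the field.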

  18. CHARACTERIZATION OF TANK 11H AND TANK 51H POST ALUMINUM DISSOLUTION PROCESS SAMPLES

    SciTech Connect

    Hay, M; Daniel McCabe, D

    2008-05-16

    A dip sample of the liquid phase from Tank 11H and a 3-L slurry sample from Tank 51H were obtained and sent to Savannah River National Laboratory for characterization. These samples provide data to verify the amount of aluminum dissolved from the sludge as a result of the low temperature aluminum dissolution process conducted in Tank 51H. The characterization results for the as-received Tank 11H and Tank 51H supernate samples and the total dried solids of the Tank 51H sludge slurry sample appear quite good with respect to the precision of the sample replicates and minimal contamination present in the blank. The two supernate samples show similar concentrations for the major components as expected.

  19. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally, acquisitions initiatives of the NASA STI program were presented.

  20. Transition of NOAA's GPS-Met Data Acquisition and Processing System to the Commercial Sector: Initial Results

    NASA Astrophysics Data System (ADS)

    Jackson, Michael; Blatt, Stephan; Holub, Kirk

    2015-04-01

    In April of 2014, NOAA/OAR/ESRL Global Systems Division (GSD) and Trimble, in collaboration with Earth Networks, Inc. (ENI), signed a Cooperative Research and Development Agreement (CRADA) to transfer the existing NOAA GPS-Met Data Acquisition and Processing System (GPS-Met DAPS) technology to a commercial Trimble/ENI partnership. NOAA's GPS-Met DAPS is currently operated in a pseudo-operational mode but has proven highly reliable, running at over 95% uptime. The DAPS uses the GAMIT software to ingest dual-frequency carrier phase GPS/GNSS observations and ancillary information such as real-time satellite orbits to estimate the zenith-scaled tropospheric signal delays (ZTD) and, where surface MET data are available, retrieve integrated precipitable water vapor (PWV). The NOAA data and products are made available to end users in near real time. The Trimble/ENI partnership will use the Trimble Pivot™ software with the Atmosphere App to calculate zenith tropospheric delay (ZTD), tropospheric slant delay, and integrated precipitable water vapor (PWV). Evaluation of the Trimble software is underway, starting with a comparison of ZTD and PWV values determined from four subnetworks of GPS stations located: 1. near NOAA Radiosonde Observation (Upper-Air Observation) launch sites; 2. stations with low terrain/high moisture variability (Gulf Coast); 3. stations with high terrain/low moisture variability (Southern California); and 4. stations with high terrain/high moisture variability (elev. > 1000 m). For each network, GSD and T/ENI run the same stations for 30 days, compare results, and evaluate the long-term solution accuracy, precision, and reliability. Metrics for success include T/ENI PWV estimates within 1.5 mm of ESRL/GSD's estimates 95% of the time (ZTD uncertainty of less than 10 mm 95% of the time). The threshold for allowable variations in ZTD between NOAA GPS-Met and T/ENI processing is 10 mm. The CRADA 1&2 Trimble processing
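
    For readers unfamiliar with the retrieval chain, the ZTD-to-PWV step can be sketched with textbook GPS-meteorology relations: a Saastamoinen-style zenith hydrostatic delay from surface pressure, and a Bevis-style dimensionless conversion factor of roughly 0.15. This is a simplified illustration using standard literature constants, not NOAA's or Trimble's actual processing; the station values at the end are hypothetical.

```python
import math

def zhd_saastamoinen(pressure_hpa, lat_deg, height_m):
    """Zenith hydrostatic delay (m) from surface pressure (Saastamoinen)."""
    f = (1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg))
         - 0.00028e-3 * height_m)
    return 0.0022768 * pressure_hpa / f

def pwv_from_ztd(ztd_m, pressure_hpa, lat_deg, height_m, tm_kelvin=270.0):
    """Convert zenith total delay (m) to precipitable water vapor (mm)."""
    zwd = ztd_m - zhd_saastamoinen(pressure_hpa, lat_deg, height_m)
    k2_prime = 0.221   # K / Pa   (Bevis-style refractivity constant)
    k3 = 3.739e3       # K^2 / Pa
    rho_w = 1000.0     # kg / m^3, density of liquid water
    r_v = 461.5        # J / (kg K), gas constant of water vapor
    pi_factor = 1e6 / (rho_w * r_v * (k3 / tm_kelvin + k2_prime))  # ~0.15
    return 1000.0 * pi_factor * zwd

# Hypothetical station: ZTD = 2.4 m, 1013.25 hPa, 40 deg N, sea level
print(round(pwv_from_ztd(2.4, 1013.25, 40.0, 0.0), 1))  # ~14 mm of PWV
```

    The roughly 0.15 conversion factor is why the 1.5 mm PWV agreement metric and the 10 mm ZTD threshold quoted above are consistent with each other.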

  1. Unexpected toxicity to aquatic organisms of some aqueous bisphenol A samples treated by advanced oxidation processes.

    PubMed

    Tišler, Tatjana; Erjavec, Boštjan; Kaplan, Renata; Şenilă, Marin; Pintar, Albin

    2015-01-01

    In this study, photocatalytic and catalytic wet-air oxidation (CWAO) processes were used to examine the removal efficiency of bisphenol A (BPA) from aqueous samples over several titanate nanotube-based catalysts. Unexpected toxicity to some of the tested species was observed for BPA samples treated by means of the CWAO process. In addition, the CWAO effluent was recycled five- or 10-fold in order to increase the number of interactions between the liquid phase and the catalyst. Consequently, inductively coupled plasma mass spectrometry (ICP-MS) analysis indicated higher concentrations of some toxic metals, such as chromium, nickel, molybdenum, silver, and zinc, in the recycled samples in comparison to both the single-pass sample and the photocatalytically treated solution. The highest toxicity of the five- and 10-fold recycled solutions in the CWAO process was observed in water fleas, which could be correlated to the high concentrations of chromium, nickel, and silver detected in the tested samples. The obtained results clearly demonstrate that aqueous samples treated by means of advanced oxidation processes should always be analyzed using (i) chemical analyses to assess removal of BPA and total organic carbon from the treated aqueous samples, as well as (ii) a battery of aquatic organisms from different taxonomic groups to determine possible toxicity. PMID:26114268

  2. Capillary absorption spectrometer and process for isotopic analysis of small samples

    DOEpatents

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2016-03-29

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable absorption measurements of analytes in a sample gas that may include isotopologues of carbon and oxygen obtained from gas and biological samples. It further provides isotopic images of microbial communities that allow tracking of nutrients at the single-cell level. It targets naturally occurring variations in carbon and oxygen isotopes, which avoids the need for expensive isotopically labeled mixtures and allows study of samples taken from the field without modification. The method also permits sampling in vivo, enabling real-time ambient studies of microbial communities.

  3. First Language Acquisition and Teaching

    ERIC Educational Resources Information Center

    Cruz-Ferreira, Madalena

    2011-01-01

    "First language acquisition" commonly means the acquisition of a single language in childhood, regardless of the number of languages in a child's natural environment. Language acquisition is variously viewed as predetermined, wondrous, a source of concern, and as developing through formal processes. "First language teaching" concerns schooling in…

  4. Three-Step Validation of Exercise Behavior Processes of Change in an Adolescent Sample

    ERIC Educational Resources Information Center

    Rhodes, Ryan E.; Berry, Tanya; Naylor, Patti-Jean; Higgins, S. Joan Wharf

    2004-01-01

    Though the processes of change are conceived as the core constructs of the transtheoretical model (TTM), few researchers have examined their construct validity in the physical activity domain. Further, only 1 study was designed to investigate the processes of change in an adolescent sample. The purpose of this study was to examine the exercise…

  5. Effect of histologic processing on dimensions of skin samples obtained from cat cadavers.

    PubMed

    Jeyakumar, Sakthila; Smith, Annette N; Schleis, Stephanie E; Cattley, Russell C; Tillson, D Michael; Henderson, Ralph A

    2015-11-01

    OBJECTIVE To determine changes in dimensions of feline skin samples as a result of histologic processing and to identify factors that contributed to changes in dimensions of skin samples after sample collection. SAMPLE Cadavers of 12 clinically normal cats. PROCEDURES Skin samples were obtained bilaterally from 3 locations (neck, thorax, and tibia) of each cadaver; half of the thoracic samples included underlying muscle. Length, width, and depth were measured at 5 time points (before excision, after excision, after application of ink to mark tissue margins, after fixation in neutral-buffered 10% formalin for 36 hours, and after completion of histologic processing and staining with H&E stain). Measurements obtained after sample collection were compared with measurements obtained before excision. RESULTS At the final time point, tissue samples had decreased in length (mean decrease, 32.40%) and width (mean decrease, 34.21%) and increased in depth (mean increase, 54.95%). Tissue from the tibia had the most shrinkage in length and width and that from the neck had the least shrinkage. Inclusion of underlying muscle on thoracic skin samples did not affect the degree of change in dimensions. CONCLUSIONS AND CLINICAL RELEVANCE In this study, each step during processing from excision to formalin fixation and histologic processing induced changes in tissue dimensions, which were manifested principally as shrinkage in length and width and an increase in depth. Most of the changes occurred during histologic processing. Inclusion of muscle did not affect thoracic skin shrinkage. Shrinkage should be a consideration when interpreting surgical margins in clinical cases. PMID:26512538
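
    The reported mean shrinkage figures imply a simple back-calculation when relating a slide measurement to the in situ dimension. The sketch below uses the study's mean length shrinkage of 32.40% and is only an approximation, since shrinkage varied by sampling site and axis.

```python
def original_length(measured_mm, shrinkage_fraction):
    """Back-calculate an in situ dimension from a processed-slide
    measurement, assuming uniform linear shrinkage."""
    return measured_mm / (1.0 - shrinkage_fraction)

# With the study's mean length shrinkage of 32.40%, a 10 mm margin
# measured on the slide corresponds to roughly 14.8 mm in situ:
print(round(original_length(10.0, 0.3240), 1))
```
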

  6. Study on Electro-Polishing Process by Niobium-Plate Sample With Artificial Pits

    SciTech Connect

    T. Saeki, H. Hayano, S. Kato, M. Nishiwaki, M. Sawabe, W.A. Clemens, R.L. Geng, R. Manus, P.V. Tyagi

    2011-07-01

    The electropolishing (EP) process is the leading candidate for the final surface treatment in the production of ILC cavities. Nevertheless, the development of defects on the inner surface of a superconducting RF cavity during the EP process has not been studied by experimental methods. We made artificial pits on the surface of a Nb-plate sample and observed the development of the pit shapes after each step of a 30 um EP process, in which 120 um was removed by EP in total. This article describes the results of this EP test of a Nb sample with artificial pits.

  7. The Importance of Sample Processing in Analysis of Asbestos Content in Rocks and Soils

    NASA Astrophysics Data System (ADS)

    Neumann, R. D.; Wright, J.

    2012-12-01

    Analysis of asbestos content in rocks and soils using Air Resources Board (ARB) Test Method 435 (M435) involves the processing of samples for subsequent analysis by polarized light microscopy (PLM). The use of different equipment and procedures by commercial laboratories to pulverize rock and soil samples could result in different particle size distributions. It has long been theorized that asbestos-containing samples can be over-pulverized to the point where the particle dimensions of the asbestos no longer meet the required 3:1 length-to-width aspect ratio or the particles become so small that they no longer can be tested for optical characteristics using PLM, where maximum PLM magnification is typically 400X. Recent work has shed some light on this issue. ARB staff conducted an interlaboratory study to investigate variability in preparation and analytical procedures used by laboratories performing M435 analysis. With regard to sample processing, ARB staff found that different pulverization equipment and processing procedures produced powders that have varying particle size distributions. PLM analysis of the finest powders produced by one laboratory showed all but one of the 12 samples were non-detect or below the PLM reporting limit; in contrast to the other 36 coarser samples from the same field sample and processed by three other laboratories, where 21 samples were above the reporting limit. The set of 12 exceptionally fine powder samples produced by the same laboratory was re-analyzed by transmission electron microscopy (TEM), and results showed that these samples contained asbestos above the TEM reporting limit. However, the use of TEM as a stand-alone analytical procedure, usually performed at magnifications between 3,000 to 20,000X, also has its drawbacks because of the miniscule mass of sample that this method examines. The small amount of powder analyzed by TEM may not be representative of the field sample. The actual mass of the sample powder analyzed by

  8. Down sampled signal processing for a B Factory bunch-by-bunch feedback system

    SciTech Connect

    Hindi, H.; Hosseini, W.; Briggs, D.; Fox, J.; Hutton, A.

    1992-03-01

    A bunch-by-bunch feedback scheme is studied for damping coupled bunch synchrotron oscillations in the proposed PEP II B Factory. The quasi-linear feedback system design incorporates a phase detector to provide a quantized measure of bunch phase, digital signal processing to compute an error correction signal, and a kicker system to correct the energy of the bunches. A farm of digital processors, operating in parallel, is proposed to compute correction signals for the 1658 bunches of the B Factory. This paper studies the use of down-sampled processing to reduce the computational complexity of the feedback system. We present simulation results showing the effect of down sampling on beam dynamics. Results show that down-sampled processing can reduce the scale of the processing task by a factor of 10.
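
    Why down-sampling works here can be sketched in a few lines: the synchrotron oscillation period spans many machine turns, so updating the correction only every tenth turn (and holding it in between) still tracks the bunch phase closely while cutting the per-bunch processing load tenfold. The signal model and numbers below are illustrative, not the actual PEP II parameters or feedback filter.

```python
import math

def downsampled_hold(num_turns=3000, sync_period=1000.0, decimation=10):
    """Track a slow synchrotron phase oscillation at 1/decimation rate.

    The bunch phase advances only a little each turn (oscillation
    period >> revolution time), so a value computed every
    `decimation`-th turn and held in between stays close to the
    full-rate signal.
    """
    held = 0.0
    ops = 0        # number of processing updates actually performed
    max_err = 0.0  # worst zero-order-hold tracking error
    for turn in range(num_turns):
        phase = math.sin(2.0 * math.pi * turn / sync_period)  # bunch phase
        if turn % decimation == 0:
            held = phase   # down-sampled processing updates here
            ops += 1
        max_err = max(max_err, abs(phase - held))
    return ops, max_err

ops, err = downsampled_hold()
print(ops, round(err, 3))  # 10x fewer updates, small hold error
```

    With a 1000-turn oscillation period and decimation by 10, the hold error stays below a few percent of the oscillation amplitude, which is the intuition behind the factor-of-10 reduction quoted in the abstract.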

  9. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY'S (DWPF) PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 7A QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Edwards, T.; Jones, M.; Wiedenman, B.

    2011-03-14

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO{sub 3} acid dissolution (i.e., DWPF Cold Chem Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestions of Sludge Batch 7a (SB7a) SRAT Receipt and SB7a SRAT Product samples. The SB7a SRAT Receipt and SB7a SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB7a Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 6 (SB6), to form the SB7a Blend composition.

  10. How can wireless, mobile data acquisition be used for taking part of the lab to the sample, and how can it join the internet of things?

    NASA Astrophysics Data System (ADS)

    Trzcinski, Peter; Karanassios, Vassili

    2016-05-01

    During the last several years, the world has moved from wired communications (e.g., wired Ethernet, wired telephones) to wireless communications (e.g., cell phones, smartphones, tablets). However, data acquisition has lagged behind, and for the most part data in laboratory settings are still acquired using wired communications (or even plug-in boards). In this paper, approaches that can be used for wireless data acquisition are briefly discussed using a conceptual model of a future mobile, portable micro-instrument as an example. In addition, past, present, and near-future generations of communications are discussed; processors, operating systems, and benchmarks are reviewed; networks that may be used for data acquisition in the field are examined; and the possibility of connecting sensor or micro-instrument networks to the internet of things is postulated.

  11. Technical Note: Sampling and processing of mesocosm sediment trap material for quantitative biogeochemical analysis

    NASA Astrophysics Data System (ADS)

    Boxhammer, T.; Bach, L. T.; Czerny, J.; Riebesell, U.

    2015-11-01

    Sediment traps are the most common tool used to investigate vertical particle flux in the marine realm. However, the spatial decoupling between particle formation and collection often handicaps reconciliation of these two processes, even within the euphotic zone. Pelagic mesocosms have the advantage of being closed systems and are therefore ideally suited to studying how processes in natural plankton communities influence particle formation and settling in the ocean's surface layer. We therefore developed a protocol for efficient sample recovery and processing of quantitatively collected pelagic mesocosm sediment trap samples. Sedimented material was recovered by pumping it under gentle vacuum through a silicon tube to the sea surface. The particulate matter of these samples was subsequently concentrated by passive settling, centrifugation, or flocculation with ferric chloride, and we discuss the advantages of each approach. After concentration, samples were freeze-dried and ground with an easy-to-adapt procedure using standard lab equipment. Grain size of the finely ground samples ranges from fine to coarse silt (2-63 μm), which guarantees the homogeneity needed for representative subsampling, a widespread problem in sediment trap research. Subsamples of the ground material were perfectly suitable for a variety of biogeochemical measurements, and even at very low particle fluxes we were able to gain detailed insight into the various parameters characterizing the sinking particles. The methods and recommendations described here are a key improvement for sediment trap applications in mesocosms, as they facilitate the processing of large amounts of samples and allow for high-quality biogeochemical flux data.

  12. Sampling plan optimization for detection of lithography and etch CD process excursions

    NASA Astrophysics Data System (ADS)

    Elliott, Richard C.; Nurani, Raman K.; Lee, Sung Jin; Ortiz, Luis G.; Preil, Moshe E.; Shanthikumar, J. G.; Riley, Trina; Goodwin, Greg A.

    2000-06-01

    Effective sample planning requires a careful combination of statistical analysis and lithography engineering. In this paper, we present a complete sample planning methodology including baseline process characterization, determination of the dominant excursion mechanisms, and selection of sampling plans and control procedures to effectively detect the yield-limiting excursions with a minimum of added cost. We discuss the results of our novel method in identifying critical dimension (CD) process excursions and present several examples of poly gate Photo and Etch CD excursion signatures. Using these results in a Sample Planning model, we determine the optimal sample plan and statistical process control (SPC) chart metrics and limits for detecting these excursions. The key observations are that there are many different yield-limiting excursion signatures in photo and etch, and that a given photo excursion signature turns into a different excursion signature at etch with different yield and performance impact. In particular, field-to-field variance excursions are shown to have a significant impact on yield. We show how current sampling plan and monitoring schemes miss these excursions and suggest an improved procedure for effective detection of CD process excursions.
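A field-to-field variance excursion of the kind described above can be caught with a subgroup standard-deviation (S) control chart. The following is a hypothetical sketch, not the paper's model: all CD values, the subgroup size, the excursion magnitude, and the baseline window are invented for illustration.

```python
import random
import statistics

random.seed(7)

# Simulated poly gate CD measurements (nm): 20 lots, 9 fields per lot.
baseline_sigma = 1.5
lots = [[130 + random.gauss(0, baseline_sigma) for _ in range(9)]
        for _ in range(20)]
# Inject a field-to-field variance excursion (4x sigma) into lot 15.
lots[15] = [130 + random.gauss(0, 4 * baseline_sigma) for _ in range(9)]

# S-chart: monitor the within-lot standard deviation against a control limit.
s_values = [statistics.stdev(lot) for lot in lots]
s_bar = statistics.mean(s_values[:10])   # baseline from early, in-control lots
ucl = 1.761 * s_bar                      # B4 constant for subgroups of n = 9

flagged = [i for i, s in enumerate(s_values) if s > ucl]
print("lots flagged for variance excursion:", flagged)
```

A mean-only chart on lot averages would likely miss this lot entirely, since the excursion widens the spread without shifting the center, which is the failure mode the abstract highlights.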

  13. The Effect of Age of Second Language Acquisition on the Representation and Processing of Second Language Words

    ERIC Educational Resources Information Center

    Silverberg, Stu; Samuel, Arthur G.

    2004-01-01

    In this study, the effects of second language (i.e., L2) proficiency and age of second language acquisition are assessed. Three types of bilinguals are compared: Early L2 learners, Late highly proficient L2 learners, and Late less proficient L2 learners. A lexical decision priming paradigm is used in which the critical trials consist of first…

  14. The Symbolic World of the Bilingual Child: Digressions on Language Acquisition, Culture and the Process of Thinking

    ERIC Educational Resources Information Center

    Nowak-Fabrykowski, Krystyna; Shkandrij, Miroslav

    2004-01-01

    In this paper we explore the relationship between language acquisition, and the construction of a symbolic world. According to Bowers (1989) language is a collection of patterns regulating social life. This conception is close to that of Symbolic Interactionists (Charon, 1989) who see society as made up of interacting individuals who are symbol…

  15. The Influence of Type and Token Frequency on the Acquisition of Affixation Patterns: Implications for Language Processing

    ERIC Educational Resources Information Center

    Endress, Ansgar D.; Hauser, Marc D.

    2011-01-01

    Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…

  16. Acquisition of the Linearization Process in Text Composition in Third to Ninth Graders: Effects of Textual Superstructure and Macrostructural Organization

    ERIC Educational Resources Information Center

    Favart, Monik; Coirier, Pierre

    2006-01-01

    Two complementary experiments analyzed the acquisition of text content linearization in writing, in French-speaking participants from third to ninth grades. In both experiments, a scrambled text paradigm was used: eleven ideas presented in random order had to be rearranged coherently so as to compose a text. Linearization was analyzed on the basis…

  17. Random Sampling Process Leads to Overestimation of β-Diversity of Microbial Communities

    PubMed Central

    Zhou, Jizhong; Jiang, Yi-Huei; Deng, Ye; Shi, Zhou; Zhou, Benjamin Yamin; Xue, Kai; Wu, Liyou; He, Zhili; Yang, Yunfeng

    2013-01-01

    The site-to-site variability in species composition, known as β-diversity, is crucial to understanding spatiotemporal patterns of species diversity and the mechanisms controlling community composition and structure. However, quantifying β-diversity in microbial ecology using sequencing-based technologies is a great challenge because of a high number of sequencing errors, bias, and poor reproducibility and quantification. Herein, based on general sampling theory, a mathematical framework is first developed for simulating the effects of random sampling processes on quantifying β-diversity when the community size is known or unknown. Also, using an analogous ball example under Poisson sampling with limited sampling efforts, the developed mathematical framework can exactly predict the low reproducibility among technical replicate samples from the same community of a certain species abundance distribution, which provides explicit evidence of random sampling processes as the main factor causing high percentages of technical variations. In addition, the predicted values under Poisson random sampling were highly consistent with the observed low percentages of operational taxonomic unit (OTU) overlap (<30% and <20% for two and three tags, respectively, based on both Jaccard and Bray-Curtis dissimilarity indexes), further supporting the hypothesis that the poor reproducibility among technical replicates is due to the artifacts associated with random sampling processes. Finally, a mathematical framework was developed for predicting sampling efforts to achieve a desired overlap among replicate samples. Our modeling simulations predict that several orders of magnitude more sequencing efforts are needed to achieve desired high technical reproducibility. These results suggest that great caution needs to be taken in quantifying and interpreting β-diversity for microbial community analysis using next-generation sequencing technologies. PMID:23760464
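The core effect, low overlap between technical replicates drawn from the same community, can be reproduced with a short Poisson-sampling simulation. This is a sketch under assumed parameters: the lognormal abundance distribution, 5,000 OTUs, and 20,000 reads per replicate are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical community: 5000 OTUs with a lognormal abundance distribution.
n_otus = 5000
abund = rng.lognormal(mean=0.0, sigma=2.0, size=n_otus)
rel = abund / abund.sum()

effort = 20000  # sequencing reads per technical replicate

# Two technical replicates: independent Poisson draws with identical
# expected counts, i.e. the same underlying community.
rep1 = rng.poisson(effort * rel)
rep2 = rng.poisson(effort * rel)

d1 = rep1 > 0   # OTUs detected in replicate 1
d2 = rep2 > 0   # OTUs detected in replicate 2
jaccard = (d1 & d2).sum() / (d1 | d2).sum()
print(f"OTUs detected: rep1={d1.sum()}, rep2={d2.sum()}")
print(f"Jaccard overlap between technical replicates: {jaccard:.2f}")
```

Because rare OTUs have expected counts well below one read, each replicate detects a different random subset of them, so the Jaccard overlap stays well below 1 even though the community is identical, which is the paper's point about apparent β-diversity.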

  18. Extended Characterization of Chemical Processes in Hot Cells Using Environmental Swipe Samples

    SciTech Connect

    Olsen, Khris B.; Mitroshkov, Alexandre V.; Thomas, M-L; Lepel, Elwood A.; Brunson, Ronald R.; Ladd-Lively, Jennifer

    2012-09-15

    Environmental sampling is used extensively by the International Atomic Energy Agency (IAEA) for verification of information from State declarations or a facility’s design regarding nuclear activities occurring within the country or a specific facility. Environmental sampling of hot cells within a facility under safeguards is conducted using 10.2 cm x 10.2 cm cotton swipe material or cellulose swipes. Traditional target analytes used by the IAEA to verify operations within a facility include a select list of gamma-emitting radionuclides and total and isotopic U and Pu. Analysis of environmental swipe samples collected within a hot-cell facility where chemical processing occurs may also provide information regarding specific chemicals used in fuel processing. However, using swipe material to elucidate what specific chemical processes were/are being used within a hot cell has not been previously evaluated. Staff from Pacific Northwest National Laboratory (PNNL) and Oak Ridge National Laboratory (ORNL) teamed to evaluate the potential use of environmental swipe samples as collection media for volatile and semivolatile organic compounds. This evaluation was initiated with sample collection during a series of Coupled End-to-End (CETE) reprocessing runs at ORNL. The study included measurement of gamma-emitting radionuclides, total and isotopic U and Pu, and volatile and semivolatile organic compounds. These results allowed us to elucidate the chemical processes used in the hot cells during reprocessing of power reactor fuel and to identify other legacy chemicals used in hot-cell operations that predate the CETE process.

  19. Studies on the Electro-Polishing process with Nb sample plates at KEK

    SciTech Connect

    Saeki, Takayuki; Funahashi, Y.; Hayano, Hitoshi; Kato, Seigo; Nishiwaki, Michiru; Sawabe, Motoaki; Ueno, Kenji; Watanabe, K.; Clemens, William A.; Geng, Rongli; Manus, Robert L.; Tyagi, Puneet

    2009-11-01

    In this article, two subjects are described. The first is the production of stains on the surface of Nb sample plates in the Electro-polishing (EP) process, and the second is the development of defects/pits on the surface of a Nb sample plate during the EP process. Recently, some 9-cell cavities were treated with new EP acid at KEK, and the performance of these cavities was limited by heavy field emission. On the inside surface of these cavities, brown stains were observed. We made an effort to reproduce the brown stains on Nb sample plates with a laboratory EP setup while varying the concentration of niobium in the EP acid. We found that the brown stains appear only when samples are processed with new EP acid. For the second subject, we made artificial pits on the surface of a Nb sample plate and observed the development of the pits after each step of a 30 μm EP process, in which 120 μm was removed in total. This article describes this series of EP tests with Nb sample plates at KEK.

  20. A mixed signal ECG processing platform with an adaptive sampling ADC for portable monitoring applications.

    PubMed

    Kim, Hyejung; Van Hoof, Chris; Yazicioglu, Refet Firat

    2011-01-01

    This paper describes a mixed-signal ECG processing platform with a 12-bit ADC architecture that can adapt its sampling rate according to the input signal's rate of change. This enables the sampling of ECG signals at a significantly reduced data rate without loss of information. The presented adaptive sampling scheme reduces the ADC power consumption, enables the processing of ECG signals with lower power consumption, and reduces the power consumption of the radio while streaming the ECG signals. The test results show that running a CWT-based R peak detection algorithm using the adaptively sampled ECG signals consumes only 45.6 μW, and it leads to 36% less overall system power consumption. PMID:22254775
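The rate-adaptive idea can be illustrated in a few lines of software. This is a hypothetical sketch only: the thresholds, sampling intervals, and synthetic waveform below are invented, and the actual scheme in the paper is implemented in mixed-signal hardware inside the ADC.

```python
import math

def adaptive_sample(signal, t_end, dt_min=0.001, dt_max=0.05, slope_thresh=2.0):
    """Sample signal(t) densely where it changes fast, sparsely where it is flat.

    The next sampling interval shrinks to dt_min whenever the local rate of
    change exceeds slope_thresh (all parameter values are illustrative).
    """
    t, samples = 0.0, []
    prev = signal(0.0)
    dt = dt_max
    while t < t_end:
        v = signal(t)
        slope = abs(v - prev) / dt
        # Fast-changing input (e.g. the QRS complex of an ECG) -> high rate.
        dt = dt_min if slope > slope_thresh else dt_max
        samples.append((t, v))
        prev = v
        t += dt
    return samples

# Synthetic "heartbeat": mostly flat with one sharp pulse per second.
def ecg_like(t):
    return math.exp(-((t % 1.0 - 0.5) ** 2) / 0.001)

samples = adaptive_sample(ecg_like, 2.0)
uniform_count = int(2.0 / 0.001)  # samples needed at a fixed fast rate
print(f"adaptive samples: {len(samples)} vs uniform fast-rate: {uniform_count}")
```

The sample count drops by orders of magnitude relative to uniform fast-rate sampling because the flat baseline between beats is sampled at the slow rate, which is the mechanism behind the ADC and radio power savings the abstract reports.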

  1. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health. PMID:24792566

  2. Microfluidic solutions enabling continuous processing and monitoring of biological samples: A review.

    PubMed

    Karle, Marc; Vashist, Sandeep Kumar; Zengerle, Roland; von Stetten, Felix

    2016-07-27

    The last decade has witnessed tremendous advances in employing microfluidic solutions enabling Continuous Processing and Monitoring of Biological Samples (CPMBS), which is an essential requirement for the control of bio-processes. The microfluidic systems are superior to the traditional inline sensors due to their ability to implement complex analytical procedures, such as multi-step sample preparation, and enabling the online measurement of parameters. This manuscript provides a background review of microfluidic approaches employing laminar flow, hydrodynamic separation, acoustophoresis, electrophoresis, dielectrophoresis, magnetophoresis and segmented flow for the continuous processing and monitoring of biological samples. The principles, advantages and limitations of each microfluidic approach are described along with its potential applications. The challenges in the field and the future directions are also provided. PMID:27251944

  3. Evaluation of the effects of anatomic location, histologic processing, and sample size on shrinkage of skin samples obtained from canine cadavers.

    PubMed

    Reagan, Jennifer K; Selmic, Laura E; Garrett, Laura D; Singh, Kuldeep

    2016-09-01

    OBJECTIVE To evaluate effects of anatomic location, histologic processing, and sample size on shrinkage of excised canine skin samples. SAMPLE Skin samples from 15 canine cadavers. PROCEDURES Elliptical samples of the skin, underlying subcutaneous fat, and muscle fascia were collected from the head, hind limb, and lumbar region of each cadaver. Two samples (10 mm and 30 mm) were collected at each anatomic location of each cadaver (one from the left side and the other from the right side). Measurements of length, width, depth, and surface area were collected prior to excision (P1) and after fixation in neutral-buffered 10% formalin for 24 to 48 hours (P2). Length and width were also measured after histologic processing (P3). RESULTS Length and width decreased significantly at all anatomic locations and for both sample sizes at each processing stage. Hind limb samples had the greatest decrease in length, compared with results for samples obtained from other locations, across all processing stages for both sample sizes. The 30-mm samples had a greater percentage change in length and width between P1 and P2 than did the 10-mm samples. Histologic processing (P2 to P3) had a greater effect on the percentage shrinkage of 10-mm samples. For all locations and both sample sizes, percentage change between P1 and P3 ranged from 24.0% to 37.7% for length and 18.0% to 22.8% for width. CONCLUSIONS AND CLINICAL RELEVANCE Histologic processing, anatomic location, and sample size affected the degree of shrinkage of a canine skin sample from excision to histologic assessment. PMID:27580116
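The percentage-change figures reported above follow from the standard shrinkage formula. A minimal sketch with invented measurements (not the study's raw data):

```python
def pct_change(before, after):
    """Percentage shrinkage from a pre-excision (P1) to a post-processing
    (P3) measurement, expressed relative to the original dimension."""
    return (before - after) / before * 100

# Illustrative numbers only: a nominal 30 mm sample measuring 21 mm
# after fixation and histologic processing.
length_p1, length_p3 = 30.0, 21.0  # mm
print(f"length shrinkage P1 -> P3: {pct_change(length_p1, length_p3):.1f}%")
```

A shrinkage in the 24-38% range, as the study reports for length, means surgical margins measured on the processed slide understate the in-situ margins by roughly a quarter to a third, which is why the anatomic-location and sample-size effects matter clinically.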

  4. Leukopak PBMC Sample Processing for Preparing Quality Control Material to Support Proficiency Testing Programs

    PubMed Central

    Garcia, Ambrosia; Keinonen, Sarah; Sanchez, Ana M.; Ferrari, Guido; Denny, Thomas N.; Moody, M. Anthony

    2014-01-01

    External proficiency testing programs designed to evaluate the performance of end-point laboratories involved in vaccine and therapeutic clinical trials form an important part of clinical trial quality assurance. Good Clinical Laboratory Practice (GCLP) guidelines recommend both assay validation and proficiency testing for assays being used in clinical trials, and such testing is facilitated by the availability of large numbers of well-characterized test samples. These samples can be distributed to laboratories participating in these programs and allow monitoring of laboratory performance over time and among participating sites when results are obtained with samples derived from a large master set. The leukapheresis procedure provides an ideal way to collect samples from participants that can meet the required number of cells to support these activities. The collection and processing of leukapheresis samples requires tight coordination between the clinical and laboratory teams to collect, process, and cryopreserve large numbers of samples within the established ideal time of ≤8 hours. Here, we describe our experience with a leukapheresis cryopreservation program that has been able to preserve the functionality of cellular subsets and that provides the sample numbers necessary to run an external proficiency testing program. PMID:24928650

  5. Impact of Processing Method on Recovery of Bacteria from Wipes Used in Biological Surface Sampling

    PubMed Central

    Olson, Nathan D.; Filliben, James J.; Morrow, Jayne B.

    2012-01-01

    Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense. PMID:22706055

  6. Faulting processes in active faults - Evidences from TCDP and SAFOD drill core samples

    SciTech Connect

    Janssen, C.; Wirth, R.; Wenk, H. -R.; Morales, L.; Naumann, R.; Kienast, M.; Song, S. -R.; Dresen, G.

    2014-08-20

    The microstructures, mineralogy and chemistry of representative samples collected from the cores of the San Andreas Fault drill hole (SAFOD) and the Taiwan Chelungpu-Fault Drilling project (TCDP) have been studied using optical microscopy, TEM, SEM, XRD and XRF analyses. SAFOD samples provide a transect across undeformed host rock, the fault damage zone and currently active deforming zones of the San Andreas Fault. TCDP samples are retrieved from the principal slip zone (PSZ) and from the surrounding damage zone of the Chelungpu Fault. Substantial differences exist in the clay mineralogy of SAFOD and TCDP fault gouge samples. Amorphous material has been observed in SAFOD as well as TCDP samples. In line with previous publications, we propose that melt, observed in TCDP black gouge samples, was produced by seismic slip (melt origin), whereas amorphous material in SAFOD samples was formed by comminution of grains (crush origin) rather than by melting. Dauphiné twins in quartz grains of SAFOD and TCDP samples may indicate high seismic stress. The differences in the crystallographic preferred orientation of calcite between SAFOD and TCDP samples are significant. Microstructures resulting from dissolution–precipitation processes were observed in both faults but are more frequently found in SAFOD samples than in TCDP fault rocks. As already described for many other fault zones, clay-gouge fabrics are quite weak in SAFOD and TCDP samples. Clay-clast aggregates (CCAs), proposed to indicate frictional heating and thermal pressurization, occur in material taken from the PSZ of the Chelungpu Fault, as well as within and outside of the SAFOD deforming zones, indicating that these microstructures were formed over a wide range of slip rates.

  7. Aqueous Processing of Atmospheric Organic Particles in Cloud Water Collected via Aircraft Sampling.

    PubMed

    Boone, Eric J; Laskin, Alexander; Laskin, Julia; Wirth, Christopher; Shepson, Paul B; Stirm, Brian H; Pratt, Kerri A

    2015-07-21

    Cloudwater and below-cloud atmospheric particle samples were collected onboard a research aircraft during the Southern Oxidant and Aerosol Study (SOAS) over a forested region of Alabama in June 2013. The organic molecular composition of the samples was studied to gain insights into the aqueous-phase processing of organic compounds within cloud droplets. High resolution mass spectrometry (HRMS) with nanospray desorption electrospray ionization (nano-DESI) and direct infusion electrospray ionization (ESI) were utilized to compare the organic composition of the particle and cloudwater samples, respectively. Isoprene and monoterpene-derived organosulfates and oligomers were identified in both the particles and cloudwater, showing the significant influence of biogenic volatile organic compound oxidation above the forested region. While the average O:C ratios of the organic compounds were similar between the atmospheric particle and cloudwater samples, the chemical composition of these samples was quite different. Specifically, hydrolysis of organosulfates and formation of nitrogen-containing compounds were observed for the cloudwater when compared to the atmospheric particle samples, demonstrating that cloud processing changes the composition of organic aerosol. PMID:26068538

  8. Aqueous Processing of Atmospheric Organic Particles in Cloud Water Collected via Aircraft Sampling

    SciTech Connect

    Boone, Eric J.; Laskin, Alexander; Laskin, Julia; Wirth, Christopher; Shepson, Paul B.; Stirm, Brian H.; Pratt, Kerri A.

    2015-07-21

    Cloud water and below-cloud atmospheric particle samples were collected onboard a research aircraft during the Southern Oxidant and Aerosol Study (SOAS) over a forested region of Alabama in June 2013. The organic molecular composition of the samples was studied to gain insights into the aqueous-phase processing of organic compounds within cloud droplets. High resolution mass spectrometry with nanospray desorption electrospray ionization and direct infusion electrospray ionization were utilized to compare the organic composition of the particle and cloud water samples, respectively. Isoprene and monoterpene-derived organosulfates and oligomers were identified in both the particles and cloud water, showing the significant influence of biogenic volatile organic compound oxidation above the forested region. While the average O:C ratios of the organic compounds were similar between the atmospheric particle and cloud water samples, the chemical composition of these samples was quite different. Specifically, hydrolysis of organosulfates and formation of nitrogen-containing compounds were observed for the cloud water when compared to the atmospheric particle samples, demonstrating that cloud processing changes the composition of organic aerosol.

  9. Graphics Processing Unit (GPU) implementation of image processing algorithms to improve system performance of the Control, Acquisition, Processing, and Image Display System (CAPIDS) of the Micro-Angiographic Fluoroscope (MAF).

    PubMed

    Vasan, S N Swetadri; Ionita, Ciprian N; Titus, A H; Cartwright, A N; Bednarek, D R; Rudin, S

    2012-02-23

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same and the operation on one does not depend upon the result from the operation on the other, allowing the entire image to be processed in parallel. GPU hardware was developed for this kind of massive parallel processing implementation. Thus for an algorithm which has a high amount of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat field correction, temporal filtering, image subtraction, roadmap mask generation and display window and leveling. A comparison between the previous and the upgraded version of CAPIDS has been presented, to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements (with respect to timing or frame rate) have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure and automatic image windowing and leveling during each frame. PMID:24027619
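The pixel-independent operations listed above map naturally onto data-parallel hardware. As a rough stand-in for the GPU kernels (this is not the CAPIDS code; the array sizes, gain maps, and window/level constants below are invented for illustration), the same pipeline can be expressed with vectorized NumPy operations, where each expression applies one operation to every pixel independently:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 64x64 detector frames standing in for MAF images.
frame = rng.uniform(100, 200, size=(64, 64))
dark = np.full((64, 64), 10.0)           # dark-field (offset) image
flat = rng.uniform(0.8, 1.2, (64, 64))   # gain map from a flat exposure

# 1. Flat-field correction: each pixel uses only its own gain and offset.
corrected = (frame - dark) / flat

# 2. Temporal filtering: recursive low-pass, again purely per-pixel.
alpha = 0.25
prev_avg = np.zeros_like(corrected)      # running average from prior frames
filtered = alpha * corrected + (1 - alpha) * prev_avg

# 3. Subtraction (e.g. DSA-style mask subtraction).
mask = rng.uniform(100, 200, size=(64, 64))
subtracted = filtered - mask

# 4. Window/level: linear rescale to the display range, then clip.
window, level = 300.0, -100.0            # chosen to span this toy data
display = np.clip((subtracted - (level - window / 2)) / window * 255, 0, 255)

print(display.shape, float(display.min()), float(display.max()))
```

Because no pixel's result depends on any other pixel's, each step is one embarrassingly parallel map over the image, which is exactly why a GPU implementation sustains the 30 fps fluoroscopy rate the abstract reports.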

  10. Graphics processing unit (GPU) implementation of image processing algorithms to improve system performance of the control acquisition, processing, and image display system (CAPIDS) of the micro-angiographic fluoroscope (MAF)

    NASA Astrophysics Data System (ADS)

    Swetadri Vasan, S. N.; Ionita, Ciprian N.; Titus, A. H.; Cartwright, A. N.; Bednarek, D. R.; Rudin, S.

    2012-03-01

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same and the operation on one does not depend upon the result from the operation on the other, allowing the entire image to be processed in parallel. GPU hardware was developed for this kind of massive parallel processing implementation. Thus for an algorithm which has a high amount of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat field correction, temporal filtering, image subtraction, roadmap mask generation and display window and leveling. A comparison between the previous and the upgraded version of CAPIDS has been presented, to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements (with respect to timing or frame rate) have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure and automatic image windowing and leveling during each frame.

  11. Graphics Processing Unit (GPU) implementation of image processing algorithms to improve system performance of the Control, Acquisition, Processing, and Image Display System (CAPIDS) of the Micro-Angiographic Fluoroscope (MAF)

    PubMed Central

    Vasan, S.N. Swetadri; Ionita, Ciprian N.; Titus, A.H.; Cartwright, A.N.; Bednarek, D.R.; Rudin, S.

    2012-01-01

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same and the operation on one does not depend upon the result from the operation on the other, allowing the entire image to be processed in parallel. GPU hardware was developed for this kind of massive parallel processing implementation. Thus for an algorithm which has a high amount of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat field correction, temporal filtering, image subtraction, roadmap mask generation and display window and leveling. A comparison between the previous and the upgraded version of CAPIDS has been presented, to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements (with respect to timing or frame rate) have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure and automatic image windowing and leveling during each frame. PMID:24027619

  12. Relation of Childhood Worry to Information-Processing Factors in an Ethnically Diverse Community Sample

    ERIC Educational Resources Information Center

    Suarez-Morales, Lourdes; Bell, Debora

    2006-01-01

    This study examined information-processing variables in relation to worry in a sample of 292 fifth-grade children from Caucasian, African American, and Hispanic backgrounds. Results revealed that worry was related to threat interpretations for hypothetical situations and, when stress level was not controlled, to higher estimates of future…

  13. The Input and Process Batteries for MISOE [Management Information System for Occupational Education] Sample Data Systems.

    ERIC Educational Resources Information Center

    Weinberger, Elizabeth

    The document contains optical scannable forms for some of the instruments in the Input and Process Batteries, and guidelines for administration of the instruments in the Input Batteries of the Management Information System for Occupational Education (MISOE) Sample Data Systems. Input information describes the characteristics of the students at…

  14. A real-time imaging system for rapid processing of radioactive DNA samples

    NASA Astrophysics Data System (ADS)

    McGann, W. J.; McConchie, L.; Entine, G.

    1990-12-01

    A new, high-resolution nuclear-imaging detector system is described which substantially improves the speed of detection of radioactively labeled DNA samples. Ultimately this system will be made compatible with a fully automated DNA processing system to aid in the isolation and harvesting of DNA clones in the human genome.

  15. A laser microdissection-based workflow for FFPE tissue microproteomics: Important considerations for small sample processing.

    PubMed

    Longuespée, Rémi; Alberts, Deborah; Pottier, Charles; Smargiasso, Nicolas; Mazzucchelli, Gabriel; Baiwir, Dominique; Kriegsmann, Mark; Herfs, Michael; Kriegsmann, Jörg; Delvenne, Philippe; De Pauw, Edwin

    2016-07-15

    Proteomic methods are today widely applied to formalin-fixed paraffin-embedded (FFPE) tissue samples for several applications in research, especially in molecular pathology. To date, there is an unmet need for the analysis of small tissue samples, such as for early cancerous lesions. Indeed, no method has yet been proposed for the reproducible processing of small FFPE tissue samples to allow biomarker discovery. In this work, we tested several procedures to process laser-microdissected tissue pieces bearing less than 3000 cells. Combined with appropriate settings for liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis, a citric acid antigen retrieval (CAAR)-based procedure was established, allowing the identification of more than 1400 proteins from a single microdissected breast cancer tissue biopsy. This work demonstrates important considerations concerning the handling and processing of laser-microdissected tissue samples of extremely limited size, in the process opening new perspectives in molecular pathology. A proof of concept for the proposed method in biomarker discovery, with respect to these specific handling considerations, is illustrated using the differential proteomic analysis of invasive breast carcinoma of no special type and invasive lobular triple-negative breast cancer tissues. This work will be of utmost importance for early biomarker discovery or in support of matrix-assisted laser desorption/ionization (MALDI) imaging for microproteomics from small regions of interest. PMID:26690073

  16. Sampling and Hydrogeology of the Vadose Zone Beneath the 300 Area Process Ponds

    SciTech Connect

    Bjornstad, Bruce N.

    2004-08-31

    Four open pits were dug with a backhoe into the vadose zone beneath the former 300 Area Process Ponds in April 2003. Samples were collected about every 2 feet for physical, chemical, and/or microbiological characterization. This report presents a stratigraphic and geohydrologic summary of the four excavations.

  17. Rapid Sample Processing For LC-MS Based Quantitative Proteomics Using High Intensity Focused Ultrasounds

    SciTech Connect

    Lopez-Ferrer, Daniel; Heibeck, Tyler H.; Petritis, Konstantinos; Hixson, Kim K.; Qian, Weijun; Monroe, Matthew E.; Mayampurath, Anoop M.; Moore, Ronald J.; Belov, Mikhail E.; Camp, David G.; Smith, Richard D.

    2008-09-01

    A new sample processing workflow that uses high intensity focused ultrasound to rapidly reduce and alkylate cysteines, digest proteins, and then label peptides with 18O was developed for quantitative proteomics applications. Each step was individually refined to minimize reaction times, peptide losses, and undesired by-products or modifications. Using this novel workflow, mouse plasma proteins were successfully denatured, alkylated, in-solution digested, and 18O labelled in < 10 min for subsequent analysis by liquid chromatography-electrospray ionization high resolution mass spectrometry. Performance was evaluated in terms of the number of mouse plasma peptides and proteins identified in a shotgun approach and the quantitative dynamic range. The results were compared with previously published results obtained using conventional sample preparation methods and were found to be similar. Advantages of the new method include greatly simplified and accelerated sample processing, as well as being readily amenable to automation.

  18. Rapid Sample Processing For LC-MS Based Quantitative Proteomics Using High Intensity Focused Ultrasounds

    PubMed Central

    López-Ferrer, Daniel; Heibeck, Tyler H.; Petritis, Konstantinos; Hixson, Kim K.; Qian, Weijun; Monroe, Matthew E.; Mayampurath, Anoop; Moore, Ronald J.; Belov, Mikhail E.; Camp, David G.; Smith, Richard D.

    2009-01-01

    A new sample processing workflow that uses high intensity focused ultrasound to rapidly reduce and alkylate cysteines, digest proteins, and then label peptides with 18O was developed for quantitative proteomics applications. Each step was individually refined to minimize reaction times, peptide losses, and undesired by-products or modifications. Using this novel workflow, mouse plasma proteins were successfully denatured, alkylated, in-solution digested, and 18O labelled in < 10 min for subsequent analysis by liquid chromatography-electrospray ionization high resolution mass spectrometry. Performance was evaluated in terms of the number of mouse plasma peptides and proteins identified in a shotgun approach and the quantitative dynamic range. The results were compared with previously published results obtained using conventional sample preparation methods and were found to be similar. Advantages of the new method include greatly simplified and accelerated sample processing, as well as being readily amenable to automation. PMID:18686986

  19. Solar-thermal complex sample processing for nucleic acid based diagnostics in limited resource settings

    PubMed Central

    Gumus, Abdurrahman; Ahsan, Syed; Dogan, Belgin; Jiang, Li; Snodgrass, Ryan; Gardner, Andrea; Lu, Zhengda; Simpson, Kenneth; Erickson, David

    2016-01-01

    The use of point-of-care (POC) devices in limited resource settings, where access to commonly used infrastructure such as water and electricity can be restricted, represents simultaneously one of the best application fits for POC systems and one of the most challenging places to deploy them. Among the many challenges involved in these systems, the preparation and processing of complex samples like stool, vomit, and biopsies are particularly difficult due to the high number and varied nature of mechanical and chemical interferents present in the sample. Previously we demonstrated the ability to use solar-thermal energy to perform PCR-based nucleic acid amplifications. In this work we demonstrate how the technique, using similar infrastructure, can also be used in a solar-thermal sample processing system for extracting and isolating Vibrio cholerae nucleic acids from fecal samples. The use of opto-thermal energy enables sunlight to drive thermal lysing reactions in large volumes without the need for external electrical power. Using the system we demonstrate the ability to reach a 95°C threshold in less than 5 minutes and to hold the sample temperature stable to within ±2°C following the ramp-up. The system is demonstrated to provide linear results between 10⁴ and 10⁸ CFU/mL when the released nucleic acids are quantified by traditional means. Additionally, we couple the sample processing unit with our previously demonstrated solar-thermal PCR and tablet-based detection system to demonstrate very low power sample-in-answer-out detection. PMID:27231636

  20. Sampling frequency affects the processing of Actigraph raw acceleration data to activity counts.

    PubMed

    Brønd, Jan Christian; Arvidsson, Daniel

    2016-02-01

    ActiGraph acceleration data are processed through several steps (including band-pass filtering to attenuate unwanted signal frequencies) to generate the activity counts commonly used in physical activity research. We performed three experiments to investigate the effect of sampling frequency on the generation of activity counts. Ideal acceleration signals were produced in the MATLAB software. Thereafter, ActiGraph GT3X+ monitors were spun in a mechanical setup. Finally, 20 subjects performed walking and running wearing GT3X+ monitors. Acceleration data from all experiments were collected with different sampling frequencies, and activity counts were generated with the ActiLife software. With the default 30-Hz (or 60-Hz, 90-Hz) sampling frequency, the generation of activity counts was performed as intended with 50% attenuation of acceleration signals with a frequency of 2.5 Hz by the signal frequency band-pass filter. Frequencies above 5 Hz were eliminated totally. However, with other sampling frequencies, acceleration signals above 5 Hz escaped the band-pass filter to a varied degree and contributed to additional activity counts. Similar results were found for the spinning of the GT3X+ monitors, although the amount of activity counts generated was less, indicating that raw data stored in the GT3X+ monitor is processed. Between 600 and 1,600 more counts per minute were generated with the sampling frequencies 40 and 100 Hz compared with 30 Hz during running. Sampling frequency affects the processing of ActiGraph acceleration data to activity counts. Researchers need to be aware of this error when selecting sampling frequencies other than the default 30 Hz. PMID:26635347
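
    The band-pass behavior described above can be sketched with a crude FFT-mask filter. This only illustrates the intended passband (roughly 0.25-2.5 Hz, nothing above 5 Hz) at the default 30 Hz rate; ActiLife's actual filter design is proprietary, so the cutoffs and hard FFT mask below are stand-in assumptions:

```python
import numpy as np

# Crude stand-in for the count-generation band-pass: pass ~0.25-5 Hz,
# remove everything above 5 Hz, at the default 30 Hz sampling frequency.
fs = 30.0
t = np.arange(0, 10, 1 / fs)
# 2 Hz "movement" component plus an 8 Hz "artifact" component.
x = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 8 * t)

spec = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), 1 / fs)
mask = (freqs >= 0.25) & (freqs <= 5.0)   # hard cutoff above 5 Hz
filtered = np.fft.irfft(spec * mask, n=len(x))

# Amplitude of a single frequency component, from the nearest FFT bin.
amp = lambda sig, f: 2 * np.abs(np.fft.rfft(sig))[np.argmin(np.abs(freqs - f))] / len(sig)
print(f"2 Hz: {amp(x, 2):.2f} -> {amp(filtered, 2):.2f}")
print(f"8 Hz: {amp(x, 8):.2f} -> {amp(filtered, 8):.2f}")
```

    With a matching sampling rate the 2 Hz component passes unchanged while the 8 Hz component is eliminated; the paper's finding is that this elimination breaks down when the sampling frequency deviates from the rates the filter was designed for.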

  1. Solar-thermal complex sample processing for nucleic acid based diagnostics in limited resource settings.

    PubMed

    Gumus, Abdurrahman; Ahsan, Syed; Dogan, Belgin; Jiang, Li; Snodgrass, Ryan; Gardner, Andrea; Lu, Zhengda; Simpson, Kenneth; Erickson, David

    2016-05-01

    The use of point-of-care (POC) devices in limited resource settings, where access to commonly used infrastructure such as water and electricity can be restricted, represents simultaneously one of the best application fits for POC systems and one of the most challenging places to deploy them. Among the many challenges involved in these systems, the preparation and processing of complex samples like stool, vomit, and biopsies are particularly difficult due to the high number and varied nature of mechanical and chemical interferents present in the sample. Previously we demonstrated the ability to use solar-thermal energy to perform PCR-based nucleic acid amplifications. In this work we demonstrate how the technique, using similar infrastructure, can also be used in a solar-thermal sample processing system for extracting and isolating Vibrio cholerae nucleic acids from fecal samples. The use of opto-thermal energy enables sunlight to drive thermal lysing reactions in large volumes without the need for external electrical power. Using the system we demonstrate the ability to reach a 95°C threshold in less than 5 minutes and to hold the sample temperature stable to within ±2°C following the ramp-up. The system is demonstrated to provide linear results between 10⁴ and 10⁸ CFU/mL when the released nucleic acids are quantified by traditional means. Additionally, we couple the sample processing unit with our previously demonstrated solar-thermal PCR and tablet-based detection system to demonstrate very low power sample-in-answer-out detection. PMID:27231636
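
    The reported linear range (10⁴ to 10⁸ CFU/mL) implies a log-linear calibration of the kind commonly used for nucleic acid quantification. A minimal sketch, with made-up calibration values (the Ct numbers below are illustrative, not data from the paper):

```python
import numpy as np

# Hypothetical calibration data: known loads (CFU/mL) and a measured
# signal from the released nucleic acids (e.g. qPCR Ct values; these
# numbers are illustrative, not from the paper).
cfu = np.array([1e4, 1e5, 1e6, 1e7, 1e8])
ct = np.array([33.1, 29.8, 26.4, 23.2, 19.9])  # ~ -3.3 Ct per decade

# Linear fit of Ct against log10(CFU/mL) over the reported linear range.
slope, intercept = np.polyfit(np.log10(cfu), ct, 1)

# Quantify an unknown sample from its measured Ct.
ct_unknown = 27.0
log_cfu = (ct_unknown - intercept) / slope
print(f"slope={slope:.2f} Ct/decade, estimated load=10^{log_cfu:.2f} CFU/mL")
```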

  2. Proposal for field sampling of plants and processing in the lab for environmental metabolic fingerprinting

    PubMed Central

    2010-01-01

    Background Samples for plant metabolic fingerprinting are generally prepared by quenching metabolism, grinding the plant material and extracting metabolites in solvents. Further concentration and derivatisation steps follow, depending on the nature of the sample and the available analytical platform. Several methods, such as collection in liquid nitrogen, are not applicable to plant material sampled in the field. Therefore, a protocol was established for sample pre-treatment, grinding, extraction and storage that can be used for field-collected plant material that is further processed in the laboratory. Ribwort plantain (Plantago lanceolata L., Plantaginaceae) was used as the model plant. The quality criteria for method suitability were high reproducibility, extraction efficiency and handling comfort of each subsequent processing step. Results The highest reproducibility was achieved by sampling fresh plant material in a solvent mixture of methanol:dichloromethane (2:1), crushing the tissue with a hand-held disperser and storing the material until further processing. In the laboratory the material was extracted three times at different pH. The resulting extracts were partitioned with water (2:1:1 methanol:dichloromethane:water) and the aqueous phases were used for analysis by LC-MS, because the focus was on polar metabolites. Chromatograms were compared by calculating a similarity value Ξ. Advantages and disadvantages of different sample pre-treatment methods, the use of solvents and solvent mixtures, the influence of pH, extraction frequency and duration, and storage temperature are discussed with regard to the quality criteria. Conclusions The proposed extraction protocol leads to highly reproducible metabolic fingerprints and allows optimal handling of field-collected plant material and its further processing in the laboratory, which is demonstrated for an exemplary field data set. Calculation of Ξ values is a useful tool to judge similarities between chromatograms.

  3. Note: A simple image processing based fiducial auto-alignment method for sample registration.

    PubMed

    Robertson, Wesley D; Porto, Lucas R; Ip, Candice J X; Nantel, Megan K T; Tellkamp, Friedjof; Lu, Yinfei; Miller, R J Dwayne

    2015-08-01

    A simple method for the location and auto-alignment of sample fiducials for sample registration using widely available MATLAB/LabVIEW software is demonstrated. The method is robust, easily implemented, and applicable to a wide variety of experiment types for improved reproducibility and increased setup speed. The software uses image processing to locate and measure the diameter and center point of circular fiducials for distance self-calibration and iterative alignment and can be used with most imaging systems. The method is demonstrated to be fast and reliable in locating and aligning sample fiducials, provided here by a nanofabricated array, with accuracy within the optical resolution of the imaging system. The software was further demonstrated to register, load, and sample the dynamically wetted array. PMID:26329245
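
    The fiducial-location step described above can be sketched with plain NumPy: threshold the frame, take the centroid of the bright blob, and derive an equivalent diameter from the blob area. The synthetic frame, noise level and threshold below are assumptions; the published system uses MATLAB/LabVIEW image processing:

```python
import numpy as np

# Synthetic camera frame: a bright circular fiducial on a dark background.
# (Stand-in for a real image; the center/radius values are arbitrary.)
h, w, cx, cy, r = 200, 200, 120.0, 80.0, 15.0
yy, xx = np.mgrid[0:h, 0:w]
frame = ((xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2).astype(float)
frame += 0.02 * np.random.default_rng(0).random((h, w))  # sensor noise

# Threshold, then locate the fiducial by the centroid of the blob.
mask = frame > 0.5
found_cx = xx[mask].mean()
found_cy = yy[mask].mean()
# Equivalent diameter from the blob area: A = pi * (d/2)^2.
diameter = 2.0 * np.sqrt(mask.sum() / np.pi)
print(f"center=({found_cx:.1f}, {found_cy:.1f}), diameter={diameter:.1f} px")
```

    The measured diameter against the known fiducial size gives the pixels-per-micron self-calibration, and the centroid offset from the frame center gives the correction for iterative alignment.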

  4. Pesticide-sampling equipment, sample-collection and processing procedures, and water-quality data at Chicod Creek, North Carolina, 1992

    USGS Publications Warehouse

    Manning, T.K.; Smith, K.E.; Wood, C.D.; Williams, J.B.

    1994-01-01

    Water-quality samples were collected from Chicod Creek in the Coastal Plain Province of North Carolina during the summer of 1992 as part of the U.S. Geological Survey's National Water-Quality Assessment Program. Chicod Creek is in the Albemarle-Pamlico drainage area, one of four study units designated to test equipment and procedures for collecting and processing samples for the solid-phase extraction of selected pesticides. The equipment and procedures were used to isolate 47 pesticides, including organonitrogen, carbamate, organochlorine, organophosphate, and other compounds, targeted to be analyzed by gas chromatography/mass spectrometry. Sample-collection and processing equipment; equipment-cleaning and setup procedures; methods for collecting, splitting, and performing solid-phase extraction of samples; and the water-quality data resulting from the field test are presented in this report. Most problems encountered during this intensive sampling exercise were operational difficulties related to the equipment used to process samples.

  5. Statistical Review of Data from DWPF's Process Samples for Batches 19 Through 30

    SciTech Connect

    Edwards, T.B.

    1999-04-06

    The measurements derived from samples taken during the processing of batches 19 through 30 at the Defense Waste Processing Facility (DWPF) afford an opportunity for review and comparison. This report examines some of the statistics from these data. Only the data reported by the DWPF lab (that is, the data provided by the lab as representative of the samples taken) are available for this analysis. In some cases, the sample results reported may be a subset of the sample results generated by the analytical procedures. A thorough assessment of the DWPF lab's analytical procedures would require the complete set of data. Thus, the statistics reported here, specifically as they relate to analytical uncertainties, are limited to the reported data for these samples. A feel for the consistency of the incoming slurry is provided by estimating the components of variation for the Sludge Receipt and Adjustment Tank (SRAT) receipts. In general, for all of the vessels, the data from batches after 21 show smaller batch-to-batch variation than the data from all the batches. The relative contributions of batch-to-batch versus residual variation, where the latter includes analytical uncertainty, are presented in these analyses.
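
    The batch-to-batch versus residual decomposition mentioned above corresponds to a one-way random-effects variance-component estimate. A minimal method-of-moments sketch with hypothetical measurements (illustrative values, not DWPF data):

```python
import numpy as np

# Hypothetical SRAT-receipt measurements: b batches, n replicate
# analyses each (values are illustrative, not DWPF data).
data = np.array([
    [10.1, 10.3, 10.2],   # batch 1
    [11.0, 10.8, 11.1],   # batch 2
    [10.5, 10.6, 10.4],   # batch 3
    [10.9, 11.0, 10.8],   # batch 4
])
b, n = data.shape

# One-way ANOVA mean squares.
batch_means = data.mean(axis=1)
ms_within = ((data - batch_means[:, None]) ** 2).sum() / (b * (n - 1))
ms_between = n * ((batch_means - data.mean()) ** 2).sum() / (b - 1)

# Method-of-moments variance components: residual (which includes
# analytical uncertainty) and batch-to-batch.
var_residual = ms_within
var_batch = max((ms_between - ms_within) / n, 0.0)
total = var_batch + var_residual
print(f"batch-to-batch: {100 * var_batch / total:.0f}% of total variance")
```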

  6. Technical note: Sampling and processing of mesocosm sediment trap material for quantitative biogeochemical analysis

    NASA Astrophysics Data System (ADS)

    Boxhammer, Tim; Bach, Lennart T.; Czerny, Jan; Riebesell, Ulf

    2016-05-01

    Sediment traps are the most common tool to investigate vertical particle flux in the marine realm. However, the spatial and temporal decoupling between particle formation in the surface ocean and particle collection in sediment traps at depth often handicaps reconciliation of production and sedimentation even within the euphotic zone. Pelagic mesocosms are restricted to the surface ocean, but have the advantage of being closed systems and are therefore ideally suited to studying how processes in natural plankton communities influence particle formation and settling in the ocean's surface. We therefore developed a protocol for efficient sample recovery and processing of quantitatively collected pelagic mesocosm sediment trap samples for biogeochemical analysis. Sedimented material was recovered by pumping it under gentle vacuum through a silicon tube to the sea surface. The particulate matter of these samples was subsequently separated from bulk seawater by passive settling, centrifugation or flocculation with ferric chloride, and we discuss the advantages and efficiencies of each approach. After concentration, samples were freeze-dried and ground with an easy-to-adapt procedure using standard lab equipment. Grain size of the finely ground samples ranged from fine to coarse silt (2-63 µm), which guarantees homogeneity for representative subsampling, addressing a widespread problem in sediment trap research. Subsamples of the ground material were perfectly suitable for a variety of biogeochemical measurements, and even at very low particle fluxes we were able to get a detailed insight into various parameters characterizing the sinking particles. The methods and recommendations described here are a key improvement for sediment trap applications in mesocosms, as they facilitate the processing of large amounts of samples and allow for high-quality biogeochemical flux data.

  7. Sequential sampling model for multiattribute choice alternatives with random attention time and processing order

    PubMed Central

    Diederich, Adele; Oswald, Peter

    2014-01-01

    A sequential sampling model for multiattribute binary choice options, called the multiattribute attention switching (MAAS) model, assumes a separate sampling process for each attribute. During the deliberation process, attention switches from one attribute to the next. The order in which attributes are considered, as well as how long each attribute is considered (the attention time), influences the predicted choice probabilities and choice response times. Several probability distributions for the attention time with different variances are investigated. Depending on the time and order schedule, the model predicts a rich choice probability/choice response time pattern, including preference reversals and fast errors. Furthermore, the difference between finite and infinite decision horizons for the attribute considered last is investigated. For the former case the model predicts a probability p0 > 0 of not deciding within the available time. The underlying stochastic process for each attribute is an Ornstein-Uhlenbeck process approximated by a discrete birth-death process. All predictions also hold for the widely applied Wiener process. PMID:25249963
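
    The finite-horizon behavior, including the predicted probability p0 > 0 of not deciding in time, can be sketched by simulating one attribute's Ornstein-Uhlenbeck accumulation between two absorbing thresholds. The parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

# Monte Carlo sketch of one attribute's accumulation stage: an
# Ornstein-Uhlenbeck process dX = (mu - gamma*X)dt + sigma dW with two
# absorbing thresholds and a finite decision horizon.
rng = np.random.default_rng(1)
mu, gamma, sigma = 0.8, 0.5, 1.0     # drift, decay, diffusion
theta, dt, horizon = 1.5, 0.01, 3.0  # thresholds +/-theta, step, horizon
trials = 20_000
steps = int(horizon / dt)

x = np.zeros(trials)
choice = np.zeros(trials)            # +1 upper, -1 lower, 0 undecided
for _ in range(steps):
    active = choice == 0
    if not active.any():
        break
    dW = rng.standard_normal(active.sum())
    x[active] += (mu - gamma * x[active]) * dt + sigma * np.sqrt(dt) * dW
    choice[active & (x >= theta)] = 1
    choice[active & (x <= -theta)] = -1

p_upper = (choice == 1).mean()
p_lower = (choice == -1).mean()
p0 = (choice == 0).mean()            # finite horizon: no decision in time
print(f"P(upper)={p_upper:.3f}  P(lower)={p_lower:.3f}  p0={p0:.3f}")
```

    With a positive drift the upper boundary dominates, and the undecided fraction p0 shrinks (but stays positive) as the horizon grows, which is the qualitative distinction between the finite and infinite decision horizons.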

  8. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    PubMed

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol. PMID:18471209
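
    The wash concentration quoted above follows from a simple dilution calculation (a 1/5 dilution of the 10.8% commercial stock):

```python
# Dilution check for the wash solution described above: a 1/5 dilution
# of 10.8% commercial bleach gives roughly the 2% hypochlorite target.
stock = 10.8          # % sodium hypochlorite in commercial bleach
dilution_factor = 5
working = stock / dilution_factor
print(f"working solution: {working:.2f}% sodium hypochlorite")
```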

  9. Effects of pre-analytical processes on blood samples used in metabolomics studies.

    PubMed

    Yin, Peiyuan; Lehmann, Rainer; Xu, Guowang

    2015-07-01

    Every day, analytical and bio-analytical chemists make sustained efforts to improve the sensitivity, specificity, robustness, and reproducibility of their methods. Especially in targeted and non-targeted profiling approaches, including metabolomics analysis, these objectives are not easy to achieve; however, robust and reproducible measurements and low coefficients of variation (CV) are crucial for successful metabolomics approaches. Nevertheless, all efforts from the analysts are in vain if the sample quality is poor, i.e. if preanalytical errors are made by the partner during sample collection. Preanalytical risks and errors are more common than expected, even when standard operating procedures (SOP) are used. This risk is particularly high in clinical studies, and poor sample quality may heavily bias the CV of the final analytical results, leading to disappointing outcomes of the study and consequently, although unjustified, to critical questions about the analytical performance of the approach from the partner who provided the samples. This review focuses on the preanalytical phase of liquid chromatography-mass spectrometry-driven metabolomics analysis of body fluids. Several important preanalytical factors that may seriously affect the profile of the investigated metabolome in body fluids, including factors before sample collection, blood drawing, subsequent handling of the whole blood (transportation), processing of plasma and serum, and inadequate conditions for sample storage, will be discussed. In addition, a detailed description of latent effects on the stability of the blood metabolome and a suggestion for a practical procedure to circumvent risks in the preanalytical phase will be given. PMID:25736245

  10. Rotorcraft flight dynamics and control in wind for autonomous sampling of spatiotemporal processes

    NASA Astrophysics Data System (ADS)

    Sydney, Nitin

    In recent years, significant effort has been put into the design and use of small, autonomous, multi-agent aerial teams for a variety of military and commercial applications. In particular, small multi-rotor systems have been shown to be especially useful for carrying sensors, as they can rapidly transit between locations as well as hover in place. This dissertation uses multi-agent teams of autonomous rotorcraft to sample spatiotemporal fields in windy conditions. A central problem for many sampling objectives is how to accomplish them in the presence of strong wind fields caused by external conditions or by other rotorcraft flying in close proximity. This dissertation develops several flight control strategies for both wind compensation, using nonlinear control techniques, and wind avoidance, using artificial potential-based control. To showcase the utility of teams of unmanned rotorcraft for spatiotemporal sampling, optimal algorithms are developed for two sampling objectives: (1) sampling continuous spatiotemporal fields modeled as Gaussian processes, and (2) optimal motion planning for coordinated target detection, which is an example of a discrete spatiotemporal field. All algorithms are tested in simulation and several are tested in a motion capture based experimental testbed.
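
    Objective (1), sampling a field modeled as a Gaussian process, is often approached by sending a vehicle to wherever the GP posterior variance is largest. The sketch below uses that generic uncertainty-sampling rule with an assumed RBF kernel on a 1-D transect; it is not the dissertation's specific algorithm:

```python
import numpy as np

# Pick the next sample location for a GP-modeled field by maximizing
# posterior variance (unit prior variance, RBF kernel; all parameter
# values here are illustrative assumptions).
def rbf(a, b, ell=0.15):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

grid = np.linspace(0.0, 1.0, 101)   # candidate locations along a transect
visited = np.array([0.2, 0.8])      # locations already sampled
noise = 1e-4                        # observation-noise jitter

K = rbf(visited, visited) + noise * np.eye(len(visited))
k_star = rbf(grid, visited)
# Posterior variance at each candidate: 1 - k*^T K^{-1} k*.
var = 1.0 - np.einsum('ij,jk,ik->i', k_star, np.linalg.inv(K), k_star)
next_loc = grid[np.argmax(var)]
print(f"next sample location: {next_loc:.2f}")
```

    With the two visited points at 0.2 and 0.8 and a short length scale, the most uncertain candidate sits midway between them, so the rule fills the largest gap first.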

  11. Data Acquisition for Modular Biometric Monitoring System

    NASA Technical Reports Server (NTRS)

    Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor); Grodsinsky, Carlos M. (Inventor)

    2014-01-01

    A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to collect data asynchronously, via the bus, from the memory of the plurality of data acquisition modules according to a relative fullness of the memory of the plurality of data acquisition modules.
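
    The fullness-driven collection policy described above can be sketched in a few lines; the class and method names below are illustrative, not taken from the patent:

```python
from collections import deque

# Toy sketch of the collection policy: a central controller drains
# whichever module's buffer is relatively fullest, regardless of the
# modules' differing acquisition rates and capacities.
class AcquisitionModule:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.buffer = deque()

    def sample(self, value):
        self.buffer.append(value)

    def fullness(self):          # relative fullness in [0, 1]
        return len(self.buffer) / self.capacity

def collect_next(modules):
    """Drain one record from the relatively fullest module."""
    busiest = max(modules, key=lambda m: m.fullness())
    return busiest.name, busiest.buffer.popleft()

ecg = AcquisitionModule("ecg", capacity=8)
temp = AcquisitionModule("temp", capacity=100)
for v in range(6):
    ecg.sample(v)       # 6/8 queued  -> fullness 0.75
for v in range(20):
    temp.sample(v)      # 20/100     -> fullness 0.20

name, value = collect_next([ecg, temp])
print(name, value)  # ecg is drained first despite holding fewer records
```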

  12. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    NASA Technical Reports Server (NTRS)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.

  13. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 6 QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Jones, M.; Edwards, T.

    2010-06-09

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., the DWPF Cold Chem (CC) Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). In addition to the CC method confirmation, the DWPF lab's mercury (Hg) digestion method was also evaluated for applicability to SB6 (see DWPF procedure 'Mercury System Operating Manual', Manual: SW4-15.204, Section 6.1, Revision 5, Effective date: 12-04-03). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestions of Sludge Batch 6 (SB6) SRAT Receipt and SB6 SRAT Product samples. For validation of the DWPF lab's Hg method, only SRAT receipt material was used and compared to AR digestion results. The SB6 SRAT Receipt and SB6 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB6 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 5 (SB5), to form the SB6 Blend composition. In addition to the 16 elements currently measured by the DWPF, this report includes Hg and thorium (Th) data (Th comprising ≈2.5-3 wt % of the total solids in SRAT Receipt and SRAT Product, respectively) and provides specific details of the ICP-AES analysis of Th.
Thorium was found to interfere with the U 367.007 nm emission line, and an inter-element correction (IEC) had to be applied to U data, which is also

  14. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
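
    A trigger of the kind the real-time record enables, flagging a plume arrival when a monitored property departs from its baseline, can be sketched as follows; the threshold, baseline window, and readings are assumptions for illustration, not USGS protocol:

```python
import statistics

# Hypothetical hourly specific-conductance readings (uS/cm): flag any
# value that departs more than 10% from the early-record baseline.
readings = [310, 312, 309, 311, 308, 345, 390, 420]
baseline_n, jump = 5, 0.10   # baseline window size, relative-change threshold

baseline = statistics.mean(readings[:baseline_n])
alerts = [i for i, v in enumerate(readings[baseline_n:], start=baseline_n)
          if abs(v - baseline) / baseline > jump]
print("alert at indices:", alerts)
```

    In the case study, spotting such departures remotely is what let the operators dispatch manual sampling crews while the plume front was still passing the well screen.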

  15. Towards a Theory of Sampled-Data Piecewise-Deterministic Markov Processes

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Gonzalez, Oscar R.; Gray, W. Steven

    2006-01-01

    The analysis and design of practical control systems requires that stochastic models be employed. Analysis and design tools have been developed, for example, for Markovian jump linear continuous and discrete-time systems, piecewise-deterministic processes (PDP's), and general stochastic hybrid systems (GSHS's). These model classes have been used in many applications, including fault tolerant control and networked control systems. This paper presents initial results on the analysis of a sampled-data PDP representation of a nonlinear sampled-data system with a jump linear controller. In particular, it is shown that the state of the sampled-data PDP satisfies the strong Markov property. In addition, a relation between the invariant measures of a sampled-data system driven by a stochastic process and its associated discrete-time representation is presented. As an application, when the plant is linear with no external input, a sufficient testable condition for convergence in distribution to the invariant delta Dirac measure is given.

  16. Processes in scientific workflows for information seeking related to physical sample materials

    NASA Astrophysics Data System (ADS)

    Ramdeen, S.

    2014-12-01

    The majority of State Geological Surveys have repositories containing cores, cuttings, fossils or other physical sample material. State surveys maintain these collections to support their own research as well as research conducted by external users from other organizations, including government agencies (state and federal), academia, industry and the public. The preliminary results presented in this paper concern the research processes of these external users: in particular, how they discover, access and use the digital surrogates through which they evaluate and access the physical items in these collections. Data such as physical samples are materials that cannot be completely replaced with digital surrogates. Digital surrogates may be represented as metadata, which enable discovery of and ultimately access to these samples. These surrogates may be found in records, databases, publications, etc. But surrogates do not completely remove the need for access to the physical item, as they cannot be subjected to chemical testing and other similar analyses. The goal of this research is to document the various processes external users perform in order to access physical materials. Data for this study will be collected by conducting interviews with these external users. During the interviews, participants will be asked to describe the workflow that led them to interact with state survey repositories, and what steps they took afterward. High-level processes/categories of behavior will be identified and used in the development of an information-seeking behavior model. This model may be used to facilitate the development of management tools and other aspects of cyberinfrastructure related to physical samples.

  17. Microwave Processing for Sample Preparation to Evaluate Mitochondrial Ultrastructural Damage in Hemorrhagic Shock

    NASA Astrophysics Data System (ADS)

    Josephsen, Gary D.; Josephsen, Kelly A.; Beilman, Greg J.; Taylor, Jodie H.; Muiler, Kristine E.

    2005-12-01

    This is a report of the adaptation of microwave processing in the preparation of liver biopsies for transmission electron microscopy (TEM) to examine ultrastructural damage of mitochondria in the setting of metabolic stress. Hemorrhagic shock was induced in pigs via a 35% total blood volume bleed and a 90-min period of shock followed by resuscitation. Hepatic biopsies were collected before shock and after resuscitation. Following collection, biopsies were processed for TEM by a rapid method involving microwave irradiation (Giberson, 2001). Pre- and postshock samples from each of two animals were viewed and scored using the mitochondrial ultrastructure scoring system (Crouser et al., 2002), a system used to quantify the severity of ultrastructural damage during shock. Results showed evidence of increased ultrastructural damage in the postshock samples, which scored 4.00 and 3.42, versus their preshock controls, which scored 1.18 and 1.27. The results of this analysis were similar to those obtained in another model of shock (Crouser et al., 2002). However, the amount of time used to process the samples was significantly shortened by the methods involving microwave irradiation.

  18. Acquisition strategies

    SciTech Connect

    Zimmer, M.J.; Lynch, P.W.

    1993-11-01

    Acquiring projects takes careful planning, research and consideration. Picking the right opportunities and avoiding the pitfalls will lead to a more valuable portfolio. This article describes the steps to take in evaluating an acquisition and what items need to be considered in an evaluation.

  19. New field programmable gate array-based image-oriented acquisition and real-time processing applied to plasma facing component thermal monitoring

    SciTech Connect

    Martin, V.; Dunand, G.; Moncada, V.; Jouve, M.; Travere, J.-M.

    2010-10-15

    During operation of present fusion devices, the plasma facing components (PFCs) are exposed to high heat fluxes. Understanding and preventing overheating of these components during long pulse discharges is a crucial safety issue for future devices like ITER. Infrared digital cameras interfaced with complex optical systems have become a routine diagnostic to measure surface temperatures in many magnetic fusion devices. Due to the complexity of the observed scenes and the large amount of data produced, the use of high-performance computing hardware for real-time image processing is mandatory to avoid PFC damage. At Tore Supra, we have recently made a major upgrade of our real-time infrared image acquisition and processing board by the use of a new field programmable gate array (FPGA) optimized for image processing. This paper describes the new possibilities offered by this board in terms of image calibration and image interpretation (abnormal thermal event detection) compared to the previous system.

  20. Evaluation of SRAT Sampling Data in Support of a Six Sigma Yellow Belt Process Improvement Project

    SciTech Connect

    Edwards, Thomas B.

    2005-06-01

    As part of the Six Sigma continuous improvement initiatives at the Defense Waste Processing Facility (DWPF), a Yellow Belt team was formed to evaluate the frequency and types of samples required for the Sludge Receipt and Adjustment Tank (SRAT) receipt in the DWPF. The team asked, via a technical task request, that the Statistical Consulting Section (SCS), in concert with the Immobilization Technology Section (ITS) (both groups within the Savannah River National Laboratory (SRNL)), conduct a statistical review of recent SRAT receipt results to determine if there is enough consistency in these measurements to allow for less frequent sampling. As part of this review process, key decisions made by DWPF Process Engineering that are based upon the SRAT sample measurements are outlined in this report. For a reduction in SRAT sampling to be viable, these decisions must not be overly sensitive to the additional variation that will be introduced as a result of such a reduction. Measurements from samples of SRAT receipt batches 314 through 323 were reviewed as part of this investigation into the frequency of SRAT sampling. The associated acid calculations for these batches were also studied as part of this effort. The results from this investigation showed no indication of a statistically significant relationship between the tank solids and the acid additions for these batches. One would expect that as the tank solids increase there would be a corresponding increase in acid requirements. There was, however, an indication that the predicted reduction/oxidation (REDOX) ratio (the ratio of Fe²⁺ to the total Fe in the glass product) that was targeted by the acid calculations based on the SRAT receipt samples for these batches was on average 0.0253 larger than the predicted REDOX based upon Slurry Mix Evaporator (SME) measurements. This is a statistically significant difference (at the 5% significance level), and the study also suggested that the difference was due to
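
The 5%-significance comparison described above amounts to a paired test on per-batch differences between the two REDOX predictions. A minimal sketch of such a paired t-test, with illustrative numbers rather than the report's batch data:

```python
import math
from statistics import mean, stdev

# Hypothetical paired REDOX predictions for ten batches (illustrative values,
# not the report's data): one prediction from SRAT receipt samples, one from
# SME measurements of the same batches.
srat = [0.21, 0.24, 0.20, 0.26, 0.23, 0.25, 0.22, 0.24, 0.21, 0.25]
sme  = [0.19, 0.21, 0.18, 0.23, 0.20, 0.23, 0.19, 0.22, 0.18, 0.22]

diffs = [a - b for a, b in zip(srat, sme)]
d_bar = mean(diffs)                                   # mean paired difference
t = d_bar / (stdev(diffs) / math.sqrt(len(diffs)))    # paired t statistic

# Two-sided 5% critical value for n - 1 = 9 degrees of freedom is about 2.262.
print(round(d_bar, 3), abs(t) > 2.262)
```

A consistent positive mean difference with |t| above the critical value is the pattern the report describes: a small but statistically significant bias between the two prediction routes.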

  1. Post-acquisition data processing for the screening of transformation products of different organic contaminants. Two-year monitoring of river water using LC-ESI-QTOF-MS and GCxGC-EI-TOF-MS.

    PubMed

    López, S Herrera; Ulaszewska, M M; Hernando, M D; Martínez Bueno, M J; Gómez, M J; Fernández-Alba, A R

    2014-11-01

    This study describes a comprehensive strategy for detecting and elucidating the chemical structures of expected and unexpected transformation products (TPs) from chemicals found in river water and effluent wastewater samples, using liquid chromatography coupled to an electrospray ionization quadrupole time-of-flight mass spectrometer (LC-ESI-QTOF-MS), with post-acquisition data processing and an automated search using an in-house database. The efficacy of the mass defect filtering (MDF) approach to screen metabolites from common biotransformation pathways was tested, and it was shown to be sufficiently sensitive and applicable for detecting metabolites in environmental samples. Four omeprazole metabolites and two venlafaxine metabolites were identified in river water samples. This paper reports the analytical results obtained during 2 years of monitoring, carried out at eight sampling points along the Henares River (Spain). Multiresidue monitoring, for targeted analysis, includes a group of 122 chemicals, amongst which are pharmaceuticals, personal care products, pesticides and PAHs. For this purpose, two analytical methods were used, based on direct injection with an LC-ESI-QTOF-MS system and stir bar sorptive extraction (SBSE) with two-dimensional gas chromatography coupled with a time-of-flight mass spectrometer (GCxGC-EI-TOF-MS). PMID:24952251
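
The mass defect filtering (MDF) idea mentioned above can be sketched in a few lines. This is a simplified illustration with hypothetical m/z values, not the paper's processing pipeline: candidate peaks are kept when the fractional part of their exact mass stays within a narrow window of the parent compound's mass defect, since common biotransformations shift the defect only slightly.

```python
# Minimal mass-defect-filter sketch (hypothetical parent ion and peak list).
# The mass defect is the difference between the exact mass and the nearest
# nominal (integer) mass.

def mass_defect(mz):
    return mz - round(mz)  # signed defect relative to nominal mass

def mdf(candidates, parent_mz, window=0.040):
    """Keep candidate m/z values whose defect lies within `window` of the parent's."""
    ref = mass_defect(parent_mz)
    return [mz for mz in candidates if abs(mass_defect(mz) - ref) <= window]

parent = 345.1140                                  # hypothetical [M+H]+ of a parent drug
peaks = [361.1089, 340.9971, 331.0983, 362.2503]   # hypothetical detected peaks
print(mdf(peaks, parent))  # [361.1089, 331.0983]
```

The two surviving peaks sit close to the parent's defect (as typical oxidation or demethylation products would), while the others fall outside the window and are discarded before structure elucidation.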

  2. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    SciTech Connect

    Tripp, J.; Smith, T.; Law, J.

    2013-07-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next-generation sampling and analysis system for metallic elements present in aqueous processing streams. Sampling technologies were first evaluated, and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes were tested; the 10 μl volume produced data with much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse, thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and optimum robotic handling of the micro-fluidic sampling chips. (authors)
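
The relative standard deviation (RSD) comparison between the two sampling volumes can be sketched as follows; the replicate values are hypothetical, chosen only to illustrate the computation (RSD = 100 × standard deviation / mean):

```python
from statistics import mean, stdev

def rsd(values):
    """Relative standard deviation in percent."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical replicate concentration readings (not the project's data)
# for the two tested sampling volumes.
vol_10ul = [101.2, 99.8, 100.5, 100.1, 99.9]
vol_2ul  = [95.0, 108.3, 92.7, 104.1, 99.6]

print(round(rsd(vol_10ul), 2), round(rsd(vol_2ul), 2))
```

A lower RSD for the larger volume, as in this toy data set, is the pattern the abstract reports: the 10 μl samples gave markedly more reproducible results than the 2 μl samples.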

  3. Endophytic bacterial community of grapevine leaves influenced by sampling date and phytoplasma infection process

    PubMed Central

    2014-01-01

    Background Endophytic bacteria benefit the host plant directly or indirectly, e.g. by biocontrol of pathogens. Up to now, their interactions with the host and with other microorganisms have been poorly understood. Consequently, a crucial step for improving the knowledge of those relationships is to determine whether pathogens or the plant growing season influence endophytic bacterial diversity and dynamics. Results Four healthy, four phytoplasma-diseased and four recovered (symptomatic plants that spontaneously regain a healthy condition) grapevine plants were sampled monthly from June to October 2010 in a vineyard in north-western Italy. Metagenomic DNA was extracted from sterilized leaves, and the endophytic bacterial community dynamics and diversity were analyzed by taxon-specific real-time PCR, Length-Heterogeneity PCR and genus-specific PCR. These analyses revealed that both sampling date and phytoplasma infection influenced the endophytic bacterial composition. Interestingly, in June, when the plants are symptomless and the pathogen is undetectable: (i) the endophytic bacterial community associated with diseased grapevines was different from those at the other sampling dates, when the phytoplasmas are detectable inside samples; (ii) the microbial community associated with recovered plants differs from that living inside healthy and diseased plants. Interestingly, the LH-PCR database identified bacteria previously reported as biocontrol agents in the examined grapevines. Of these, the dynamics of Burkholderia, Methylobacterium and Pantoea were influenced by the phytoplasma infection process and seasonality. Conclusion Results indicated that endophytic bacterial community composition in grapevine is correlated to both phytoplasma infection and sampling date. For the first time, the data underlined that, in diseased plants, the pathogen infection process can decrease the impact of seasonality on community dynamics. Moreover, based on experimental evidence, it was reasonable to hypothesize that

  4. [Optimization of processing and storage of clinical samples to be used for the molecular diagnosis of pertussis].

    PubMed

    Pianciola, L; Mazzeo, M; Flores, D; Hozbor, D

    2010-01-01

    Pertussis or whooping cough is an acute, highly contagious respiratory infection, which is particularly severe in infants under one year old. In classic disease, clinical diagnosis may present no difficulties. In other cases, it requires laboratory confirmation. Generally used methods are: culture, serology and PCR. For the latter, the sample of choice is a nasopharyngeal aspirate, and the simplest method for processing these samples uses proteinase K. Although results are generally satisfactory, difficulties often arise regarding the mucosal nature of the specimens. Moreover, uncertainties exist regarding the optimal conditions for sample storage. This study evaluated various technologies for processing and storing samples. Results enabled us to select a method for optimizing sample processing, with performance comparable to commercial methods and far lower costs. The experiments designed to assess the conservation of samples enabled us to obtain valuable information to guide the referral of samples from patient care centres to laboratories where such samples are processed by molecular methods. PMID:20589331

  5. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1436.602-5 Short selection processes... shall be obtained prior to the utilization of either of the short selection processes used for...

  6. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1436.602-5 Short selection processes... shall be obtained prior to the utilization of either of the short selection processes used for...

  7. Frequency Effects in Language Processing: A Review with Implications for Theories of Implicit and Explicit Language Acquisition.

    ERIC Educational Resources Information Center

    Ellis, Nick C.

    2002-01-01

    Shows how language processing is intimately tuned to input frequency. Examples are given of frequency effects in the processing of phonology, phonotactics, reading, spelling, lexis, morphosyntax, formulaic language, language comprehension, grammaticality, sentence production, and syntax. (Author/VWL)

  8. Planetary protection, legal ambiguity and the decision making process for Mars sample return

    NASA Astrophysics Data System (ADS)

    Race, M. S.

    As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under NEPA; uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions.

  9. Planetary protection, legal ambiguity and the decision making process for Mars sample return

    NASA Technical Reports Server (NTRS)

    Race, M. S.

    1996-01-01

    As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions.

  10. Planetary protection, legal ambiguity and the decision making process for Mars sample return.

    PubMed

    Race, M S

    1996-01-01

    As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions. PMID:11538983

  11. Does sample rate introduce an artifact in spectral analysis of continuous processes?

    PubMed

    Wijnants, Maarten L; Cox, R F A; Hasselman, F; Bosman, A M T; Van Orden, Guy

    2012-01-01

    Spectral analysis is a widely used method to estimate 1/f^α noise in behavioral and physiological data series. The aim of this paper is to achieve a more solid appreciation for the effects of periodic sampling on the outcomes of spectral analysis. It is shown that spectral analysis is biased by the choice of sample rate because denser sampling comes with lower amplitude fluctuations at the highest frequencies. Here we introduce an analytical strategy that compensates for this effect by focusing on a fixed amount, rather than a fixed percentage, of the lowest frequencies in a power spectrum. Using this strategy, estimates of the degree of 1/f^α noise become robust against sample rate conversion and more sensitive overall. Altogether, the present contribution may shed new light on known discrepancies in the psychological literature on 1/f^α noise, and may provide a means to achieve a more solid framework for 1/f^α noise in continuous processes. PMID:23346058
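
The fixed-amount strategy described above can be sketched as follows. This is a simplified illustration, not the authors' code: the spectral exponent α is estimated by a log-log least-squares fit over a fixed number of the lowest nonzero frequencies, so the estimate does not depend on how many extra high-frequency bins a denser sample rate adds.

```python
import numpy as np

def spectral_slope(x, n_low):
    """Estimate alpha in S(f) ~ 1/f^alpha by least squares on the log-log
    periodogram, using only the n_low lowest nonzero frequencies."""
    x = np.asarray(x, dtype=float)
    freqs = np.fft.rfftfreq(len(x))[1:]                 # drop the DC bin
    power = np.abs(np.fft.rfft(x - x.mean()))[1:] ** 2  # periodogram
    slope, _ = np.polyfit(np.log(freqs[:n_low]), np.log(power[:n_low]), 1)
    return -slope

# White noise has alpha near 0; anchoring the fit to a fixed count of low
# frequencies keeps estimates comparable across series lengths (sample rates).
rng = np.random.default_rng(42)
print(round(spectral_slope(rng.standard_normal(8192), n_low=64), 2))
```

With a fixed `n_low`, doubling the sample rate only appends higher-frequency bins that the fit never sees, which is the robustness property the abstract claims.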

  12. The effect of instruction by a professional scientist on the acquisition of integrated process skills and the science-related attitudes of eighth-grade students

    NASA Astrophysics Data System (ADS)

    Owens, Katharine Donner

    This study investigated the effect of instruction by a professional scientist on the acquisition of science integrated process skills and the science-related attitudes of eighth-grade students. Eighty-two students from four intact classes in south Mississippi junior high schools participated in this study. Two experimental groups were taught a problem solving curriculum over a six week period by professional chemists; one experimental group had an additional six weeks of instruction by a professional engineer. Two control groups had science instruction by their classroom teachers. Homogeneity of the groups related to basic skills and science attitudes was determined and students drew their perception of a scientist before any instruction began. At the end of the intervention period students in all groups were given the Test of Science-Related Attitudes, the Test of Integrated Process Skills II, and a Draw-A-Scientist Test. The statistical procedures of the Wilks Lambda MANOVA, a univariate post hoc test, a split plot analysis of variance, and a one-way analysis of variance were used to test the hypotheses at the 0.05 significance level. Students' drawings of scientists were analyzed for the presence of stereotypic characteristics. Scores on all tests were analyzed according to gender and to group membership. There was a statistically significant difference in the science-related attitudes and the acquisition of science process skills between treatment groups. The experimental group taught by a professional chemist for six weeks scored higher on the test of process skills and had more positive attitudes toward careers in science and the normality of scientists than the control groups. There was a significant decline in stereotypic characteristics seen in the drawings of scientists by students who had longer instruction by two professional scientists. 
There was no statistically significant difference between male and female students and no interaction effect between

  13. Split-screen display system and standardized methods for ultrasound image acquisition and multi-frame data processing

    NASA Technical Reports Server (NTRS)

    Selzer, Robert H. (Inventor); Hodis, Howard N. (Inventor)

    2011-01-01

    A standardized acquisition methodology assists operators to accurately replicate high resolution B-mode ultrasound images obtained over several spaced-apart examinations utilizing a split-screen display in which the arterial ultrasound image from an earlier examination is displayed on one side of the screen while a real-time "live" ultrasound image from a current examination is displayed next to the earlier image on the opposite side of the screen. By viewing both images, whether simultaneously or alternately, while manually adjusting the ultrasound transducer, an operator is able to bring into view the real-time image that best matches a selected image from the earlier ultrasound examination. Utilizing this methodology, dynamic material properties of arterial structures, such as IMT and diameter, are measured in a standard region over successive image frames. Each frame of the sequence has its echo edge boundaries automatically determined by using the immediately prior frame's true echo edge coordinates as initial boundary conditions. Computerized echo edge recognition and tracking over multiple successive image frames enhances measurement of arterial diameter and IMT and allows for improved vascular dimension measurements, including vascular stiffness and IMT determinations.
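
The frame-to-frame edge tracking described above (each frame seeded with the prior frame's boundary coordinates) can be sketched in one dimension. This is a hypothetical toy example, not the patented algorithm: the edge in each synthetic intensity profile is found by searching only a small window around the previous frame's edge index.

```python
# Minimal 1-D sketch of boundary tracking seeded by the previous frame.

def find_edge(profile, prev_edge, window=3):
    """Return the index of the largest intensity step near prev_edge."""
    lo = max(1, prev_edge - window)
    hi = min(len(profile) - 1, prev_edge + window + 1)
    return max(range(lo, hi), key=lambda i: profile[i] - profile[i - 1])

# Synthetic frames: a step edge drifting by one pixel per frame.
frames = [[0] * (10 + k) + [9] * (10 - k) for k in range(4)]
edge = 10                       # initial boundary from the first frame
track = []
for frame in frames:
    edge = find_edge(frame, edge)
    track.append(edge)
print(track)  # [10, 11, 12, 13]
```

Restricting each search to a window around the prior frame's solution is what makes tracking over many successive frames stable: small inter-frame motion is followed, while spurious distant edges are ignored.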

  14. Acquisition and integration of low vision assistive devices: understanding the decision-making process of older adults with low vision.

    PubMed

    Copolillo, Al; Teitelman, Jodi L

    2005-01-01

    The purpose of this study was to describe how older adults with low vision make decisions to use low vision assistive devices (LVADs). Analysis of participants' narratives, from both group and individual interviews, revealed three topic areas affecting device use. Two are discussed in this paper: Experiences and Characteristics Leading to Successful LVAD Use Decision Making and Challenges to Successful LVAD Use Decision Making. The third, Adjustment to Low Vision Disability, is briefly discussed. Of particular importance to occupational therapy practitioners in the growing field of low vision rehabilitation was the value placed on low vision rehabilitation services to assist with acquiring devices and integrating them into daily routines. Occupational therapy services were highly regarded. Participants demonstrated the importance of becoming a part of a supportive network of people with low vision to gain access to information about resources. They emphasized the need for systems and policy changes to reduce barriers to making informed decisions about LVAD use. Results indicate that occupational therapists working in low vision can support clients by facilitating development of a support network, acting as liaisons between clients and other health practitioners, especially ophthalmologists, and encouraging policy development that supports barrier-free LVAD acquisition and use. These topics should be incorporated into continuing and entry-level education to prepare practitioners for leadership in the field of low vision rehabilitation. PMID:15969278

  15. Disrupting morphosyntactic and lexical semantic processing has opposite effects on the sample entropy of neural signals.

    PubMed

    Fonseca, André; Boboeva, Vezha; Brederoo, Sanne; Baggio, Giosuè

    2015-04-16

    Converging evidence in neuroscience suggests that syntax and semantics are dissociable in brain space and time. However, it is possible that partly disjoint cortical networks, operating in successive time frames, still perform similar types of neural computations. To test the alternative hypothesis, we collected EEG data while participants read sentences containing lexical semantic or morphosyntactic anomalies, resulting in N400 and P600 effects, respectively. Next, we reconstructed phase space trajectories from EEG time series, and we measured the complexity of the resulting dynamical orbits using sample entropy - an index of the rate at which the system generates or loses information over time. Disrupting morphosyntactic or lexical semantic processing had opposite effects on sample entropy: it increased in the N400 window for semantic anomalies, and it decreased in the P600 window for morphosyntactic anomalies. These findings point to a fundamental divergence in the neural computations supporting meaning and grammar in language. PMID:25634797
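
Sample entropy itself is a standard, well-defined computation. A minimal generic SampEn sketch (not the authors' EEG pipeline): SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates matching within tolerance r (Chebyshev distance, self-matches excluded) and A counts the corresponding length-(m+1) matches.

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """Sample entropy SampEn(m, r); r is a fraction of the series' std dev."""
    n = len(series)
    mu = sum(series) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in series) / n)
    tol = r * sd

    def count(mm):
        # Count template pairs of length mm within tolerance (Chebyshev norm).
        templates = [series[i:i + mm] for i in range(n - mm + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    hits += 1
        return hits

    a, b = count(m + 1), count(m)
    return -math.log(a / b)

# A strictly periodic signal is highly regular, hence very low sample entropy.
periodic = [0.0, 1.0] * 100
print(round(sample_entropy(periodic), 3))
```

Low values indicate a regular, predictable signal; higher values indicate that the system generates more new information over time, which is the quantity contrasted across the N400 and P600 windows above.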

  16. Process and apparatus for obtaining samples of liquid and gas from soil

    DOEpatents

    Rossabi, J.; May, C.P.; Pemberton, B.E.; Shinn, J.; Sprague, K.

    1999-03-30

    An apparatus and process for obtaining samples of liquid and gas from subsurface soil is provided, having a filter zone adjacent to an external expander ring. The expander ring creates a void within the soil substrate which encourages the accumulation of soil-borne fluids. The fluids migrate along a pressure gradient through a plurality of filters before entering a first chamber. A one-way valve regulates the flow of fluid into a second chamber in further communication with a collection tube through which samples are collected at the surface. A second one-way valve having a reverse flow provides additional communication between the chambers for the pressurized cleaning and back-flushing of the apparatus. 8 figs.

  17. Process and apparatus for obtaining samples of liquid and gas from soil

    DOEpatents

    Rossabi, Joseph; May, Christopher P.; Pemberton, Bradley E.; Shinn, Jim; Sprague, Keith

    1999-01-01

    An apparatus and process for obtaining samples of liquid and gas from subsurface soil is provided, having a filter zone adjacent to an external expander ring. The expander ring creates a void within the soil substrate which encourages the accumulation of soil-borne fluids. The fluids migrate along a pressure gradient through a plurality of filters before entering a first chamber. A one-way valve regulates the flow of fluid into a second chamber in further communication with a collection tube through which samples are collected at the surface. A second one-way valve having a reverse flow provides additional communication between the chambers for the pressurized cleaning and back-flushing of the apparatus.

  18. Microstructural and magnetic analysis of a superconducting foam and comparison with IG-processed bulk samples

    NASA Astrophysics Data System (ADS)

    Koblischka-Veneva, A.; Koblischka, M. R.; Ide, N.; Inoue, K.; Muralidhar, M.; Hauet, T.; Murakami, M.

    2016-03-01

    YBa2Cu3Oy (YBCO) foam samples show an open, porous foam structure, which may have benefits for many applications of high-Tc superconductors. As the basic material of these foams is a pseudo-single-crystalline material with directional growth initiated by a seed crystal, similar to standard melt-textured samples, the achieved texture of the YBCO is a very important parameter. We analyzed the local texture and grain orientation of the individual struts forming the foam by means of atomic force microscopy and electron backscatter diffraction (EBSD). Furthermore, the magnetic properties of a foam strut were evaluated by means of SQUID measurements, from which the flux pinning forces were determined. A scaling of the pinning forces in the temperature range between 60 K and 85 K was performed. These data and the details of the microstructure are compared to IG-processed bulk material.

  19. Microwave irradiation for shortening the processing time of samples of flagellated bacteria for scanning electron microscopy.

    PubMed

    Hernández-Chavarría, Francisco

    2004-01-01

    Microwave irradiation (MWI) has been applied to the development of rapid methods to process biological samples for scanning electron microscopy (SEM). In this paper we propose two simple and quick techniques for processing bacteria (Proteus mirabilis and Vibrio mimicus) for SEM using MWI. In the simplest methodology, the bacteria were placed on a cover-glass, air-dried, and subjected to a conductivity stain. The reagent used for the conductivity stain was the mordant of a light microscopy staining method (10 ml of 5% carbolic acid solution, 2 g of tannic acid, and 10 ml of saturated aluminum sulfate 12-hydrate). In the second method the samples were double fixed (glutaraldehyde and then osmium), subjected to the conductivity stain, dehydrated through a series of ethanol solutions of increasing concentration, treated with hexamethyldisilazane (HMDS), and dried at 35 degrees C for 5 minutes. In both methods the steps from fixation to treatment with HMDS were done under MWI for 2 minutes in an ice-water bath, in order to dissipate the heat generated by the MWI. Although both techniques preserve bacterial morphology adequately, the latter technique showed the better preservation, including the appearance of flagella, and that process was completed in less than 2 hours at MWI temperatures between 4 and 5 degrees C. PMID:17061527

  20. Chemical process to separate iron oxides particles in pottery sample for EPR dating.

    PubMed

    Watanabe, S; Farias, T M B; Gennari, R F; Ferraz, G M; Kunzli, R; Chubaci, J F D

    2008-12-15

    Ancient potteries are usually made of local clay material, which contains a relatively high concentration of iron. The powdered samples are usually quite black, due to magnetite, and, although they can be used for thermoluminescence (TL) dating, better TL readings are obtained when a clearer natural or pre-treated sample is used. For electron paramagnetic resonance (EPR) measurements, the huge signal due to iron spin-spin interaction produces intense interference overlapping any other signal in this range. A sample's age is obtained by dividing the accumulated radiation dose, determined from the concentration of paramagnetic species generated by irradiation, by the natural dose rate; as a consequence, EPR dating cannot be used here, since the iron signal does not depend on radiation dose. In some cases, the density separation method using a hydrated solution of sodium polytungstate [Na₆(H₂W₁₂O₄₀)·H₂O] becomes useful. However, sodium polytungstate is very expensive in Brazil; hence an alternative method for eliminating this interference is proposed. A chemical process to eliminate about 90% of the magnetite was developed. A sample of powdered ancient pottery was treated in a 3:1:1 mixture of HCl, HNO₃ and H₂O₂ for 4 h. After that, it was washed several times in distilled water to remove all acid residues. The originally black sample becomes somewhat clearer. The resulting material was analyzed by inductively coupled plasma mass spectrometry (ICP-MS), with the result that the iron content is reduced by a factor of about 9. In EPR measurements, a non-treated natural ceramic sample shows a broad spin-spin interaction signal, whereas the chemically treated sample presents a narrow signal in the g = 2.00 region, possibly due to a (SiO₃)³⁻ radical mixed with the signal of remaining iron [M. Ikeya, New Applications of Electron Spin Resonance, World Scientific, Singapore, 1993, p. 285]. This signal increases in intensity under gamma irradiation.
However, still due to iron influence, the additive method yielded too
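
    The age relation underlying both TL and EPR dating, as described in the record above, is a simple quotient of the accumulated (paleo)dose and the annual natural dose rate. A minimal sketch, with purely hypothetical dose values, not figures from this study:

    ```python
    # Illustrative TL/EPR age equation: age = accumulated paleodose / annual dose rate.
    # Both doses in gray (Gy); the numbers below are hypothetical examples.

    def tl_age_years(paleodose_gy: float, annual_dose_gy: float) -> float:
        """Return the sample age in years from the accumulated (paleo)dose
        and the annual natural dose rate."""
        if annual_dose_gy <= 0:
            raise ValueError("annual dose rate must be positive")
        return paleodose_gy / annual_dose_gy

    # Example: 4.5 Gy accumulated dose at 3 mGy/year -> 1500 years
    age = tl_age_years(4.5, 0.003)
    print(round(age))  # 1500
    ```

    This is also why the iron signal defeats EPR dating here: only the dose-dependent paramagnetic signal can serve as the numerator, and the iron interference masks it.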

  1. Following Native Language Acquisition.

    ERIC Educational Resources Information Center

    Neiburg, Michael S.

    Native language acquisition is a natural and non-natural stage-by-stage process. The natural first stage is development of speech and listening skills. In this stage, competency is gained in the home environment. The next, non-natural stage is development of literacy, a cultural skill taught in school. Since oral-aural native language development…

  2. Sample Processing Impacts the Viability and Cultivability of the Sponge Microbiome

    PubMed Central

    Esteves, Ana I. S.; Amer, Nimra; Nguyen, Mary; Thomas, Torsten

    2016-01-01

    Sponges host complex microbial communities of recognized ecological and biotechnological importance. Extensive cultivation efforts have been made to isolate sponge bacteria, but most still elude cultivation. To identify the bottlenecks of sponge bacterial cultivation, we combined high-throughput 16S rRNA gene sequencing with a variety of cultivation media and incubation conditions. We aimed to determine the extent to which sample processing and cultivation conditions can impact bacterial viability and recovery in culture. We isolated 325 sponge bacteria from six specimens of Cymbastela concentrica and three specimens of Scopalina sp. These isolates were distributed over 37 different genera and 47 operational taxonomic units (defined at 97% 16S rRNA gene sequence identity). The cultivable bacterial community was highly specific to its sponge host and different media compositions yielded distinct microbial isolates. Around 97% of the isolates could be detected in the original sponge and represented a large but highly variable proportion (0.5–92% total abundance, depending on sponge species) of viable bacteria obtained after sample processing, as determined by propidium monoazide selective DNA modification of compromised cells. Our results show that the most abundant viable bacteria are also the most predominant groups found in cultivation, reflecting, to some extent, the relative abundances of the viable bacterial community, rather than the overall community estimated by direct molecular approaches. Cultivation is therefore shaped not only by the growth conditions provided, but also by the different cell viabilities of the bacteria that constitute the cultivation inoculum. These observations highlight the need to perform experiments to assess each method of sample processing for its accurate representation of the actual in situ bacterial community and its yield of viable cells. PMID:27242673

  3. L. monocytogenes in a cheese processing facility: Learning from contamination scenarios over three years of sampling.

    PubMed

    Rückerl, I; Muhterem-Uyar, M; Muri-Klinger, S; Wagner, K-H; Wagner, M; Stessl, B

    2014-10-17

    The aim of this study was to analyze the changing patterns of Listeria monocytogenes contamination in a cheese processing facility manufacturing a wide range of ready-to-eat products. Characterization of L. monocytogenes isolates included genotyping by pulsed-field gel electrophoresis (PFGE) and multi-locus sequence typing (MLST). Disinfectant-susceptibility tests and the assessment of L. monocytogenes survival in fresh cheese were also conducted. During the sampling period between 2010 and 2013, a total of 1284 environmental samples were investigated. Overall occurrence rates of Listeria spp. and L. monocytogenes were 21.9% and 19.5%, respectively. Identical L. monocytogenes genotypes were found in the food processing environment (FPE), raw materials and products. Interventions after the sampling events changed contamination scenarios substantially. The high diversity of globally widely distributed L. monocytogenes genotypes was reduced by identifying the major sources of contamination. Although susceptible to a broad range of disinfectants and cleaners, one dominant L. monocytogenes sequence type (ST) 5 could not be eradicated from drains and floors. Notably, intense humidity and steam were observed in all rooms, and water residues were visible on floors due to intensified cleaning strategies. This could explain the high L. monocytogenes contamination of the FPE (drains, shoes and floors) throughout the study (15.8%). The outcome of a challenge experiment in fresh cheese showed that L. monocytogenes could survive after 14 days of storage at insufficient cooling temperatures (8 and 16 °C). All efforts to reduce L. monocytogenes environmental contamination eventually led to a transition from dynamic to stable contamination scenarios. Consequently, implementation of systematic environmental monitoring via in-house systems should either aim for total avoidance of FPE colonization, or emphasize a first reduction of L. monocytogenes to sites where

  4. Sample Processing Impacts the Viability and Cultivability of the Sponge Microbiome.

    PubMed

    Esteves, Ana I S; Amer, Nimra; Nguyen, Mary; Thomas, Torsten

    2016-01-01

    Sponges host complex microbial communities of recognized ecological and biotechnological importance. Extensive cultivation efforts have been made to isolate sponge bacteria, but most still elude cultivation. To identify the bottlenecks of sponge bacterial cultivation, we combined high-throughput 16S rRNA gene sequencing with a variety of cultivation media and incubation conditions. We aimed to determine the extent to which sample processing and cultivation conditions can impact bacterial viability and recovery in culture. We isolated 325 sponge bacteria from six specimens of Cymbastela concentrica and three specimens of Scopalina sp. These isolates were distributed over 37 different genera and 47 operational taxonomic units (defined at 97% 16S rRNA gene sequence identity). The cultivable bacterial community was highly specific to its sponge host and different media compositions yielded distinct microbial isolates. Around 97% of the isolates could be detected in the original sponge and represented a large but highly variable proportion (0.5-92% total abundance, depending on sponge species) of viable bacteria obtained after sample processing, as determined by propidium monoazide selective DNA modification of compromised cells. Our results show that the most abundant viable bacteria are also the most predominant groups found in cultivation, reflecting, to some extent, the relative abundances of the viable bacterial community, rather than the overall community estimated by direct molecular approaches. Cultivation is therefore shaped not only by the growth conditions provided, but also by the different cell viabilities of the bacteria that constitute the cultivation inoculum. These observations highlight the need to perform experiments to assess each method of sample processing for its accurate representation of the actual in situ bacterial community and its yield of viable cells. PMID:27242673

  5. Directionally Solidified Aluminum - 7 wt% Silicon Alloys: Comparison of Earth and International Space Station Processed Samples

    NASA Technical Reports Server (NTRS)

    Grugel, Richard N.; Tewari, Surendra; Rajamure, R. S.; Erdman, Robert; Poirier, David

    2012-01-01

    Primary dendrite arm spacings of Al-7 wt% Si alloy directionally solidified in the low-gravity environment of space (MICAST-6 and MICAST-7: thermal gradient approx. 19 to 26 K/cm, growth speeds varying from 5 to 50 microns/s) show good agreement with the Hunt-Lu model. Primary dendrite trunk diameters of the ISS-processed samples show a good fit with a simple analytical model, proposed here, based on Kirkwood's approach. Natural convection (a) decreases primary dendrite arm spacing and (b) appears to increase primary dendrite trunk diameter.

  6. DEFENSE WASTE PROCESSING FACILITY ANALYTICAL METHOD VERIFICATION FOR THE SLUDGE BATCH 5 QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Edwards, T.; Ajo, H.

    2008-07-25

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem Method, see Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 5 (SB5) SRAT Receipt and SB5 SRAT Product samples. The SB5 SRAT Receipt and SB5 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB5 Batch composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 4 (SB4), to form the SB5 Blend composition. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element in the sludge or used to estimate ratios of compounds in the sludge. A statistical comparison of the data validates the use of the DWPF CC method for SB5 Batch composition. However, the difficulty that was encountered in using the CC method for SB4 brings into question the adequacy of CC for the SB5 Blend. Also, it should be noted that visible solids remained in the final diluted solutions of all samples digested by this method at SRNL (8 samples total), which is typical for the DWPF CC method but not seen in the other methods. Recommendations to the DWPF for application to SB5 based on studies to date: (1) A dissolution study should be performed on the WAPS

  7. Solar Ion Processing of Itokawa Grains: Reconciling Model Predictions with Sample Observations

    NASA Technical Reports Server (NTRS)

    Christoffersen, Roy; Keller, L. P.

    2014-01-01

    Analytical TEM observations of Itokawa grains reported to date show complex solar wind ion processing effects in the outer 30-100 nm of pyroxene and olivine grains. The effects include loss of long-range structural order, formation of isolated internal cavities or "bubbles", and other nanoscale compositional/microstructural variations. None of the effects so far described have, however, included complete ion-induced amorphization. To link the array of observed relationships to grain surface exposure times, we have adapted our previous numerical model for progressive solar ion processing effects in lunar regolith grains to the Itokawa samples. The model uses SRIM ion collision damage and implantation calculations within the framework of a constant-deposited-energy model for amorphization. Inputs include experimentally measured amorphization fluences, a π-steradian variable ion incidence geometry required for a rotating asteroid, and a numerical flux-versus-velocity solar wind spectrum.
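
    The constant-deposited-energy criterion mentioned in this record can be sketched in a few lines: a depth is taken to amorphize once the collisionally deposited energy density there reaches a critical dose, so the critical fluence, and hence exposure time under a given flux, follows directly. All numbers below are hypothetical placeholders, not SRIM output or values from the model described above:

    ```python
    # Sketch of a constant-deposited-energy amorphization criterion.
    # e_c: critical deposited energy density (eV/cm^3) for amorphization.
    # e_per_ion: energy density deposited per unit fluence (eV/cm^3 per ion/cm^2).
    # flux: ion flux (ions/cm^2/s). All values hypothetical.

    def exposure_time_to_amorphize(e_c: float, e_per_ion: float, flux: float) -> float:
        """Time (s) for the deposited energy density to reach e_c."""
        critical_fluence = e_c / e_per_ion  # ions/cm^2
        return critical_fluence / flux

    # Example with placeholder values and a solar-wind-like flux
    t = exposure_time_to_amorphize(e_c=1e24, e_per_ion=1e7, flux=3e8)
    print(f"{t / 3.15e7:.1f} years")
    ```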

  8. Contribution of working memory processes to relational matching-to-sample performance in baboons (Papio papio).

    PubMed

    Maugard, Anaïs; Marzouki, Yousri; Fagot, Joël

    2013-11-01

    Recent studies of monkeys and apes have shown that these animals can solve relational-matching-to-sample (RMTS) problems, suggesting basic abilities for analogical reasoning. However, doubts remain as to the actual cognitive strategies adopted by nonhuman primates in this task. Here, we used dual-task paradigms to test 10 baboons in the RMTS problem under three conditions of memory load. Our three test conditions allowed different predictions, depending on the strategy (i.e., flat memorization of the percept, reencoding of the percept, or relational processing) that they might use to solve RMTS problems. Results support the idea that the baboons process both the items and the abstract (same and different) relations in this task. PMID:23772798

  9. Visual and Auditory Information Processing in Flying Skill Acquisition. Final Report for Period July 1973 through June 1974.

    ERIC Educational Resources Information Center

    Leshowitz, Barry; And Others

    A series of experiments are described which were conducted to further refine experimental paradigms for the investigation of information processing skills relevant to pilot training. A series of tasks have been developed and studied which attempt to measure the individual's information processing capacity as well as his susceptibility to…

  10. How Students Learn: The Validation of a Model of Knowledge Acquisition Using Stimulated Recall of the Learning Process.

    ERIC Educational Resources Information Center

    Nuthall, Graham

    A study of students' thinking processes during their engagement in classroom tasks in science and social studies units in upper elementary school classrooms was conducted as part of a series of studies on learning. As a result of previous studies, a theory of the learning process has been developed. A central component of the theory is the…

  11. 77 FR 52258 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... a proposed rule in the Federal Register at 77 FR 2682 on January 19, 2012. The comment period closed... the Wide Area WorkFlow (WAWF) used to process vouchers. DATES: August 29, 2012. FOR FURTHER... rule merely updates DoD's voucher processing procedures and better accommodates the Wide Area...

  12. Bifidobacteria Enhance Antigen Sampling and Processing by Dendritic Cells in Pediatric Inflammatory Bowel Disease.

    PubMed

    Strisciuglio, Caterina; Miele, Erasmo; Giugliano, Francesca P; Vitale, Serena; Andreozzi, Marialuisa; Vitale, Alessandra; Catania, Maria R; Staiano, Annamaria; Troncone, Riccardo; Gianfrani, Carmen

    2015-07-01

    Bifidobacteria have been reported to reduce inflammation and contribute to intestinal homeostasis. However, the interaction between these bacteria and the gut immune system remains largely unknown. Because of the central role played by dendritic cells (DCs) in immune responses, we examined in vitro the effects of a Bifidobacteria mixture (probiotic) on DC functionality from children with inflammatory bowel disease. DCs obtained from peripheral blood monocytes of patients with Crohn's disease (CD), ulcerative colitis, and noninflammatory bowel disease controls (HC) were incubated with fluorochrome-conjugated particles of Escherichia coli or DQ-Ovalbumin (DQ-OVA) after a pretreatment with the probiotic, to evaluate DC phenotype, antigen sampling and processing. Moreover, cell supernatants were collected to measure tumor necrosis factor alpha, interferon gamma, interleukin 17, and interleukin 10 production by enzyme-linked immunosorbent assay. DCs from CD children showed a higher bacteria particles uptake and DQ-OVA processing after incubation with the probiotic; in contrast, DC from both ulcerative colitis and HC showed no significant changes. Moreover, a marked tumor necrosis factor alpha release was observed in DC from CD after exposure to E. coli particles, whereas the probiotic did not affect the production of this proinflammatory cytokine. In conclusion, the Bifidobacteria significantly improved the antigen uptake and processing by DCs from patients with CD, which are known to present an impaired autophagic functionality, whereas, in DCs from ulcerative colitis and HC, no prominent effect of probiotic mixture was observed. This improvement of antigen sampling and processing could partially solve the impairment of intestinal innate immunity and reduce uncontrolled microorganism growth in the intestine of children with inflammatory bowel disease. PMID:25895109

  13. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  14. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  15. Sampling strategies and post-processing methods for increasing the time resolution of organic aerosol measurements requiring long sample-collection times

    NASA Astrophysics Data System (ADS)

    Modini, Rob L.; Takahama, Satoshi

    2016-07-01

    The composition and properties of atmospheric organic aerosols (OAs) change on timescales of minutes to hours. However, some important OA characterization techniques typically require greater than a few hours of sample-collection time (e.g., Fourier transform infrared (FTIR) spectroscopy). In this study we have performed numerical modeling to investigate and compare sample-collection strategies and post-processing methods for increasing the time resolution of OA measurements requiring long sample-collection times. Specifically, we modeled the measurement of hydrocarbon-like OA (HOA) and oxygenated OA (OOA) concentrations at a polluted urban site in Mexico City, and investigated how to construct hourly resolved time series from samples collected for 4, 6, and 8 h. We modeled two sampling strategies - sequential and staggered sampling - and a range of post-processing methods including interpolation and deconvolution. The results indicated that relative to the more sophisticated and costly staggered sampling methods, linear interpolation between sequential measurements is a surprisingly effective method for increasing time resolution. Additional error can be added to a time series constructed in this manner if a suboptimal sequential sampling schedule is chosen. Staggering measurements is one way to avoid this effect. There is little to be gained from deconvolving staggered measurements, except at very low values of random measurement error (< 5 %). Assuming 20 % random measurement error, one can expect average recovery errors of 1.33-2.81 µg m-3 when using 4-8 h-long sequential and staggered samples to measure time series of concentration values ranging from 0.13-29.16 µg m-3. For 4 h samples, 19-47 % of this total error can be attributed to the process of increasing time resolution alone, depending on the method used, meaning that measurement precision would only be improved by 0.30-0.75 µg m-3 if samples could be collected over 1 h instead of 4 h. Devising a
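
    The sequential-sampling-plus-interpolation idea this record reports as surprisingly effective can be sketched numerically: hourly "true" concentrations are averaged over 4 h collection windows, then an hourly series is rebuilt by linear interpolation between window midpoints. The synthetic diurnal series below is illustrative only, not the study's Mexico City data:

    ```python
    import numpy as np

    # Synthetic hourly OA concentrations with a diurnal cycle plus noise
    rng = np.random.default_rng(0)
    hours = np.arange(48)
    true = 10 + 5 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, hours.size)

    # Sequential sampling: average over consecutive 4 h collection windows
    window = 4
    starts = np.arange(0, hours.size, window)
    measured = np.array([true[s:s + window].mean() for s in starts])
    midpoints = starts + (window - 1) / 2.0

    # Reconstruct an hourly series by linear interpolation between midpoints
    recovered = np.interp(hours, midpoints, measured)
    rmse = np.sqrt(np.mean((recovered - true) ** 2))
    print(f"RMSE of hourly reconstruction: {rmse:.2f}")
    ```

    Staggered sampling and deconvolution, as the abstract notes, mainly pay off when random measurement error is very low; this sketch covers only the simpler sequential case.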

  16. Some results of processing NURE geochemical sampling in the northern Rocky Mountain area

    SciTech Connect

    Thayer, P.A.; Cook, J.R.; Price, V. Jr.

    1980-01-01

    The National Uranium Resource Evaluation (NURE) program was begun in the spring of 1973 to evaluate domestic uranium resources in the continental United States and to identify areas favorable for uranium exploration. The significance of the distribution of uranium in natural waters and sediments will be assessed as an indicator of favorable areas for the discovery of uranium deposits. This paper is oriented primarily to the discussion of stream sediments. Data for the Challis 1° × 2° NTMS quadrangle will be used for specific examples of NURE data processing. A high-capacity neutron activation analysis facility at SRL is used to determine uranium and about 19 other elements in hydrogeochemical samples. Evaluation of the areal distributions of uranium ratios demonstrates that most of the high U/Hf, U/Th and U/(Th + Hf) ratios occur scattered throughout the western two-thirds of the quadrangle. Most of the higher ratio values are found in samples taken at sites underlain by granitic rocks of the Idaho batholith or Tertiary-age plutons.

  17. Microstructural Evaluation and Comparison of Solder Samples Processed Aboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Grugel, R. N.; Hua, F.; Anilkumar, A. V.

    2008-01-01

    Samples from the In-Space Soldering Investigation (ISSI), conducted aboard the International Space Station (ISS), are being examined for post-solidification microstructural development and porosity distribution. In this preliminary study, the internal structures of two ISSI-processed samples are compared. In one case 10 cm of rosin-core solder was wrapped around a coupon wire and melted by conduction, whereas in the other a comparable length of solder was melted directly onto the hot wire; in both cases the molten solder formed ellipsoidal blobs, a shape that was maintained during subsequent solidification. In the former case, there is clear evidence of porosity throughout the sample, and an accumulation of larger pores near the hot end that implies thermocapillary-induced migration and eventual coalescence of the flux vapor bubbles. In the latter case, when solder was fed onto the hot wire, part of the flux constituting the solder core was introduced into and remained within the liquid solder ball, becoming entombed upon solidification. In both cases the consequent porosity, particularly at a solder/contact interface, is very undesirable: in addition to compromising the desired electrical and thermal conductivity, it promotes mechanical failure.

  18. Automated CBED processing: sample thickness estimation based on analysis of zone-axis CBED pattern.

    PubMed

    Klinger, M; Němec, M; Polívka, L; Gärtnerová, V; Jäger, A

    2015-03-01

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar(+) ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. PMID:25544679
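
    The matching step at the heart of this tool, comparing an experimental pattern against a library simulated for known thicknesses and taking the best match as the estimate, can be sketched with 1-D stand-in profiles. The sinusoidal "patterns" and thickness grid below are entirely synthetic, not CBED simulations:

    ```python
    import numpy as np

    def best_thickness(experimental, library):
        """library: dict mapping thickness (nm) -> simulated 1-D profile.
        Return the thickness whose profile has the smallest sum of squared
        differences from the experimental profile."""
        return min(library, key=lambda t: np.sum((library[t] - experimental) ** 2))

    # Synthetic library: profile shape varies with "thickness" t
    x = np.linspace(0, 1, 200)
    library = {t: np.sin(2 * np.pi * x * t / 50.0) for t in range(50, 151, 10)}

    # "Experimental" pattern: the 90 nm profile plus measurement noise
    experimental = library[90] + np.random.default_rng(1).normal(0, 0.05, x.size)

    print(best_thickness(experimental, library))  # recovers 90 for this library
    ```

    The real tool adds the steps the abstract describes, disk detection, pattern localization, and coordinate-system unification, before this disk-by-disk comparison.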

  19. Investigation of the kinetic process of solid phase microextraction in complex sample.

    PubMed

    Jiang, Ruifen; Xu, Jianqiao; Lin, Wei; Wen, Sijia; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-11-01

    The presence of complex matrix in the aquatic system affects the environmental behavior of hydrophobic organic compounds (HOCs). In the current study, an automated solid-phase microextraction (SPME) desorption method was employed to study the effect of 2-hydroxypropyl-β-cyclodextrin (β-HPCD) on the kinetic process of 5 selected polyaromatic hydrocarbons (PAHs) desorbing from the fiber in an aqueous sample. The results showed that the added β-HPCD enhanced the desorption rates of PAHs from the SPME fiber coating, and the enhancement effect could be predicted by a proposed theoretical model. Based on this model, the kinetic parameters of organic compounds desorbing from the SPME fiber can be determined, and the calculated results showed good agreement with the experimental data. In addition, the effect of temperature on the desorption kinetics was investigated. The results showed that the SPME desorption time constant increased as the sampling temperature was elevated, following the Arrhenius equation. Higher temperature also facilitated the desorption of HOCs from the bound matrix, thereby increasing the lability of the bound compounds. Finally, a calibration method based on the proposed theoretical model was developed and applied for the analysis of an unknown sample. PMID:26572846
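
    The temperature dependence reported here can be illustrated with a simple first-order desorption model whose rate constant follows the Arrhenius equation. The activation energy and pre-exponential factor below are hypothetical placeholders, not the study's fitted parameters:

    ```python
    import math

    # First-order desorption sketch with an Arrhenius-type rate constant.
    # Ea (activation energy) and A (pre-exponential factor) are hypothetical.

    def rate_constant(temp_k: float, ea_j_mol: float = 30e3, a: float = 1e3) -> float:
        """Arrhenius rate constant k = A * exp(-Ea / (R * T)), in 1/s."""
        R = 8.314  # gas constant, J/(mol K)
        return a * math.exp(-ea_j_mol / (R * temp_k))

    def fraction_remaining(t_s: float, temp_k: float) -> float:
        """Fraction of analyte still on the fiber after t_s seconds,
        assuming simple first-order desorption."""
        return math.exp(-rate_constant(temp_k) * t_s)

    # Desorption speeds up with temperature: less analyte remains after 10 min
    for temp in (288.0, 298.0, 308.0):
        print(temp, round(fraction_remaining(600.0, temp), 3))
    ```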

  20. Data processing pipeline for a time-sampled imaging Fourier transform spectrometer

    NASA Astrophysics Data System (ADS)

    Naylor, David A.; Fulton, Trevor R.; Davis, Peter W.; Chapman, Ian M.; Gom, Brad G.; Spencer, Locke D.; Lindner, John V.; Nelson-Fitzpatrick, Nathan E.; Tahic, Margaret K.; Davis, Gary R.

    2004-10-01

    Imaging Fourier transform spectrometers (IFTS) are becoming the preferred systems for remote sensing spectral imaging applications because of their ability to provide, simultaneously, both high spatial and spectral resolution images of a scene. IFTS can be operated in either step-and-integrate or rapid-scan modes, where it is common practice to sample interferograms at equal optical path difference intervals. The step-and-integrate mode requires a translation stage with fast and precise point-to-point motion and additional external trigger circuitry for the detector focal plane array (FPA), and produces uniformly position-sampled interferograms which can be analyzed using standard FFT routines. In the rapid-scan mode, the translation stage is continuously moving and interferograms are often acquired at the frame-rate of the FPA. Since all translation stages have associated velocity errors, the resulting interferograms are sampled at non-uniform intervals of optical path difference, which requires more sophisticated analysis. This paper discusses the processing pipeline which is being developed for the analysis of the non-uniform rapid-scan data produced by the Herschel/SPIRE IFTS.
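
    The core complication this pipeline addresses, interferograms sampled at non-uniform optical path difference (OPD) because of stage velocity errors, can be sketched as a resample-then-FFT step. The monochromatic cosine interferogram and jitter model below are synthetic illustrations, not SPIRE data or the actual pipeline algorithm:

    ```python
    import numpy as np

    # A single spectral line at wavenumber sigma produces a cosine interferogram.
    # Stage velocity errors make the actual OPD sampling non-uniform.
    rng = np.random.default_rng(2)
    n = 512
    opd_uniform = np.linspace(0.0, 1.0, n)            # ideal OPD grid (cm)
    opd_actual = np.sort(opd_uniform + rng.normal(0, 2e-4, n))  # jittered positions

    sigma = 40.0                                      # line position (cm^-1)
    interferogram = np.cos(2 * np.pi * sigma * opd_actual)

    # Resample onto the uniform grid (linear here; splines are also common),
    # after which a standard FFT recovers the spectrum
    resampled = np.interp(opd_uniform, opd_actual, interferogram)
    spectrum = np.abs(np.fft.rfft(resampled))
    wavenumbers = np.fft.rfftfreq(n, d=opd_uniform[1] - opd_uniform[0])

    print(f"peak at {wavenumbers[np.argmax(spectrum)]:.1f} cm^-1")
    ```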