Science.gov

Sample records for acquisition sample processing

  1. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory is a highly-mechanized, Rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the Rover. SA/SPaH utilizes 17 actuated degrees-of-freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so that some functionality is retained even with the loss of a degree-of-freedom. Intentional dynamic environments are created to move samples, while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally-created dynamic environment and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low-pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.

  2. An Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    NASA Astrophysics Data System (ADS)

    Beegle, L. W.; Anderson, R. C.; Hurowitz, J. A.; Jandura, L.; Limonadi, D.

    2012-12-01

    The Mars Science Laboratory (MSL) mission landed on Mars on August 5, 2012. The rover and its scientific payload are designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem is the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed, separated into fine particles, and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, the Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g., the Mast Cameras (MastCam) and the Chemistry and Camera instrument (ChemCam)). It is expected that the SA/SPaH system will have produced a scooped sample, and possibly a drilled sample, in the first 90 sols of the mission. Results from these activities and the ongoing testing program will be presented.

  3. Collecting Samples in Gale Crater, Mars; an Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    NASA Astrophysics Data System (ADS)

    Anderson, R. C.; Jandura, L.; Okon, A. B.; Sunshine, D.; Roumeliotis, C.; Beegle, L. W.; Hurowitz, J.; Kennedy, B.; Limonadi, D.; McCloskey, S.; Robinson, M.; Seybold, C.; Brown, K.

    2012-09-01

    The Mars Science Laboratory Mission (MSL), scheduled to land on Mars in the summer of 2012, consists of a rover and a scientific payload designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem will be the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed, separated into fine particles, and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, the Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g., the Mast Cameras (MastCam) and the Chemistry and Camera instrument (ChemCam)).

  4. MSL's Widgets: Adding Robustness to Martian Sample Acquisition, Handling, and Processing

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Kennedy, Brett; Lin, Justin; DeGrosse, Patrick; Cady, Ian; Onufer, Nicholas; Sigel, Deborah; Jandura, Louise; Anderson, Robert; Katz, Ira; Slimko, Eric; Limonadi, Daniel

    2013-01-01

    Mars Science Laboratory's (MSL) Sample Acquisition Sample Processing and Handling (SA-SPaH) system is one of the most ambitious terrain-interaction and manipulation systems ever built and successfully used beyond planet Earth. Mars has a ruthless environment that has surprised many who have tried to explore there. The robustness widget program was implemented by the MSL project to help ensure that the SA-SPaH system would be robust enough to withstand the surprises of this ruthless Martian environment. The robustness widget program was carried out under extreme schedule pressure and responsibility, but was accomplished with resounding success. This paper offers a behind-the-scenes look at MSL's robustness widgets: the particle fun zone, the wind guards, and the portioner pokers.

  5. Rockballer Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Giersch, Louis R.; Cook, Brant T.

    2013-01-01

    It would be desirable to acquire rock and/or ice samples that extend below the surface of the parent rock or ice in extraterrestrial environments such as the Moon, Mars, comets, and asteroids. Such samples would allow measurements to be made further back into the geologic history of the rock, providing critical insight into the history of the local environment and the solar system. Such samples could also be necessary for sample return mission architectures that would acquire samples from extraterrestrial environments for return to Earth for more detailed scientific investigation.

  6. Resource Prospector Instrumentation for Lunar Volatiles Prospecting, Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Captain, J.; Elphic, R.; Colaprete, A.; Zacny, Kris; Paz, A.

    2016-01-01

    the traverse path. The NS will map the water-equivalent hydrogen concentration as low as 0.5% by weight to an 80 centimeter depth as the rover traverses the lunar landscape. The NIR spectrometer will measure surficial H2O/OH as well as general mineralogy. When the prospecting instruments identify a potential volatile-rich area during the course of a traverse, the prospect is then mapped out and the most promising location identified. An augering drill capable of sampling to a depth of 100 centimeters will excavate regolith for analysis. A quick assay of the drill cuttings will be made using an operations camera and NIR spectrometer. With the water depth confirmed by this first augering activity, a regolith sample may be extracted for processing. The drill will deliver the regolith sample to a crucible that will be sealed and heated. Evolved volatiles will be measured by a gas chromatograph-mass spectrometer and the water will be captured and photographed. RP is a solar-powered mission, which, given the polar location, translates to a relatively short mission duration on the order of 4-15 days. This short mission duration drives the concept of operations, instrumentation, and data analysis towards critical real-time analysis and decision support. Previous payload field tests have increased the fidelity of the hardware, software, and mission operations. Current activities include a mission-level field test to optimize interfaces between the payload and rover as well as better understand the interaction of the science and rover teams during the mission timeline. This paper will include the current status of the science instruments on the payload as well as the integrated field test occurring in the fall of 2015. The concept of operations will be discussed, including the real-time science and engineering decision-making process based on the critical data from the instrumentation. The path to flight will be discussed along with the approach to this ambitious low-cost mission.

  7. Sample Acquisition Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Carle, Glenn C.; Stratton, David M.; Valentin, Jose R.; DeVincenzi, Donald (Technical Monitor)

    1999-01-01

    Exobiology flight experiments involve complex analyses conducted in environments far different from those encountered in terrestrial applications. A major part of the analytical challenge is often the selection, acquisition, delivery and, in some cases, processing of a sample suitable for the analytical requirements of the mission. The added complications of severely limited resources and sometimes rigid time constraints combine to make sample acquisition potentially a major obstacle to successful analyses. Potential samples come in a wide range, including planetary atmospheric gas and aerosols (from a wide variety of pressures), planetary soil or rocks, dust and ice particles streaming off a comet, and cometary surface ice and rocks. Methods to collect and process samples are often mission-specific, requiring continual development of innovative concepts and mechanisms. These methods must also maintain the integrity of the sample for the experimental results to be meaningful. We present here sample acquisition systems employed on past missions and proposed for future missions.

  8. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight into overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.
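
    As a rough illustration of the hierarchical-state-machine pattern the abstract describes, the sketch below models a generic actuator-motion module with a nested motion substate and a simple fault trap. The state names, transitions, and fault check are invented for illustration and do not reflect the actual MSL flight software interfaces.

```python
# Minimal hierarchical state machine for a hypothetical actuator-motion
# module: a top-level IDLE/MOVING/FAULT machine with nested substates
# while MOVING, plus a fault-protection check on every control cycle.
# All names and transitions are illustrative assumptions.

class ActuatorFSM:
    def __init__(self):
        self.state = "IDLE"      # top-level state
        self.substate = None     # nested state, used only while MOVING
        self.target = None

    def command_move(self, target):
        """Accept a motion command only when idle."""
        if self.state != "IDLE":
            return False
        self.state, self.substate = "MOVING", "ACCEL"
        self.target = target
        return True

    def tick(self, position, motor_ok):
        """One control cycle: advance substates, trap faults."""
        if self.state == "MOVING":
            if not motor_ok:                  # fault protection path
                self.state, self.substate = "FAULT", None
            elif self.substate == "ACCEL":
                self.substate = "CRUISE"
            elif self.substate == "CRUISE" and position >= self.target:
                self.state, self.substate = "IDLE", None  # motion complete
        return self.state

fsm = ActuatorFSM()
fsm.command_move(target=10)
print(fsm.tick(position=0, motor_ok=True))    # still MOVING
print(fsm.tick(position=10, motor_ok=True))   # target reached: IDLE
```

    A real flight module would layer command validation, telemetry, and recovery transitions out of FAULT on top of this skeleton; the point here is only the nesting of motion substates under a top-level machine with a fault trap.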

  9. Sample acquisition and instrument deployment

    NASA Technical Reports Server (NTRS)

    Boyd, Robert C.

    1995-01-01

    Progress is reported in developing the Sample Acquisition and Instrument Deployment (SAID) system, a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four-degree-of-freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. The systems have been fabricated and tested in environmental chambers, and have undergone soil testing and robotic control testing.

  10. The alteration of icy samples during sample acquisition

    NASA Astrophysics Data System (ADS)

    Mungas, G.; Bearman, G.; Beegle, L. W.; Hecht, M.; Peters, G. H.; Glucoft, J.; Strothers, K.

    2006-12-01

    Valid in situ scientific studies require both that samples be analyzed in as pristine a condition as possible and that any modification from the pristine to the sampled state be well understood. While samples with low to high ice concentration are critical for the study of astrobiology and geology, they pose problems with respect to the sample acquisition, preparation and distribution (SPAD) systems upon which the analytical instruments depend. Most significant of the processes that occur during SPAD is sublimation or melting caused by thermal loading from drilling, coring, etc., as well as by exposure to a dry, low-pressure ambient environment. These processes can alter the sample, as well as generate meta-stable liquid water that can refreeze in the sample transfer mechanisms, interfering with proper operation and creating cross-contamination. We have investigated and quantified the loss of volatiles such as H2O, CO, CO2, and organics contained within icy and powdered samples when acquired, processed and transferred. During development of the MSL rock crusher, for example, ice was observed to pressure-fuse and stick to the side even at -70 C. We have investigated sublimation during sample acquisition at Martian temperature and pressure for samples with water/dirt ratios ranging from 10 to 100. Using the RASP that will be on Phoenix, we have measured sublimation of ice during excavation at Martian pressure and find that the sublimation losses can range from 10 to 50 percent of the water. It is the thermal conductivity of the soil that determines local heat transport: how much of the sample acquisition energy is wicked away into the soil and how much goes into the sample. Modeling of sample acquisition methods requires measurement of these parameters. A two-phase model exists for thermal conductivity as a function of dirt/ice ratio, but it needed to be validated. We used an ASTM method for measuring thermal conductivity and implemented it in the laboratory.
The major conclusion is
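
    The abstract does not give the two-phase conductivity model itself; as a hedged sketch of how effective conductivity can depend on ice fraction, the snippet below uses a common geometric-mean mixing rule with approximate handbook conductivities for water ice and dry granular soil. Both the mixing rule and the component values are generic illustrative assumptions, not the validated model from this work.

```python
# Illustrative two-phase thermal-conductivity estimate for an ice/soil mix.
# The geometric-mean mixing rule and the component conductivities below are
# textbook-level approximations chosen for illustration only.

K_ICE = 2.2    # W/(m K), water ice near -70 C (approximate)
K_SOIL = 0.1   # W/(m K), dry granular soil/regolith (approximate)

def k_effective(ice_fraction):
    """Geometric-mean estimate of the mixture's thermal conductivity."""
    return K_ICE ** ice_fraction * K_SOIL ** (1.0 - ice_fraction)

for phi in (0.1, 0.5, 0.9):
    print(f"ice fraction {phi:.1f}: k_eff ~ {k_effective(phi):.2f} W/(m K)")
```

    Even this crude model shows the conductivity spanning more than an order of magnitude between ice-poor and ice-rich mixes, which is why measured conductivities are needed to model how much drilling energy is wicked into the surrounding soil versus deposited in the sample.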

  11. SNAP: Simulating New Acquisition Processes

    NASA Technical Reports Server (NTRS)

    Alfeld, Louis E.

    1997-01-01

    Simulation models of acquisition processes range in scope from isolated applications to the 'Big Picture' captured by SNAP technology. SNAP integrates a family of models to portray the full scope of acquisition planning and management activities, including budgeting, scheduling, testing, and risk analysis. SNAP replicates the dynamic management processes that underlie design, production, and life-cycle support. SNAP provides the unique 'Big Picture' capability needed to simulate the entire acquisition process and explore the 'what-if' tradeoffs and consequences of alternative policies and decisions. Comparisons of cost, schedule, and performance tradeoffs help managers choose the lowest-risk, highest-payoff option at each step in the acquisition process.

  12. Data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Tsuda, Toshitaka

    1989-10-01

    Fundamental methods of signal processing used in normal mesosphere stratosphere troposphere (MST) radar observations are described. Complex time series of received signals obtained in each range gate are converted into Doppler spectra, from which the mean Doppler shift, spectral width and signal-to-noise ratio (SNR) are estimated. These spectral parameters are further utilized to study characteristics of scatterers and atmospheric motions.
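
    The moment-based estimation described above can be sketched briefly: the complex time series from one range gate is transformed into a Doppler power spectrum, a noise floor is subtracted, and the low-order spectral moments give the signal power, mean Doppler shift, and spectral width. The sampling rate, tone frequency, and noise level below are synthetic test values, not MST radar parameters.

```python
import numpy as np

# Doppler-spectrum moment estimation sketch for one range gate.
# A synthetic 5 Hz Doppler-shifted echo plus complex noise stands in
# for the received signal; all parameters are illustrative assumptions.

fs = 100.0                      # sampling rate, Hz (assumed)
n = 500                         # record length (5 s)
t = np.arange(n) / fs
rng = np.random.default_rng(0)

x = np.exp(2j * np.pi * 5.0 * t)                      # echo at +5 Hz
x += 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / fs))
power = np.fft.fftshift(np.abs(np.fft.fft(x)) ** 2)   # Doppler spectrum

noise_floor = np.median(power)                # crude noise-floor estimate
sig = np.clip(power - noise_floor, 0.0, None) # noise-subtracted spectrum
p0 = sig.sum()                                # zeroth moment: signal power
mean_doppler = (freqs * sig).sum() / p0       # first moment: mean shift
width = np.sqrt(((freqs - mean_doppler) ** 2 * sig).sum() / p0)

print(f"mean Doppler shift ~ {mean_doppler:.2f} Hz, "
      f"spectral width ~ {width:.2f} Hz")
```

    In practice the raw periodogram would be averaged over many records and the noise floor estimated more carefully, but the moment definitions are the same ones used to derive radial velocity and turbulence estimates from MST radar spectra.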

  13. Improving the Acquisition and Management of Sample Curation Data

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.

  14. Mars sample return: Site selection and sample acquisition study

    NASA Technical Reports Server (NTRS)

    Nickle, N. (Editor)

    1980-01-01

    Various vehicle and mission options were investigated for the continued exploration of Mars; the cost of a minimum sample return mission was estimated; options and concepts were synthesized into program possibilities; and recommendations for the next Mars mission were made to the Planetary Program office. Specific sites and all relevant spacecraft and ground-based data were studied in order to determine: (1) the adequacy of presently available data for identifying landing sites for a sample return mission that would assure the acquisition of material from the most important geologic provinces of Mars; (2) the degree of surface mobility required to assure sample acquisition at these sites; (3) techniques to be used in the selection and drilling of rock samples; and (4) the degree of mobility required at the two Viking sites to acquire these samples.

  15. Planetary protection issues for Mars sample acquisition flight projects.

    PubMed

    Barengoltz, J B

    2000-01-01

    The planned NASA sample acquisition flight missions to Mars pose several interesting planetary protection issues. In addition to the usual forward contamination procedures for the adequate protection of Mars for the sake of future missions, there are reasons to ensure that the sample is not contaminated by terrestrial microbes from the acquisition mission. Recent recommendations by the Space Studies Board (SSB) of the National Research Council (United States) indicate that the scientific integrity of the sample is a planetary protection concern (SSB, 1997). Also, as a practical matter, a contaminated sample would interfere with the process for its release from quarantine after return for distribution to the interested scientists. These matters are discussed in terms of the first planned acquisition mission. PMID:12038483

  17. Sample Acquisition and Instrument Deployment (SAID)

    NASA Technical Reports Server (NTRS)

    Boyd, Robert C.

    1994-01-01

    This report details the interim progress for contract NASW-4818, Sample Acquisition and Instrument Deployment (SAID), a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four degree of freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. A passively braked shape memory actuator with the ability to measure load has been developed. The wrist also contains a mechanism which locks the lid output to the bucket so that objects can be grasped and released for instrument deployment. The wrist actuator has been tested for operational power and mechanical functionality at Mars environmental conditions. The torque which the actuator can produce has been measured. Also, testing in Mars analogous soils has been performed.

  19. Autonomous Surface Sample Acquisition for Planetary and Lunar Exploration

    NASA Astrophysics Data System (ADS)

    Barnes, D. P.

    2007-08-01

    Surface science sample acquisition is a critical activity within any planetary and lunar exploration mission, and our research is focused upon the design, implementation, experimentation and demonstration of an onboard autonomous surface sample acquisition capability for a rover equipped with a robotic arm upon which are mounted appropriate science instruments. Images captured by a rover stereo camera system can be processed using shape from stereo methods and a digital elevation model (DEM) generated. We have developed a terrain feature identification algorithm that can determine autonomously from DEM data suitable regions for instrument placement and/or surface sample acquisition. Once identified, surface normal data can be generated autonomously which are then used to calculate an arm trajectory for instrument placement and sample acquisition. Once an instrument placement and sample acquisition trajectory has been calculated, a collision detection algorithm is required to ensure the safe operation of the arm during sample acquisition. We have developed a novel adaptive 'bounding spheres' approach to this problem. Once potential science targets have been identified, and these are within the reach of the arm and will not cause any undesired collision, then the 'cost' of executing the sample acquisition activity is required. Such information, which includes power expenditure and duration, can be used to select the 'best' target from a set of potential targets. We have developed a science sample acquisition resource requirements calculation that utilises differential inverse kinematics methods to yield a high fidelity result, thus improving upon simple 1st order approximations. To test our algorithms a new Planetary Analogue Terrain (PAT) Laboratory has been created that has a terrain region composed of Mars Soil Simulant-D from DLR Germany, and rocks that have been fully characterised in the laboratory.
These have been donated by the UK Planetary Analogue Field Study
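
    The core of a bounding-spheres collision check like the one described above can be sketched in a few lines: each arm link and each terrain obstacle is wrapped in a sphere, and a pairwise squared-distance test flags any intersection. The centers and radii below are invented for illustration; the paper's adaptive refinement of the sphere set is not reproduced here.

```python
# Minimal bounding-spheres collision-check sketch: wrap arm links and
# terrain obstacles in spheres and test every pair for intersection.
# Geometry (centers in meters, radii) is invented for illustration.

def spheres_collide(c1, r1, c2, r2):
    """True if the two bounding spheres intersect (squared-distance test)."""
    d2 = sum((a - b) ** 2 for a, b in zip(c1, c2))
    return d2 <= (r1 + r2) ** 2

arm_spheres = [((0.0, 0.0, 0.5), 0.15), ((0.0, 0.4, 0.3), 0.10)]
rock_spheres = [((0.0, 0.45, 0.25), 0.08)]

collision = any(
    spheres_collide(ca, ra, cb, rb)
    for ca, ra in arm_spheres
    for cb, rb in rock_spheres
)
print("collision" if collision else "clear")
```

    Spheres make the test cheap and conservative: a pair can falsely report contact when the enclosed shapes are clear, but never the reverse, which is the safe failure mode when planning arm trajectories near terrain.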

  20. Harpoon-based sample Acquisition System

    NASA Astrophysics Data System (ADS)

    Bernal, Javier; Nuth, Joseph; Wegel, Donald

    2012-02-01

    Acquiring information about the composition of comets, asteroids, and other near-Earth objects is very important because they may contain the primordial ooze of the solar system and the origins of life on Earth. Sending a spacecraft is the obvious answer, but once it gets there it needs to collect and analyze samples. Conceptually, a drill or a shovel would work, but both require something extra to anchor the spacecraft to the comet, adding to the cost and complexity of the spacecraft. Since comets and asteroids are very low-gravity objects, drilling becomes a problem: without a grappling mechanism, the drill would push the spacecraft off the surface. Harpoons have been proposed as grappling mechanisms in the past and are currently flying on missions such as Rosetta. We propose to use a hollow, core-sampling harpoon to act as the anchoring mechanism as well as the sample-collecting device. By combining these two functions, mass is reduced, more samples can be collected, and the spacecraft can carry more propellant. Although challenging, returning the collected samples to Earth allows them to be analyzed in laboratories in much greater detail than is possible on a spacecraft. Also, bringing the samples back to Earth allows future generations to study them.

  1. Auditory Processing Disorder and Foreign Language Acquisition

    ERIC Educational Resources Information Center

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  2. Generalized analog thresholding for spike acquisition at ultralow sampling rates.

    PubMed

    He, Bryan D; Wein, Alex; Varshney, Lav R; Kusuma, Julius; Richardson, Andrew G; Srinivasan, Lakshminarayan

    2015-07-01

    Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEA) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712
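
    The analog-thresholding idea that gAT generalizes can be illustrated with a toy simulation: a comparator converts the waveform into a binary above-threshold signal, an analog integrator accumulates it, and the integrator is read out at a very low rate. Differencing successive low-rate samples then recovers the total time spent above threshold (total spike width) in each interval. All waveform parameters below are synthetic, and this shows only the basic integrator intuition, not the paper's closed-form gAT reconstruction.

```python
import numpy as np

# Toy AT/integrator simulation: two 1-ms spikes in a 1-s record, a
# comparator, an integrator, and a 10 Hz readout of the integrator.
# All signal parameters are synthetic illustrative values.

fs = 30000                       # simulated "analog" rate, 30 kHz
low_rate = 10                    # integrator readout rate, Hz
dur = 1.0
t = np.arange(int(fs * dur)) / fs

signal = np.zeros_like(t)
for ts in (0.25, 0.62):          # two 1-ms spikes
    signal[(t >= ts) & (t < ts + 0.001)] = 1.0

above = signal > 0.5             # comparator output (binary)
integ = np.cumsum(above) / fs    # integrator state, in seconds

# read the integrator once per low-rate period (10 readouts over 1 s)
idx = np.arange(1, int(low_rate * dur) + 1) * (fs // low_rate) - 1
samples = integ[idx]
widths = np.diff(np.concatenate(([0.0], samples)))
print(widths.round(4))           # ~0.001 s in the two intervals with spikes
```

    Even at a 10 Hz readout, each interval's cumulative spike width is recovered exactly; the contribution of gAT is to go beyond this and resolve individual spike times and widths within an interval in closed form.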

  4. Rotary Percussive Sample Acquisition Tool (SAT): Hardware Development and Testing

    NASA Technical Reports Server (NTRS)

    Klein, Kerry; Badescu, Mircea; Haddad, Nicolas; Shiraishi, Lori; Walkemeyer, Phillip

    2012-01-01

    In support of a potential Mars Sample Return (MSR) mission, an Integrated Mars Sample Acquisition and Handling (IMSAH) architecture has been proposed to provide a means for rover-based end-to-end sample capture and caching. A key enabling feature of the architecture is the use of a low-mass Sample Acquisition Tool (SAT) that is capable of drilling and capturing rock cores directly within a sample tube in order to maintain sample integrity and prevent contamination across the sample chain. As such, this paper will describe the development and testing of a low-mass rotary percussive SAT that has been shown to provide a means for core generation, fracture, and capture.

  5. Processability Theory and German Case Acquisition

    ERIC Educational Resources Information Center

    Baten, Kristof

    2011-01-01

    This article represents the first attempt to formulate a hypothetical sequence for German case acquisition by Dutch-speaking learners on the basis of Processability Theory (PT). It will be argued that case forms emerge corresponding to a development from lexical over phrasal to interphrasal morphemes. This development, however, is subject to a…

  6. Probabilistic models of language processing and acquisition.

    PubMed

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.

  7. Guiding visual attention during acquisition of matching-to-sample.

    PubMed

    Mackay, Harry A; Soraci, Sal A; Carlin, Michael T; Dennis, Nancy A; Strawbridge, Christina P

    2002-11-01

    Matching-to-sample skills are involved in language acquisition and the development of basic reading and counting abilities. The rapid, even errorless, induction of matching performances in young children and individuals with mental retardation was demonstrated here through the structuring of a visual array so as to promote detection of the relevant stimulus. Implications for theory and application are discussed.

  8. Acquisition of data by whole sample enrichment, real-time polymerase chain reaction for development of a process risk model for Salmonella and chicken parts

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Process risk models predict consumer exposure and response to pathogens in food produced by specific scenarios. A process risk model for Salmonella and chicken parts was developed that consisted of four unit operations (pathogen events): 1) meal preparation (contamination); 2) cooking (death); 3) s...

  9. Acquisition and Retaining Granular Samples via a Rotating Coring Bit

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Badescu, Mircea; Sherrit, Stewart

    2013-01-01

    This device takes advantage of the centrifugal forces that are generated when a coring bit is rotated: a granular sample introduced into the bit while it is spinning adheres to the internal wall of the bit, where it compacts itself against the wall. The bit can be specially designed to increase the effectiveness of regolith capture while turning and penetrating the subsurface. The bit teeth can be oriented such that they direct the regolith toward the bit axis during the rotation of the bit. The bit can be designed with an internal flute that directs the regolith upward inside the bit. Both the teeth and the flute can be implemented in the same bit. The bit can also be designed with an internal spiral into which the various particles wedge. In another implementation, the bit can be designed to collect regolith primarily from a specific depth. For that implementation, the bit can be designed such that when turning one way, the teeth guide the regolith outward of the bit, and when turning in the opposite direction, the teeth guide the regolith inward into the bit's internal section. This mechanism can be implemented with or without an internal flute. The device is based on the use of a spinning coring bit (hollow interior) as a means of retaining a granular sample; the acquisition is done by inserting the bit into the subsurface of a regolith, soil, or powder. To demonstrate the concept, a commercial drill and a coring bit were used. The bit was turned and inserted into soil that was contained in a bucket. While spinning the bit (at speeds of 600 to 700 RPM), the drill was lifted and the soil was retained inside the bit. To prove this point, the drill was turned horizontally, and the acquired soil was still inside the bit.
The basic theory behind the process of retaining unconsolidated mass that can be acquired by the centrifugal forces of the bit is determined by noting that in order to stay inside the interior of the bit, the
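The retention condition described above can be made concrete with a simple Coulomb-friction sketch (the friction coefficient and bit radius below are illustrative assumptions, not values from the record): a grain pressed against the wall by centrifugal acceleration ω²r is held when μ·ω²r ≥ g, which gives a minimum spin rate.

```python
import math

def min_retention_rpm(mu: float, radius_m: float, g: float = 9.81) -> float:
    """Minimum bit speed (RPM) for wall friction to hold a grain against
    gravity: mu * omega^2 * r >= g  =>  omega >= sqrt(g / (mu * r))."""
    omega = math.sqrt(g / (mu * radius_m))  # rad/s
    return omega * 60.0 / (2.0 * math.pi)

# Illustrative values: friction coefficient 0.5, 1 cm interior bit radius.
print(f"minimum spin rate: {min_retention_rpm(0.5, 0.01):.0f} RPM")
```

With a lower assumed friction coefficient of about 0.25 the threshold comes out near 600 RPM, which is at least consistent in order of magnitude with the 600 to 700 RPM used in the demonstration.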

  10. Adaptive processing for enhanced target acquisition

    NASA Astrophysics Data System (ADS)

    Page, Scott F.; Smith, Moira I.; Hickman, Duncan; Bernhardt, Mark; Oxford, William; Watson, Norman; Beath, F.

    2009-05-01

    Conventional air-to-ground target acquisition processes treat the image stream in isolation from external data sources. This ignores information that may be available through modern mission management systems and that could be fused into the detection process to provide enhanced performance. By way of an example relating to target detection, this paper explores the use of a priori knowledge and other sensor information in an adaptive architecture with the aim of enhancing performance in decision making. The approach taken here is to use knowledge of target size, terrain elevation, sensor geometry, solar geometry, and atmospheric conditions to characterise the expected spatial and radiometric characteristics of a target in terms of probability density functions. An important consideration in the construction of the target probability density functions is the known error in the a priori knowledge. Potential targets are identified in the imagery, and their spatial and expected radiometric characteristics are used to compute the target likelihood. The adaptive architecture is evaluated alongside a conventional non-adaptive algorithm using synthetic imagery representative of an air-to-ground target acquisition scenario. Lastly, future enhancements to the adaptive scheme are discussed, as well as strategies for managing poor-quality or absent a priori information.
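The likelihood computation described above can be illustrated in miniature (all numbers, feature choices, and the independence assumption are hypothetical, not taken from the paper): each candidate detection is scored under Gaussian densities for its spatial and radiometric features, with the variances widened to reflect known errors in the a priori data.

```python
import math

def gauss_pdf(x: float, mean: float, sigma: float) -> float:
    """Univariate normal density."""
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def target_likelihood(size_px, intensity, exp_size, exp_intensity,
                      size_sigma, intensity_sigma):
    """Joint likelihood under an independence assumption; the sigmas fold in
    both sensor noise and uncertainty in the a priori knowledge."""
    return (gauss_pdf(size_px, exp_size, size_sigma)
            * gauss_pdf(intensity, exp_intensity, intensity_sigma))

# Hypothetical expectation: target ~12 px across, ~0.8 normalized radiance.
good = target_likelihood(11.5, 0.78, exp_size=12, exp_intensity=0.8,
                         size_sigma=2.0, intensity_sigma=0.1)
clutter = target_likelihood(30.0, 0.35, exp_size=12, exp_intensity=0.8,
                            size_sigma=2.0, intensity_sigma=0.1)
print(good > clutter)  # the size/radiance-consistent candidate scores higher
```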

  11. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

    There is a need to autonomously acquire cryogenic hydrocarbon liquid samples from remote planetary locations, such as the lakes of Titan, for instruments such as mass spectrometers. Several problems had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft, such as not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal-, mass-, and power-limited spacecraft interfaces; and reducing risk. Prior-art inlets for similar instruments in spaceflight were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collection of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and accommodate various liquid sample levels. It is capable of collecting repeated samples autonomously in the difficult low-temperature conditions often found in planetary missions. It is capable of collecting samples, for use by instruments, from difficult sample types such as cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates such as those found on Titan. The design, with a warm actuated valve, is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  12. IWTU Process Sample Analysis Report

    SciTech Connect

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June–August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process. None of these samples were radioactive. These samples were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  13. Sample Acquisition and Handling System from a Remote Platform

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Sherrit, Stewart; Jones, Jack A.

    2011-01-01

    A system has been developed to acquire and handle samples from a suspended remote platform. The system includes a penetrator, a penetrator deployment mechanism, and a sample handler. A gravity-driven harpoon sampler was used for the system, but other solutions can be used to supply the penetration energy, such as pyrotechnic, pressurized gas, or springs. The deployment mechanism includes a line that is attached to the penetrator, a spool for reeling in the line, and a line engagement control mechanism. The penetrator has removable tips that can collect liquid, ice, or solid samples. The handling mechanism consists of a carousel that can store a series of identical or different tips, assist in penetrator reconfiguration for multiple sample acquisition, and deliver the sample to a series of instruments for analysis. The carousel sample handling system was combined with a brassboard reeling mechanism and a penetrator with removable tips. It can attach the removable tip to the penetrator, release and retrieve the penetrator, remove the tip, and present it to multiple instrument stations. The penetrator can be remotely deployed from an aerobot, penetrate and collect the sample, and be retrieved with the sample to the aerobot. The penetrator with removable tips includes sample interrogation windows and a sample retainment spring for unconsolidated samples. The line engagement motor can be used to control the penetrator release and reeling engagement, and to evenly distribute the line on the spool by rocking between left and right ends of the spool. When the arm with the guiding ring is aligned with the spool axis, the line is free to unwind from the spool without rotating the spool. When the arm is perpendicular to the spool axis, the line can move only if the spool rotates.

  14. Crosscutting Development- EVA Tools and Geology Sample Acquisition

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.

  15. Acquisition by Processing Theory: A Theory of Everything?

    ERIC Educational Resources Information Center

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  16. Collecting cometary soil samples? Development of the ROSETTA sample acquisition system

    NASA Technical Reports Server (NTRS)

    Coste, P. A.; Fenzi, M.; Eiden, Michael

    1993-01-01

    In the reference scenario of the ROSETTA Comet Nucleus Sample Return (CNSR) mission, the Sample Acquisition System is mounted on the Comet Lander. Its tasks are to acquire three kinds of cometary samples and to transfer them to the Earth Return Capsule. Operations are to be performed in vacuum and microgravity, on a probably rough and dusty surface, in a largely unknown material, at temperatures on the order of 100 K. The concept and operation of the Sample Acquisition System are presented. The design of the prototype corer and surface sampling tool is described, along with the equipment for testing them in various materials representing cometary soil, at cryogenic temperatures in ambient conditions and in vacuum. Results of recent preliminary tests performed in low-temperature thermal vacuum in a cometary-analog ice-dust mixture are provided.

  17. Networks for image acquisition, processing and display

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1990-01-01

    The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.

  18. Acquisition by Processing: A Modular Perspective on Language Development

    ERIC Educational Resources Information Center

    Truscott, John; Smith, Mike Sharwood

    2004-01-01

    The paper offers a model of language development, first and second, within a processing perspective. We first sketch a modular view of language, in which competence is embodied in the processing mechanisms. We then propose a novel approach to language acquisition (Acquisition by Processing Theory, or APT), in which development of the module occurs…

  19. Feasibility Study of Commercial Markets for New Sample Acquisition Devices

    NASA Technical Reports Server (NTRS)

    Brady, Collin; Coyne, Jim; Bilen, Sven G.; Kisenwether, Liz; Miller, Garry; Mueller, Robert P.; Zacny, Kris

    2010-01-01

    The NASA Exploration Systems Mission Directorate (ESMD) and Penn State technology commercialization project was designed to assist in the maturation of a NASA SBIR Phase III technology. The project was funded by NASA's ESMD Education group with oversight from the Surface Systems Office at NASA Kennedy Space Center in the Engineering Directorate. Two Penn State engineering student interns managed the project with support from Honeybee Robotics and NASA Kennedy Space Center. The objective was to find an opportunity to integrate the SBIR-developed Regolith Extractor and Sampling Technology as the payload for future lunar lander or rover missions. The team identified two potential Google Lunar X Prize organizations with considerable interest in utilizing regolith acquisition and transfer technology.

  20. Dynamic Acquisition and Retrieval Tool (DART) for Comet Sample Return : Session: 2.06.Robotic Mobility and Sample Acquisition Systems

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Bonitz, Robert; Kulczycki, Erick; Aisen, Norman; Dandino, Charles M.; Cantrell, Brett S.; Gallagher, William; Shevin, Jesse; Ganino, Anthony; Haddad, Nicolas; Walkemeyer, Phillip; Backes, Paul; Shiraishi, Lori

    2013-01-01

    The 2011 Decadal Survey for planetary science released by the National Research Council of the National Academies identified Comet Surface Sample Return (CSSR) as one of five high-priority potential New Frontiers-class missions in the next decade. The main objectives of the research described in this publication are: develop a concept for an end-to-end system for collecting and storing a comet sample to be returned to Earth; design, fabricate, and test a prototype Dynamic Acquisition and Retrieval Tool (DART) capable of collecting a 500 cc sample in a canister and ejecting the canister at a predetermined speed; and identify a set of simulants with physical properties at room temperature that suitably match the physical properties of the comet surface as it would be sampled. We propose the use of a dart that would be launched from the spacecraft to impact and penetrate the comet surface. After collecting the sample, the sample canister would be ejected at a speed greater than the comet's escape velocity and captured by the spacecraft, packaged into a return capsule, and returned to Earth. The dart would be composed of an inner tube or sample canister, an outer tube, a decelerator, a means of capturing and retaining the sample, and a mechanism to eject the canister with the sample for later rendezvous with the spacecraft. Among the significant unknowns are the physical properties of the comet surface. Based on new findings from the recent Deep Impact comet encounter mission, we have limited our search for sampling solutions to materials with 10 to 100 kPa shear strength in loose or consolidated form. As the possible range of values for the comet surface temperature is also significantly different from room temperature, and testing at conditions other than room temperature can become resource intensive, we sought sample simulants with physical properties at room temperature similar to the expected physical properties of the comet surface material. The chosen
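The ejection-speed requirement above follows directly from the comet's escape velocity: for a spherical body of mass M and radius r, v_esc = sqrt(2GM/r). The mass and radius below are illustrative, not values from the study.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def escape_velocity(mass_kg: float, radius_m: float) -> float:
    """Escape speed from the surface of a spherical body: sqrt(2*G*M/r)."""
    return math.sqrt(2.0 * G * mass_kg / radius_m)

# A hypothetical ~2 km-radius comet nucleus of ~1e13 kg.
v = escape_velocity(1e13, 2000.0)
print(f"escape velocity: {v:.2f} m/s")  # on the order of 1 m/s
```

An ejection speed of even a few m/s would comfortably exceed the escape velocity of such a small body, which is why the canister can be lofted back to the spacecraft with a modest mechanism.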

  1. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  2. Process data acquisition: real-time and historical interfaces

    NASA Astrophysics Data System (ADS)

    Rice, Gordon; Moreno, Richard; King, Michael S.

    1997-01-01

    With the advent of touch probe technology, it was discovered that current closed-architecture controllers do not provide adequate resources to support the implementation of process data acquisition on the shop floor. At AlliedSignal, a process data acquisition system has been developed for a flexible manufacturing system utilizing touch probes and customized software, which allows fixture and cutting tool related information for an entire process to be captured and stored for off-line analysis. The implementation of this system, along with its difficulties and pitfalls, will be presented, together with the functionality required for an open architecture controller to properly support process data acquisition.

  3. Process data acquisition: Real time and historical interfaces

    SciTech Connect

    Rice, G.; Moreno, R.; King, M.

    1996-11-01

    With the advent of touch probe technology, it was discovered that current closed-architecture controllers do not provide adequate resources to support the implementation of process data acquisition on the shop floor. At AlliedSignal Federal Manufacturing & Technologies, a process data acquisition system has been developed for a flexible manufacturing system utilizing touch probes and customized software which allows fixture and cutting tool related information for an entire process to be captured and stored for off-line analysis. The implementation of this system, along with its difficulties and pitfalls, will be presented, together with the functionality required for an open architecture controller to properly support process data acquisition.

  4. An effective data acquisition system using image processing

    NASA Astrophysics Data System (ADS)

    Poh, Chung-How; Poh, Chung-Kiak

    2005-12-01

    The authors investigate a data acquisition system utilising the widely available digital multimeter and webcam. The system is suited to applications that require sampling rates of less than about 1 Hz, such as ambient temperature recording or monitoring the charging state of rechargeable batteries. The data displayed on the external digital readout is acquired into the computer through the process of template matching. MATLAB is used as the programming language for processing the captured 2-D images in this demonstration. An RC charging experiment with a time constant of approximately 33 s is set up to verify the accuracy of the image-to-data conversion. It is found that the acquired data matches the steady-state voltage value displayed by the digital meter after an error detection technique has been devised and implemented in the data acquisition script file. It is possible to acquire a number of different readings simultaneously from various sources with this imaging method by placing a number of digital readouts within the camera's field of view.
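The template-matching step at the heart of the system above can be sketched as follows (a minimal sum-of-squared-differences matcher in Python/NumPy rather than the authors' MATLAB implementation; the planted "digit segment" pattern is a stand-in for a real readout glyph):

```python
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray):
    """Return the (row, col) of the window minimizing the sum of squared
    differences between the template and the image."""
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            ssd = np.sum((image[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Plant a 3x3 pattern at a known spot in a weakly noisy frame.
rng = np.random.default_rng(0)
frame = rng.random((20, 20)) * 0.1
template = np.array([[1.0, 1.0, 1.0],
                     [0.0, 1.0, 0.0],
                     [1.0, 1.0, 1.0]])
frame[7:10, 4:7] += template
print(match_template(frame, template))  # -> (7, 4)
```

In practice one template per digit (0–9) would be matched against each character cell of the readout, and the best-scoring digit taken as the reading.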

  5. System of acquisition and processing of images of dynamic speckle

    NASA Astrophysics Data System (ADS)

    Vega, F.; Torres, C.

    2015-01-01

    In this paper we show the design and implementation of a system for the capture and analysis of dynamic speckle. The device consists of a USB camera, an isolated lighting system for imaging, a 633 nm, 10 mW laser pointer as the coherent light source, a diffuser, and a laptop for video processing. The equipment enables the acquisition and storage of video as well as the calculation of different statistical descriptors (global activity accumulation vector, activity accumulation matrix, cross-correlation vector, autocorrelation coefficient, Fujii matrix, etc.). The equipment is designed so that it can be taken directly to the site of the biological sample under study, and it is currently being used in research projects within the group.

  6. Fast acquisition of multidimensional NMR spectra of solids and mesophases using alternative sampling methods.

    PubMed

    Lesot, Philippe; Kazimierczuk, Krzysztof; Trébosc, Julien; Amoureux, Jean-Paul; Lafon, Olivier

    2015-11-01

    Unique information about the atom-level structure and dynamics of solids and mesophases can be obtained by the use of multidimensional nuclear magnetic resonance (NMR) experiments. Nevertheless, the acquisition of these experiments often requires long acquisition times. We review here alternative sampling methods, which have been proposed to circumvent this issue in the case of solids and mesophases. Compared to the spectra of solutions, those of solids and mesophases present some specificities because they usually display lower signal-to-noise ratios, non-Lorentzian line shapes, lower spectral resolutions and wider spectral widths. We highlight herein the advantages and limitations of these alternative sampling methods. A first route to accelerate the acquisition time of multidimensional NMR spectra consists in the use of sparse sampling schemes, such as truncated, radial or random sampling ones. These sparsely sampled datasets are generally processed by reconstruction methods differing from the Discrete Fourier Transform (DFT). A host of non-DFT methods have been applied for solids and mesophases, including the G-matrix Fourier transform, the linear least-square procedures, the covariance transform, the maximum entropy and the compressed sensing. A second class of alternative sampling consists in departing from the Jeener paradigm for multidimensional NMR experiments. These non-Jeener methods include Hadamard spectroscopy as well as spatial or orientational encoding of the evolution frequencies. The increasing number of high field NMR magnets and the development of techniques to enhance NMR sensitivity will contribute to widen the use of these alternative sampling methods for the study of solids and mesophases in the coming years.

  7. Sampling uncertainty evaluation for data acquisition board based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ge, Leyi; Wang, Zhongyu

    2008-10-01

    Evaluating data acquisition board sampling uncertainty is a difficult problem in the field of signal sampling. This paper first analyzes the sources of data acquisition board sampling uncertainty, then introduces a simulation theory for evaluating that uncertainty based on the Monte Carlo method, and puts forward a relational model among the sampling uncertainty results, the number of samples, and the number of simulation runs. For different sample numbers and different signal scopes, the authors establish a random sampling uncertainty evaluation program for a PCI-6024E data acquisition board to execute the simulation. The results of the proposed Monte Carlo simulation method are in good agreement with the GUM ones, validating the Monte Carlo method.
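The Monte Carlo approach can be illustrated in miniature (a generic sketch, not the authors' program or the PCI-6024E noise model): repeatedly simulate an acquisition of N samples corrupted by board noise, compute the measurand each time, and take the standard deviation of the results as the sampling uncertainty. For the mean of N samples with noise σ, this should agree with the analytic, GUM-style value σ/√N.

```python
import random
import statistics

def monte_carlo_uncertainty(true_value, noise_sigma, n_samples, n_runs, seed=1):
    """Std. dev. of the measurement result over many simulated acquisitions."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        acquisition = [true_value + rng.gauss(0.0, noise_sigma)
                       for _ in range(n_samples)]
        results.append(statistics.fmean(acquisition))  # measurand: the mean
    return statistics.stdev(results)

u_mc = monte_carlo_uncertainty(true_value=1.0, noise_sigma=0.1,
                               n_samples=100, n_runs=5000)
u_gum = 0.1 / 100 ** 0.5  # analytic value: sigma / sqrt(N) = 0.01
print(u_mc, u_gum)  # the two estimates should agree closely
```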

  8. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

    An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  9. Towards a Platform for Image Acquisition and Processing on RASTA

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele

    2013-08-01

    This paper presents the architecture of a platform for image acquisition and processing based on commercial hardware and space qualified hardware. The aim is to extend the Reference Architecture Test-bed for Avionics (RASTA) system in order to obtain a Test-bed that allows testing different hardware and software solutions in the field of image acquisition and processing. The platform will allow the integration of space qualified hardware and Commercial Off The Shelf (COTS) hardware in order to test different architectural configurations. The first implementation is being performed on a low cost commercial board and on the GR712RC board based on the Dual Core Leon3 fault tolerant processor. The platform will include an actuation module with the aim of implementing a complete pipeline from image acquisition to actuation, making possible the simulation of a real case scenario involving acquisition and actuation.

  10. The Gestalt Process Approach and Word Acquisition.

    ERIC Educational Resources Information Center

    McAllister, Elizabeth

    To whet the curiosity and interest of teachers who may be frustrated with the reading vocabulary achievement of pupils, an informal study compared Piaget's cognitive development theory, recent brain research, and the reading process, and examined how the theory and research apply to reading instruction. The Gestalt Process Approach to teaching…

  11. Language Processes and Second-Language Acquisition.

    ERIC Educational Resources Information Center

    Collins, Larry Lloyd

    A review of the literature and research concerning the language processes of listening, speaking, reading, and writing, and an analysis of the findings regarding the characteristics of these processes and their relationship to the second-language learner led to the following conclusions: (1) the circumstances under which the first language is…

  12. ACQUISITION OF REPRESENTATIVE GROUND WATER QUALITY SAMPLES FOR METALS

    EPA Science Inventory

    R.S. Kerr Environmental Research Laboratory (RSKERL) personnel have evaluated sampling procedures for the collection of representative, accurate, and reproducible ground water quality samples for metals for the past four years. Intensive sampling research at three different field...

  13. Reading Acquisition Enhances an Early Visual Process of Contour Integration

    ERIC Educational Resources Information Center

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched…

  14. SALTSTONE PROCESSING FACILITY TRANSFER SAMPLE

    SciTech Connect

    Cozzi, A.; Reigel, M.

    2010-08-04

    On May 19, 2010, the Saltstone Production Facility inadvertently transferred 1800 gallons of untreated waste from the salt feed tank (SFT) to Vault 4. During shutdown, approximately 70 gallons of the material were left in the Saltstone hopper. A sample of the slurry in the hopper was sent to Savannah River National Laboratory (SRNL) for analysis of density, pH, and the eight Resource Conservation and Recovery Act (RCRA) metals. The sample was hazardous for chromium, mercury, and pH. The sample received from the Saltstone hopper was examined visually while sample aliquots were taken and while the sample was allowed to settle. It was observed that the sample contains solids that settle in approximately 20 minutes (Figure 3-1). A floating layer on top of the supernate forms during settling and disperses when the sample is agitated (Figure 3-2). The untreated waste inadvertently transferred from the SFT to Vault 4 was toxic for chromium and mercury, and the pH of the sample was at the regulatory limit.

  15. Sample Acquisition and Caching Architectures for the Mars 2020 Mission

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Chu, P.; Paulsen, G.; Davis, K.

    2014-06-01

    The goal of the Mars 2020 mission is to acquire 37 samples for future return. We present technologies to capture cores and regolith, to abrade rocks, and to inspect cores prior to caching. We also present three caching architectures.

  16. Earth contamination free sample acquisition from an Earth Contaminated Spacecraft

    NASA Technical Reports Server (NTRS)

    Dolgin, B.; Bickler, D.; Carson, J.; Chung, S.; Quicksall, J.; Troy, R.; Yarbrough, C.

    2000-01-01

    The paper describes the first step in a feasibility demonstration of a novel, low-cost Mars Sample Return Transfer Sequence (STS) that does not require cleaning and sterilization of the entire spacecraft. The proposed STS relies on the ability to collect (and, in the future, deliver to Earth) Earth-contamination-free samples from a spacecraft that was cleaned only to the levels achieved on Pathfinder.

  17. Thermal mapping and trends of Mars analog materials in sample acquisition operations using experimentation and models

    NASA Astrophysics Data System (ADS)

    Szwarc, Timothy; Hubbard, Scott

    2014-09-01

    The effects of atmosphere, ambient temperature, and geologic material were studied experimentally and using a computer model to predict the heating undergone by Mars rocks during rover sampling operations. Tests were performed on five well-characterized and/or Mars analog materials: Indiana limestone, Saddleback basalt, kaolinite, travertine, and water ice. Eighteen tests were conducted to 55 mm depth using a Mars Sample Return prototype coring drill, with each sample containing six thermal sensors. A thermal simulation was written to predict the complete thermal profile within each sample during coring and this model was shown to be capable of predicting temperature increases with an average error of about 7%. This model may be used to schedule power levels and periods of rest during actual sample acquisition processes to avoid damaging samples or freezing the bit into icy formations. Maximum rock temperature increase is found to be modeled by a power law incorporating rock and operational parameters. Energy transmission efficiency in coring is found to increase linearly with rock hardness and decrease by 31% at Mars pressure.
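A power-law relation of the kind reported here (maximum temperature rise as a function of rock and operational parameters) is usually fitted by linear regression in log-log space. The sketch below illustrates that fitting procedure only; the coefficient values and the "drilling power" data are hypothetical, not taken from the paper.

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = a * x**b via linear regression in log-log space."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    # Slope of the log-log line is the exponent b; intercept gives log(a).
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic "temperature rise vs. drilling power" data lying on an exact power law
powers = [20.0, 40.0, 60.0, 80.0]
dT = [2.5 * p ** 0.8 for p in powers]
a, b = fit_power_law(powers, dT)  # recovers a ~ 2.5, b ~ 0.8
```

Because the synthetic data lie exactly on a power law, the fit recovers the generating parameters to machine precision; with real coring data the same regression yields the best-fit exponent.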

  18. Understanding the knowledge acquisition process about Earth and Space concepts

    NASA Astrophysics Data System (ADS)

    Frappart, Soren

    Two main theoretical views of the knowledge acquisition process in science are still debated in the literature. On one view, knowledge is organized into coherent wholes (mental models); on the other, knowledge is a set of fragments with no links between them. Mental models have predictive and explanatory power, are constrained by universal presuppositions, and follow a universal, gradual development in three steps, from initial through synthetic to scientific models. Fragments, by contrast, are not organized, and development is seen as a situated process in which cultural transmission plays a fundamental role. After presenting these two theoretical positions, we illustrate them with studies of the Earth's shape and gravity performed in different cultural contexts, highlighting both the differences and the culturally invariant elements. We show why these issues are important to consider for space concepts such as gravity, orbits, and weightlessness. Capturing the processes by which knowledge of specific space concepts is acquired and developed can inform relevant, well-adapted strategies for instruction. If knowledge acquisition for space concepts is fragmented, we must consider how to identify the fragments and help the learner build links between them. If knowledge is organized into coherent mental models, we must consider how to destabilize a non-relevant model and prevent the development of initial and synthetic models. Moreover, the question of what is universal versus what is culture-dependent in this acquisition process needs to be explored. We also present some common misconceptions about space concepts. Indeed, in addition to the theoretical considerations above, the collection and awareness of

  19. Autonomous site selection and instrument positioning for sample acquisition

    NASA Astrophysics Data System (ADS)

    Shaw, A.; Barnes, D.; Pugh, S.

    The European Space Agency's Aurora Exploration Programme aims to establish a long-term European programme for the exploration of space, culminating in a human mission in the 2030 timeframe. Two flagship missions, namely Mars Sample Return and ExoMars, have been proposed as recognised steps along the way. The ExoMars rover is the first of these flagship missions and includes a rover carrying the Pasteur payload, a mobile exobiology instrumentation package, and the Beagle 2 arm. The primary objective is the search for evidence of past or present life on Mars, but the payload will also study the evolution of the planet and its atmosphere, look for evidence of seismological activity, and survey the environment in preparation for future missions. Operating rovers in unknown environments is complicated and requires large resources, not only on the planet but also in ground-based operations. Currently this can be very labour-intensive and costly if large teams of scientists and engineers are required to assess mission progress, plan mission scenarios, and construct a sequence of events or goals for uplink. Furthermore, the communication constraints imposed by the time delay over such large distances, and the line of sight required, make autonomy paramount to mission success, affording the ability to operate through communications outages and to be opportunistic with respect to scientific discovery. As part of this drive to reduce mission costs and increase autonomy, the Space Robotics group at the University of Wales, Aberystwyth is researching methods of autonomous site selection and instrument positioning directly applicable to the ExoMars mission. The site selection technique builds on the geometric reasoning algorithms used previously for localisation and navigation [Shaw 03]. It is proposed that a digital elevation model (DEM) of the local surface, generated during traverse and without interaction from ground based operators, can be

  20. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by the agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came, then all of the cost, time, and effort involved in implementing programs using sophisticated analytical instruments and techniques is wasted, and the results can actually be misleading. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and to put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest, and it encourages further studies on sampling and sample mass reduction to produce a representative test portion.
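The sampling-theory concern raised here, that ever-smaller test portions risk no longer representing the lot, is classically quantified by Gy's fundamental sampling error, where the relative variance scales with particle size cubed and inversely with test-portion mass. The sketch below applies that classical formula with entirely hypothetical constants, purely to illustrate the scaling; it is not a calculation from the paper.

```python
import math

def fundamental_sampling_error(c_factor, d_cm, m_sample_g, m_lot_g):
    """Gy's fundamental sampling error (relative standard deviation) for a
    test portion of mass m_sample_g drawn from a lot of mass m_lot_g.
    c_factor lumps the shape/liberation/composition constants (g/cm^3);
    d_cm is the top particle size in cm.  All values here are illustrative."""
    rel_var = c_factor * d_cm ** 3 * (1.0 / m_sample_g - 1.0 / m_lot_g)
    return math.sqrt(rel_var)

# Shrinking the test portion from 10 g to 100 mg at the same particle size
# inflates the relative sampling error roughly 10-fold.
err_10g = fundamental_sampling_error(50.0, 0.05, 10.0, 1e6)
err_100mg = fundamental_sampling_error(50.0, 0.05, 0.1, 1e6)
```

The 1/m dependence is why comminution (reducing particle size d) must accompany any reduction in test-portion mass if representativeness is to be preserved.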

  1. The logical syntax of number words: theory, acquisition and processing.

    PubMed

    Musolino, Julien

    2009-04-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. Cognition, 93, 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Experiments at the semantics-pragmatics interface. Cognition, 86, 253-282; Hurewitz, F., Papafragou, A., Gleitman, L., Gelman, R. (2006). Asymmetries in the acquisition of numbers and quantifiers. Language Learning and Development, 2, 76-97; Huang, Y. T., Snedeker, J., Spelke, L. (submitted for publication). What exactly do numbers mean?]. Specifically, these studies have shown that data from experimental investigations of child language can be used to illuminate core theoretical issues in the semantic and pragmatic analysis of number terms. In this article, I extend this approach to the logico-syntactic properties of number words, focusing on the way numerals interact with each other (e.g. Three boys are holding two balloons) as well as with other quantified expressions (e.g. Three boys are holding each balloon). On the basis of their intuitions, linguists have claimed that such sentences give rise to at least four different interpretations, reflecting the complexity of the linguistic structure and syntactic operations involved. Using psycholinguistic experimentation with preschoolers (n=32) and adult speakers of English (n=32), I show that (a) for adults, the intuitions of linguists can be verified experimentally, (b) by the age of 5, children have knowledge of the core aspects of the logical syntax of number words, (c) in spite of this knowledge, children nevertheless differ from adults in systematic ways, (d) the differences observed between children and adults can be accounted for on the basis of an independently motivated, linguistically-based processing model [Geurts, B. (2003). Quantifying kids. Language

  2. Space science technology: In-situ science. Sample Acquisition, Analysis, and Preservation Project summary

    NASA Technical Reports Server (NTRS)

    Aaron, Kim

    1991-01-01

    The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.

  3. Reading acquisition enhances an early visual process of contour integration.

    PubMed

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched in age, socioeconomic and cultural characteristics, on a contour integration task known to depend on early visual processing. Stimuli consisted of a closed egg-shaped contour made of disconnected Gabor patches, within a background of randomly oriented Gabor stimuli. Subjects had to decide whether the egg was pointing left or right. Difficulty was varied by jittering the orientation of the Gabor patches forming the contour. Contour integration performance was lower in illiterates than in both ex-illiterate and literate controls. We argue that this difference in contour perception must reflect a genuine difference in visual function. According to this view, the intensive perceptual training that accompanies reading acquisition also improves early visual abilities, suggesting that the impact of literacy on the visual system is more widespread than originally proposed.

  4. PET/CT for radiotherapy: image acquisition and data processing.

    PubMed

    Bettinardi, V; Picchio, M; Di Muzio, N; Gianolli, L; Messa, C; Gilardi, M C

    2010-10-01

    This paper focuses on acquisition and processing methods in positron emission tomography/computed tomography (PET/CT) for radiotherapy (RT) applications. The recent technological evolution of PET/CT systems is described. Particular emphasis is given to the tools needed for patient positioning and immobilization, to be used in PET/CT studies as well as during RT treatment sessions. The effect of organ and lesion motion due to the patient's respiration on PET/CT imaging is discussed. Breathing protocols proposed to minimize PET/CT spatial mismatches caused by respiratory movement are illustrated. The respiratory-gated (RG) 4D-PET/CT techniques, developed to measure and compensate for organ and lesion motion, are then introduced. Finally, different acquisition and data processing techniques are described that aim to improve (i) the image quality and quantitative accuracy of PET images, and (ii) target volume definition and treatment planning in RT, by using specific and personalised motion information.

  5. Design and implementation of a compressive infrared sampling for motion acquisition

    NASA Astrophysics Data System (ADS)

    Liu, Tong; Liu, Jun

    2014-12-01

    This article proposes a compressive infrared sampling method for acquiring and processing human motion simultaneously. The spatial-temporal changes caused by movements of the human body are intrinsic cues for determining the semantics of motion, and the short-term changes caused by movement are sparsely distributed relative to the sensing region. Several pyroelectric infrared (PIR) sensors with pseudo-random-coded Fresnel lenses are introduced to acquire and compress motion information synchronously. The compressive PIR array records the changes in the thermal radiation field caused by movements and encodes the motion information directly into low-dimensional sensory outputs. The problem of recognizing a high-dimensional image sequence is thereby cast as a low-dimensional sequence recognition process. A database of various kinds of motion performed by several people is built, and Hausdorff distance-based template matching is employed for motion recognition. Experimental studies are conducted to validate the proposed method.
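Hausdorff distance-based template matching of the kind used here can be sketched in a few lines: compute the symmetric Hausdorff distance between an observed sensor sequence and each stored template, then pick the nearest template. The template names and readings below are invented for illustration, not drawn from the paper's database.

```python
def hausdorff(a, b):
    """Symmetric Hausdorff distance between two sequences of scalar
    sensor readings, treated as 1-D point sets."""
    def directed(u, v):
        # Worst-case distance from a point of u to its nearest point in v.
        return max(min(abs(x - y) for y in v) for x in u)
    return max(directed(a, b), directed(b, a))

def classify(sample, templates):
    """Nearest-template motion recognition via Hausdorff distance."""
    return min(templates, key=lambda name: hausdorff(sample, templates[name]))

# Hypothetical low-dimensional PIR output templates for two motions
templates = {"walk": [0.1, 0.4, 0.9, 0.4, 0.1],
             "wave": [0.8, 0.2, 0.8, 0.2, 0.8]}
label = classify([0.15, 0.45, 0.85, 0.35, 0.05], templates)  # nearest is "walk"
```

In practice each "point" would be a feature vector rather than a scalar, but the matching logic is the same.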

  6. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, John E.; Dinsmore, Stanley R.; Chandler, Edward W.

    1986-01-01

    A four-port disc valve for sampling erosive, high-temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fired faceplates defining flow passageways positioned to be alternately in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for smooth and precise operation when used under harsh process conditions.

  7. A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer

    NASA Astrophysics Data System (ADS)

    Luckman, Adrian J.; Allinson, Nigel M.

    1989-03-01

    A low-cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a Reduced Instruction Set Computer (RISC) architecture, the ARM (Acorn RISC Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system comprises real-time video digitising hardware, which interfaces directly to Archimedes memory, and software that provides an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line, with programmable parameters such as sampling rate and gain. Software support includes a work environment for image capture and processing with pixel, neighbourhood and global operators. A friendly user interface is provided with the help of the Archimedes Operating System WIMP (Windows, Icons, Mouse and Pointer) Manager: windows provide a convenient way of handling images on the screen, and program control is directed mostly by pop-up menus.

  8. Real-time digital design for an optical coherence tomography acquisition and processing system

    NASA Astrophysics Data System (ADS)

    Ralston, Tyler S.; Mayen, Jose A.; Marks, Dan L.; Boppart, Stephen A.

    2004-07-01

    We present a real-time, multi-dimensional, digital, optical coherence tomography (OCT) acquisition and imaging system. The system consists of conventional OCT optics, a rapid scanning optical delay (RSOD) line to support fast data acquisition rates, and a high-speed A/D converter for sampling the interference waveforms. A 1M-gate Virtex-II field programmable gate array (FPGA) is designed to perform digital down conversion. This is analogous to demodulating and low-pass filtering the continuous time signal. The system creates in-phase and quadrature-phase components using a tunable quadrature mixer. Multistage polyphase finite impulse response (FIR) filtering and down sampling is used to remove unneeded high frequencies. A floating-point digital signal processor (DSP) computes the magnitude and phase shifts. The data is read by a host machine and displayed on screen at real-time rates commensurate with the data acquisition rate. This system offers flexible acquisition and processing parameters for a wide range of multi-dimensional optical microscopy techniques.
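The digital down-conversion chain described here (quadrature mixing, low-pass filtering, decimation, then magnitude and phase extraction) can be illustrated with a minimal numerical model. This is a sketch only, not the FPGA implementation: it substitutes a simple moving-average low-pass for the tunable mixer and multistage polyphase FIR stages, and all signal parameters are invented.

```python
import math

def ddc(samples, f_carrier, fs, decim):
    """Digitally down-convert a real passband signal: quadrature mix
    against an oscillator at f_carrier, then low-pass filter and decimate."""
    i_mix = [s * math.cos(2 * math.pi * f_carrier * n / fs)
             for n, s in enumerate(samples)]
    q_mix = [-s * math.sin(2 * math.pi * f_carrier * n / fs)
             for n, s in enumerate(samples)]
    def lowpass_decimate(x):
        # Moving average over each decimation window removes the 2f term.
        return [sum(x[n:n + decim]) / decim
                for n in range(0, len(x) - decim + 1, decim)]
    return lowpass_decimate(i_mix), lowpass_decimate(q_mix)

# Test tone: 100 Hz carrier sampled at 1 kHz, amplitude 0.8, phase 0.3 rad
fs, fc, amp, phase = 1000.0, 100.0, 0.8, 0.3
x = [amp * math.cos(2 * math.pi * fc * n / fs + phase) for n in range(200)]
i_bb, q_bb = ddc(x, fc, fs, decim=10)
envelope = [2 * math.hypot(i, q) for i, q in zip(i_bb, q_bb)]  # recovers amp
```

The in-phase and quadrature outputs carry the interference fringe magnitude (via `hypot`) and phase shift (via `atan2`), which is exactly what the DSP stage in the described system computes.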

  9. A needle-free technique for interstitial fluid sample acquisition using a lorentz-force actuated jet injector.

    PubMed

    Chang, Jean H; Hogan, N Catherine; Hunter, Ian W

    2015-08-10

    We present a novel method of quickly acquiring dermal interstitial fluid (ISF) samples using a Lorentz-force actuated needle-free jet injector. The feasibility of the method is first demonstrated on post-mortem porcine tissue. The jet injector first injects a small volume of physiological saline to breach the skin, and the back-drivability of the actuator is then used to create negative pressure in the ampoule and collect ISF. The effect of the injection and extraction parameters on sample dilution and extracted volume is investigated. A simple finite element model is developed to demonstrate why this acquisition method yields faster extractions than conventional sampling methods. Using this method, we are able to collect a sample that contains up to 3.5% ISF in 3.1 s from post-mortem skin. The trends revealed from experimentation on post-mortem skin are then used to identify the parameters for a live animal study. The feasibility of the acquisition process is successfully demonstrated using live rats; the process extracts samples that have been diluted by a factor of 111-125.

  10. Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process

    NASA Technical Reports Server (NTRS)

    Guiltinan, J.

    1976-01-01

    Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.

  11. A data-independent acquisition workflow for qualitative screening of new psychoactive substances in biological samples.

    PubMed

    Kinyua, Juliet; Negreira, Noelia; Ibáñez, María; Bijlsma, Lubertus; Hernández, Félix; Covaci, Adrian; van Nuijs, Alexander L N

    2015-11-01

    Identification of new psychoactive substances (NPS) is challenging. Developing targeted methods for their analysis can be difficult and costly due to their impermanence on the drug scene. Accurate-mass mass spectrometry (AMMS) using a quadrupole time-of-flight (QTOF) analyzer can be useful for wide-scope screening since it provides sensitive, full-spectrum MS data. Our article presents a qualitative screening workflow based on data-independent acquisition mode (all-ions MS/MS) on liquid chromatography (LC) coupled to QTOFMS for the detection and identification of NPS in biological matrices. The workflow combines and structures fundamentals of target and suspect screening data processing techniques in a structured algorithm. This allows the detection and tentative identification of NPS and their metabolites. We have applied the workflow to two actual case studies involving drug intoxications where we detected and confirmed the parent compounds ketamine, 25B-NBOMe, 25C-NBOMe, and several predicted phase I and II metabolites not previously reported in urine and serum samples. The screening workflow demonstrates the added value for the detection and identification of NPS in biological matrices.
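The core of such suspect screening is matching measured accurate masses against a list of candidate ions within a tight ppm tolerance. The sketch below shows that matching step only, not the full all-ions MS/MS workflow; the [M+H]+ values are standard monoisotopic values for the named compounds, but the suspect list, tolerance, and measured mass are illustrative.

```python
def ppm_error(measured, theoretical):
    """Mass accuracy of a measurement, in parts per million."""
    return (measured - theoretical) / theoretical * 1e6

def suspect_hits(measured_mz, suspect_list, tol_ppm=5.0):
    """Return the suspects whose [M+H]+ value matches the measured
    accurate mass within tol_ppm."""
    return {name: mz for name, mz in suspect_list.items()
            if abs(ppm_error(measured_mz, mz)) <= tol_ppm}

# Monoisotopic [M+H]+ values (illustrative suspect list)
suspects = {"ketamine": 238.0993, "25B-NBOMe": 380.0856}
hits = suspect_hits(238.0995, suspects, tol_ppm=5.0)  # matches ketamine only
```

A full workflow would additionally require co-eluting fragment ions from the high-collision-energy trace and plausible retention time before a hit is tentatively identified.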

  13. Multiwavelength lidar: challenges of data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Duggirala, Ramakrishna Rao; Santhibhavan Vasudevanpillai, Mohankumar; Bhargavan, Presennakumar; Sivarama Pillai, Muraleedharen Nair; Malladi, Satyanarayana

    2006-12-01

    LIDAR operates by transmitting light pulses a few nanoseconds wide into the atmosphere and receiving the signals backscattered from different layers of aerosols and clouds to derive vertical profiles of their physical and optical properties with good spatial resolution. The data acquisition system (DAS) of the LIDAR has to handle signals of wide dynamic range (of the order of 5 to 6 decades), and the data have to be sampled at high speed (more than 10 MSPS) to achieve a spatial resolution of a few metres. This results in a large amount of data collected in a short duration. The ground-based Multiwavelength LIDAR built at the Space Physics Laboratory, Vikram Sarabhai Space Centre, Trivandrum is capable of operating at four wavelengths, namely 1064, 532, 355 and 266 nm, with a PRF of 1 to 20 Hz. The LIDAR has been equipped with a computer-controlled DAS. An avalanche photodiode (APD) detector is used to detect the return signal from different layers of the atmosphere in the 1064 nm channel. The signal is continuous in nature and is sampled and digitized at the required spatial resolution in the data acquisition window corresponding to the height region of 0 to 45 km. The return signal, which has a wide dynamic range, is handled by two fast 12-bit A/D converters set to different full-scale voltage ranges and sampling at up to 40 MSPS (corresponding to a range resolution of a few metres). The other channels, namely 532, 355 and 266 nm, are detected by photomultiplier tubes (PMTs), which have higher quantum efficiency at these wavelengths. The PMT output can be either continuous or discrete pulses depending upon the region being probed: thick layers like clouds and dust generate continuous signals, whereas molecular scattering from the higher altitude regions results in discrete signal pulses. The return signals are digitized using fast A/D converters (up to 40 MSPS) as well as counted using fast photon counters. The photon counting channels are capable of counting up to
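The dual-A/D arrangement described above, two converters digitizing the same return at different full-scale voltage ranges, is a standard way to extend dynamic range: use the sensitive channel unless it saturates, then fall back to the coarse channel rescaled by the gain ratio. The sketch below illustrates that merge logic with invented count values; it is not the instrument's actual firmware.

```python
def merge_dual_gain(high_gain, low_gain, gain_ratio, saturation):
    """Combine two ADC channels digitizing the same signal at different
    full-scale ranges: prefer the sensitive (high-gain) channel unless it
    saturates, otherwise use the scaled low-gain reading."""
    return [h if h < saturation else l * gain_ratio
            for h, l in zip(high_gain, low_gain)]

# Hypothetical 12-bit channels: high-gain clips at 4095 counts,
# low-gain covers 10x the voltage range.
merged = merge_dual_gain([1000, 4095, 4095], [100, 800, 1200],
                         gain_ratio=10.0, saturation=4095)
```

The merged record spans roughly one extra decade of dynamic range, which is how a pair of 12-bit converters can cover part of the 5-6 decade lidar return.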

  14. Acquisition of matching-to-sample performance in rats using visual stimuli on nose keys.

    PubMed

    Iversen, I H

    1993-05-01

    Steady and blinking white lights were projected on three nose keys arranged horizontally on one wall. The procedure was a conditional discrimination with a sample stimulus presented on the middle key and comparison stimuli on the side keys. Three rats acquired simultaneous "identity matching." Accuracy reached 80% in about 25 sessions and 90% or higher after about 50 sessions. Acquisition progressed through several stages: repeated errors, alternation between comparison keys from trial to trial, preferences for specific keys or stimuli, and a gradual lengthening of strings of consecutive correct trials. An analysis of the acquisition curves for individual trial configurations indicated that the matching-to-sample performance possibly consisted of separate discriminations. PMID:8315365

  15. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1984-08-16

    This is a patent for a disc-type, four-port sampling valve for service with erosive, high-temperature process streams. Inserts and liners of α-silicon carbide in the faceplates and in the sampling cavities, respectively, limit erosion while providing lubricity for smooth and precise operation. 1 fig.

  16. An efficient scheme for sampling fast dynamics at a low average data acquisition rate.

    PubMed

    Philippe, A; Aime, S; Roger, V; Jelinek, R; Prévot, G; Berthier, L; Cipelletti, L

    2016-02-24

    We introduce a temporal scheme for data sampling, based on a variable delay between two successive data acquisitions. The scheme is designed so as to reduce the average data flow rate, while still retaining the information on the data evolution on fast time scales. The practical implementation of the scheme is discussed and demonstrated in light scattering and microscopy experiments that probe the dynamics of colloidal suspensions using CMOS or CCD cameras as detectors.
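One common way to realize such a variable-delay scheme is to acquire short bursts of rapidly spaced frames repeated at a much longer period, so that both fast and slow lag times are sampled while the average frame rate stays low. The burst pattern below is an assumed illustration of the idea, not necessarily the authors' exact acquisition sequence.

```python
def burst_schedule(t_total, slow_period, fast_dt, burst_len):
    """Acquisition timestamps: bursts of burst_len rapidly spaced frames
    (fast_dt apart), with successive bursts separated by slow_period."""
    times = []
    t = 0.0
    while t < t_total:
        times += [t + k * fast_dt for k in range(burst_len)]
        t += slow_period
    return times

# 100 s experiment: 5-frame bursts at 10 ms spacing, one burst every 10 s
sched = burst_schedule(t_total=100.0, slow_period=10.0,
                       fast_dt=0.01, burst_len=5)
avg_rate = len(sched) / 100.0   # average frames per second actually acquired
uniform_rate = 1 / 0.01         # rate needed to sample every fast_dt uniformly
```

Correlations can still be computed at the 10 ms lag (within bursts) and at multi-second lags (across bursts), while the average data flow is orders of magnitude below uniform fast sampling.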

  17. Troubleshooting digital macro photography for image acquisition and the analysis of biological samples.

    PubMed

    Liepinsh, Edgars; Kuka, Janis; Dambrova, Maija

    2013-01-01

    For years, image acquisition and analysis have been an important part of life science experiments to ensure the adequate and reliable presentation of research results. Since the development of digital photography and digital planimetric methods for image analysis approximately 20 years ago, new equipment and technologies have emerged, which have increased the quality of image acquisition and analysis. Different techniques are available to measure the size of stained tissue samples in experimental animal models of disease; however, the most accurate method is digital macro photography with software that is based on planimetric analysis. In this study, we described the methodology for the preparation of infarcted rat heart and brain tissue samples before image acquisition, digital macro photography techniques and planimetric image analysis. These methods are useful in the macro photography of biological samples and subsequent image analysis. In addition, the techniques that are described in this study include the automated analysis of digital photographs to minimize user input and exclude the risk of researcher-generated errors or bias during image analysis.
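At its core, the planimetric analysis described here reduces to counting pixels that satisfy a stain criterion and converting the count to physical area with a calibration factor. The sketch below is a minimal illustration of that idea; the toy image, threshold, and calibration value are hypothetical, and real analyses operate on color channels of calibrated macro photographs.

```python
def stained_area(pixels, threshold, cm2_per_pixel):
    """Planimetric area measurement: count pixels whose intensity meets
    the stain threshold and convert the count to physical units."""
    hits = sum(1 for row in pixels for p in row if p >= threshold)
    return hits * cm2_per_pixel

# 4x4 toy "photograph": 6 of the 16 pixels exceed the stain threshold
img = [[0, 200, 210, 0],
       [0, 205, 0, 0],
       [0, 0, 220, 0],
       [198, 0, 0, 255]]
area = stained_area(img, threshold=190, cm2_per_pixel=0.0001)
```

Automating this step (fixed threshold, fixed calibration) is what removes researcher-generated bias from the measurement, as the abstract emphasizes.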

  18. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  19. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1986-01-07

    A four-port disc valve is described for sampling erosive, high-temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fired faceplates defining flow passageways positioned to be alternately in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for smooth and precise operation when used under harsh process conditions. 1 fig.

  20. Sample Acquisition and Analytical Chemistry Challenges to Verifying Compliance to Aviators Breathing Oxygen (ABO) Purity Specification

    NASA Technical Reports Server (NTRS)

    Graf, John

    2015-01-01

    NASA has been developing and testing two different types of oxygen separation systems. One type uses pressure-swing technology; the other uses a solid electrolyte electrochemical oxygen separation cell. Both development systems have been subjected to long-term testing and to performance testing under a variety of environmental and operational conditions. Testing these two systems revealed that measuring oxygen product purity, and determining whether an oxygen separation device meets Aviator's Breathing Oxygen (ABO) specifications, is a subtle and sometimes difficult analytical chemistry job. Verifying the product purity of cryogenically produced oxygen presents a different set of analytical chemistry challenges. This presentation will describe some of the sample acquisition and analytical chemistry challenges in verifying oxygen produced by an oxygen separator, and in verifying oxygen produced by cryogenic separation processes. The primary contaminant that causes gas samples to fail ABO requirements is water; the maximum amount of water vapor allowed is 7 ppmv. The principal challenge in verifying oxygen produced by an oxygen separator is that the oxygen is produced relatively slowly and at comparatively low temperatures. A short-term failure lasting just a few minutes in the course of a one-week run could cause an entire tank to be rejected, so continuous monitoring of oxygen purity and water vapor can identify problems as soon as they occur. Long-term oxygen separator tests were instrumented with an oxygen analyzer and with a hygrometer, a GE Moisture Monitor Series 35, which uses an aluminum oxide sensor. The user's manual does not report this, but long-term exposure to pure oxygen causes the aluminum oxide sensor head to bias dry. Oxygen product that exceeded the 7 ppmv specification was improperly accepted because the sensor had biased. The bias is permanent; exposure to air does not cause the sensor to

  1. The Logical Syntax of Number Words: Theory, Acquisition and Processing

    ERIC Educational Resources Information Center

    Musolino, Julien

    2009-01-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. "Cognition 93", 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Scalar…

  2. Chapter A5. Processing of Water Samples

    USGS Publications Warehouse

    Wilde, Franceska D.; Radtke, Dean B.; Gibs, Jacob; Iwatsubo, Rick T.

    1999-01-01

    The National Field Manual for the Collection of Water-Quality Data (National Field Manual) describes protocols and provides guidelines for U.S. Geological Survey (USGS) personnel who collect data used to assess the quality of the Nation's surface-water and ground-water resources. This chapter addresses methods to be used in processing water samples to be analyzed for inorganic and organic chemical substances, including the bottling of composite, pumped, and bailed samples and subsamples; sample filtration; solid-phase extraction for pesticide analyses; sample preservation; and sample handling and shipping. Each chapter of the National Field Manual is published separately and revised periodically. Newly published and revised chapters will be announced on the USGS Home Page on the World Wide Web under 'New Publications of the U.S. Geological Survey.' The URL for this page is http://water.usgs.gov/lookup/get?newpubs.

  3. Quality evaluation of processed clay soil samples

    PubMed Central

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku

    2016-01-01

    Introduction This study assessed the microbial quality of clay samples sold on two of the major Ghanaian markets. Methods The study was a cross-sectional evaluation of processed clay and the effects it has on the nutrition of consumers in the capital of Ghana. The items examined were processed clay soil samples. Results Staphylococcus spp and fecal coliforms including Klebsiella, Escherichia, Shigella and Enterobacter spp were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable count (6.5 Log cfu/g) and staphylococcal count (5.8 Log cfu/g). For fecal coliforms, Madina market samples had the highest count (6.5 Log cfu/g) and also recorded the highest levels of yeast and mould. For Koforidua, the total viable count was highest in samples from the Zongo market (6.3 Log cfu/g). Central market samples had the highest counts of fecal coliforms (4.6 Log cfu/g) and yeasts and moulds (6.5 Log cfu/g). “Small” market samples recorded the highest staphylococcal count (6.2 Log cfu/g). The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra, respectively. Conclusion The clay samples were found to contain Klebsiella spp., Escherichia, Enterobacter, Shigella spp., Staphylococcus spp., yeast and mould. These have health implications when consumed.

  5. Mars sampling strategy and aeolian processes

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald

    1988-01-01

    It is critical that the geological context of planetary samples (both in situ analyses and returned samples) be well known and documented. Apollo experience showed that this goal is often difficult to achieve even for a planet on which surficial processes are relatively restricted. On Mars, the variety of present and past surface processes is much greater than on the Moon, and establishing the geological context of samples will be much more difficult. In addition to impact cratering, Mars has been modified by running water, periglacial activity, wind, and other processes, all of which have the potential for profoundly affecting the geological integrity of potential samples. Aeolian, or wind, processes are ubiquitous on Mars. In the absence of liquid water on the surface, aeolian activity dominates the present surface as documented by frequent dust storms (both local and global), landforms such as dunes, and variable features, i.e., albedo patterns which change their size, shape, and position with time in response to the wind.

  6. Proposed Science Requirements and Acquisition Priorities for the First Mars Sample Return

    NASA Technical Reports Server (NTRS)

    Agee, C. B.; Bogard, D. D.; Draper, D. S.; Jones, J. H.; Meyer , C., Jr.; Mittlefehldt, D. W.

    2000-01-01

    A sample return mission is an important next step in the exploration of Mars. The first sample return should come early in the program time-line because the science derived from earth-based analyses of samples provides crucial "ground truth" needed for further exploration planning, enhancement of remote measurements, and achieving science goals and objectives that include: (1) the search for environments that may support life and any indicators of the past or present existence of life, (2) understanding the history of water and climate on Mars, (3) understanding the evolution of Mars as a planet. Returned samples from Mars will have unique value because they can be studied by scientists worldwide using the most powerful analytical instruments available. Furthermore, returned Mars samples can be preserved for studies by future generations of scientists using new techniques and addressing new issues in Mars science. To ensure a high likelihood of success, the first sample return strategy should be simple and focused. We outline a fundamental set of sample requirements and acquisition priorities for Mars sample return.

  7. A robust adaptive sampling method for faster acquisition of MR images.

    PubMed

    Vellagoundar, Jaganathan; Machireddy, Ramasubba Reddy

    2015-06-01

    A robust adaptive k-space sampling method is proposed for faster acquisition and reconstruction of MR images. In this method, undersampling patterns are generated based on the magnitude profile of fully acquired 2-D k-space data. Images are reconstructed using a compressive sampling reconstruction algorithm. Simulation experiments are done to assess the performance of the proposed method under various signal-to-noise ratio (SNR) levels. The performance of the method is better than the non-adaptive variable density sampling method when k-space SNR is greater than 10 dB. The method is implemented on fully acquired multi-slice raw k-space data and quality assurance phantom data. Data reduction of up to 60% is achieved in the multi-slice imaging data and 75% in the phantom imaging data. The results show that reconstruction accuracy is improved over non-adaptive or conventional variable density sampling methods. The proposed sampling method is signal dependent and the estimation of sampling locations is robust to noise. As a result, it eliminates the necessity of a mathematical model and parameter tuning to compute k-space sampling patterns, as required in non-adaptive sampling methods.
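    The core idea of deriving an undersampling pattern from the k-space magnitude profile can be sketched as follows. This is an illustrative reconstruction of the approach, not the authors' code; the function name and the 40% retention figure (matching the 60% data reduction reported above) are chosen for the example.

    ```python
    import numpy as np

    def adaptive_mask(kspace: np.ndarray, keep_fraction: float, seed=None) -> np.ndarray:
        """Draw an undersampling mask whose density follows the k-space magnitude.

        High-magnitude (typically low-frequency) locations are sampled with
        higher probability, so the pattern adapts to the signal itself rather
        than to a fixed variable-density model.
        """
        rng = np.random.default_rng(seed)
        mag = np.abs(kspace).ravel()
        prob = mag / mag.sum()                      # sampling density from the magnitude profile
        n_keep = int(keep_fraction * kspace.size)
        idx = rng.choice(kspace.size, size=n_keep, replace=False, p=prob)
        mask = np.zeros(kspace.size, dtype=bool)
        mask[idx] = True
        return mask.reshape(kspace.shape)

    # Keep 40% of the k-space samples, i.e. the 60% data reduction reported above.
    kspace = np.fft.fft2(np.random.default_rng(0).random((64, 64)))
    mask = adaptive_mask(kspace, keep_fraction=0.4, seed=0)
    ```

    The retained samples (kspace * mask) would then feed a compressive-sensing reconstruction.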

  8. Surface studies of plasma processed Nb samples

    SciTech Connect

    Tyagi, Puneet V; Doleans, Marc; Hannah, Brian S; Afanador, Ralph; Stewart, Stephen; Mammosser, John; Howell, Matthew P; Saunders, Jeffrey W; Degraff, Brian D; Kim, Sang-Ho

    2015-01-01

    Contaminants present at the top surface of superconducting radio frequency (SRF) cavities can act as field emitters and restrict the cavity accelerating gradient. A room-temperature in-situ plasma processing technology for SRF cavities, aiming to clean hydrocarbons from the inner surface of cavities, has recently been developed at the Spallation Neutron Source (SNS). Surface studies of the plasma-processed Nb samples by secondary ion mass spectrometry (SIMS) and scanning Kelvin probe (SKP) showed that the Ne/O2 plasma processing is very effective at removing carbonaceous contaminants from the top surface and improves the surface work function by 0.5 to 1.0 eV.

  9. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, John R.

    1997-01-01

    A method and apparatus for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register.
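    The mask-bit trick described above can be illustrated with a short sketch. Assuming IEEE-754 single precision, a prestored mask equal to the bit pattern of 2**23 lets a 14-bit sample be OR-ed directly into the mantissa, yielding a valid float word with no conversion step; the names and the specific mask value here are illustrative, not taken from the patent.

    ```python
    import struct

    MASK = 0x4B000000  # bit pattern of float32 2.0**23; the low mantissa bits are free

    def pack(sample14: int) -> float:
        """Concatenate the prestored mask bits with a 14-bit sample, producing a
        valid 32-bit float whose value is 2**23 + sample14."""
        assert 0 <= sample14 < 1 << 14
        word = MASK | sample14
        return struct.unpack("<f", struct.pack("<I", word))[0]

    def unpack(value: float) -> int:
        """Recover the 14-bit sample by removing the constant offset."""
        return int(value) - (1 << 23)

    # Round trip: the processor reads the word as a float, no int-to-float cast needed.
    assert unpack(pack(12345)) == 12345
    ```

    Because every integer below 2**24 is exactly representable in single precision, the round trip is lossless for all 14-bit values.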

  10. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, J.R.

    1997-02-11

    A method and apparatus are disclosed for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register. 15 figs.

  11. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  12. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    NASA Astrophysics Data System (ADS)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

    The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware, and state estimation of the hardware, the lithology being drilled, and the state of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Rio Tinto, Spain. DAME focused instead on the problem of controlling the drill while actively drilling without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The top-level goal of the 2006 DAME in-situ drilling tests was to verify and demonstrate a capability for hands-off automated drilling at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second was to operate for three or more hours autonomously, hands-off. The third was to exceed 3 m depth into the frozen breccia and permafrost with the DAME drill (it had not gone deeper than 2.2 m previously). Five of the six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4-hour sequence with no human presence nearby), and the total depth reached was 3.2 m. Ground-truth drilling used small commercial drilling equipment in parallel in

  13. Nano-Scale Sample Acquisition Systems for Small Class Exploration Spacecraft

    NASA Astrophysics Data System (ADS)

    Paulsen, G.

    2015-12-01

    The paradigm for space exploration is changing. Large and expensive missions are very rare, and the space community is turning to smaller, lighter, and less expensive missions that can still perform valuable exploration. These missions are also within reach of commercial companies such as the Google Lunar X Prize teams that develop small-scale lunar missions. Recent commercial endeavors such as Planet Labs, Inc. and Skybox Imaging, Inc. show that there are new benefits and business models associated with the miniaturization of space hardware. The Nano-Scale Sample Acquisition System includes the NanoDrill for capture of small rock cores and PlanetVac for capture of surface regolith. These two systems are part of the ongoing effort to develop "Micro Sampling" systems for deployment by small spacecraft with limited payload capacities. Ideal applications include prospecting missions to the Moon and asteroids. The MicroDrill is a rotary-percussive coring drill that captures cores 7 mm in diameter and up to 2 cm long. The drill weighs less than 1 kg and can capture a core from a 40 MPa strength rock within a few minutes, with less than 10 W of power and less than 10 N of preload. The PlanetVac is a pneumatic regolith acquisition system that can capture a surface sample in a touch-and-go maneuver. These sampling systems were integrated within the footpads of a commercial quadcopter for testing; as such, they could also be used by geologists on Earth to explore difficult-to-reach locations.

  14. Oral processing of two milk chocolate samples.

    PubMed

    Carvalho-da-Silva, Ana Margarida; Van Damme, Isabella; Taylor, Will; Hort, Joanne; Wolf, Bettina

    2013-02-26

    Oral processing of two milk chocolates, identical in composition and viscosity, was investigated to understand the textural behaviour. Previous studies had shown differences in mouthcoating and related attributes such as time of clearance from the oral cavity to be most discriminating between the samples. Properties of panellists' saliva, with regard to protein concentration and profile before and after eating the two chocolates, were included in the analysis but did not reveal any correlation with texture perception. The microstructure of the chocolate samples following oral processing, which resembled an emulsion as the chocolate phase inverts in-mouth, was clearly different and the sample that was found to be more mouthcoating appeared less flocculated after 20 chews. The differences in flocculation behaviour were mirrored in the volume based particle size distributions acquired with a laser diffraction particle size analyser. The less mouthcoating and more flocculated sample showed a clear bimodal size distribution with peaks at around 40 and 500 μm, for 10 and 20 chews, compared to a smaller and then diminishing second peak for the other sample following 10 and 20 chews, respectively. The corresponding mean particle diameters after 20 chews were 184 ± 23 and 141 ± 10 μm for the less and more mouthcoating samples, respectively. Also, more of the mouthcoating sample had melted after both 10 and 20 chews (80 ± 8% compared to 72 ± 10% for 20 chews). Finally, the friction behaviour between a soft and hard surface (elastopolymer/steel) and at in-mouth temperature was investigated using a commercial tribology attachment on a rotational rheometer. Complex material behaviour was revealed. Observations included an unusual increase in friction coefficient at very low sliding speeds, initially overlapping for both samples, to a threefold higher value for the more mouthcoating sample. This was followed by a commonly observed decrease in friction coefficient with

  16. Developmental Stages in Receptive Grammar Acquisition: A Processability Theory Account

    ERIC Educational Resources Information Center

    Buyl, Aafke; Housen, Alex

    2015-01-01

    This study takes a new look at the topic of developmental stages in the second language (L2) acquisition of morphosyntax by analysing receptive learner data, a language mode that has hitherto received very little attention within this strand of research (for a recent and rare study, see Spinner, 2013). Looking at both the receptive and productive…

  17. Cognitive Skill Acquisition through a Meta-Knowledge Processing Model.

    ERIC Educational Resources Information Center

    McKay, Elspeth

    2002-01-01

    The purpose of this paper is to reopen the discourse on cognitive skill acquisition to focus on the interactive effect of differences in cognitive construct and instructional format. Reports an examination of the contextual issues involved in understanding the interactivity of instructional conditions and cognitive style as a meta-knowledge…

  18. Metadiscursive Processes in the Acquisition of a Second Language.

    ERIC Educational Resources Information Center

    Giacomi, Alain; Vion, Robert

    1986-01-01

    The acquisition of narrative competence in French by an Arabic-speaking migrant worker in interactions with target language speakers was explored, with hypotheses formed about the polyfunctional uses of certain forms to mark the chronology of events in the narrative or to introduce quoted speech. (Author/CB)

  19. Erosion Modeling in Central China - Soil Data Acquisition by Conditioned Latin Hypercube Sampling and Incorporation of Legacy Data

    NASA Astrophysics Data System (ADS)

    Stumpf, Felix; Schönbrodt-Stitt, Sarah; Schmidt, Karsten; Behrens, Thorsten; Scholten, Thomas

    2013-04-01

    The Three Gorges Dam at the Yangtze River in Central China is a prominent example of human-induced environmental impacts. Over the course of a year, the water table at the main river fluctuates by about 30 m due to impoundment and drainage activities. The dynamic water table implies a range of georisks such as soil erosion, mass movements, sediment transport, and diffuse matter inputs into the reservoir. Within the framework of the joint Sino-German project YANGTZE GEO, the subproject "Soil Erosion" deals with soil erosion risks and sediment transport pathways into the reservoir. The study site is a small catchment (4.8 km²) in Badong, approximately 100 km upstream of the dam. It is characterized by scattered plots of agricultural land use and resettlements in a largely wooded, steeply sloping, mountainous area. Our research focuses on data acquisition and processing to develop a process-oriented erosion model. Area-covering knowledge of specific soil properties in the catchment is an essential input parameter; it is acquired by means of digital soil mapping (DSM), in which soil properties are estimated from covariates and the estimation functions are calibrated with soil property samples. The DSM approach is based on an appropriate sample design, which reflects the heterogeneity of the catchment with regard to the covariates that influence the relevant soil properties. In this approach the covariates, derived by digital terrain analysis, are slope, altitude, profile curvature, plan curvature, and aspect. For the sample design we chose the Conditioned Latin Hypercube Sampling (cLHS) procedure (Minasny and McBratney, 2006). It provides an efficient method of sampling variables from their multivariate distribution: a sample of size n is drawn from multiple variables such that for each variable the sample is marginally maximally stratified. The method ensures maximal stratification by two features: First, number of
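    The marginal stratification property that cLHS preserves can be illustrated with a plain Latin hypercube sketch; the conditioning step on the empirical covariate distribution from Minasny and McBratney is omitted, and the function name and sizes are illustrative.

    ```python
    import numpy as np

    def latin_hypercube(n_samples: int, n_vars: int, seed=None) -> np.ndarray:
        """Plain Latin hypercube design on [0, 1): for every variable the
        n_samples draws are maximally stratified, exactly one per 1/n stratum."""
        rng = np.random.default_rng(seed)
        # one uniform draw inside each of the n strata, per variable
        u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
        # permute each column independently so strata are paired at random
        for j in range(n_vars):
            u[:, j] = rng.permutation(u[:, j])
        return u

    # 10 sample sites over 5 terrain covariates (slope, altitude, curvatures, aspect)
    design = latin_hypercube(10, 5, seed=0)
    ```

    Conditioned LHS would additionally pick, within each stratum, the candidate location whose covariate values best match the multivariate distribution observed in the catchment.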

  20. Software interface and data acquisition package for the LakeShore cryogenics vibrating sample magnetometer

    SciTech Connect

    O'Dell, B.H.

    1995-11-01

    A software package was developed to replace the software provided by LakeShore for their model 7300 vibrating sample magnetometer (VSM). Several problems with the original software's functionality caused this group to seek a new software package. The new software provides many features that were unsupported in the LakeShore software, including a more functional step mode, a point-averaging mode, vector moment measurements, and calibration for field offset. The developed software interfaces with the VSM through a menu-driven graphical user interface and bypasses the VSM's on-board processor, leaving control of the VSM up to the software. The source code for this software is readily available to anyone. By having the source, the experimentalist has full control of data acquisition and can add routines specific to their experiment.

  1. Comet assay: rapid processing of multiple samples.

    PubMed

    McNamee, J P; McLean, J R; Ferrarotto, C L; Bellier, P V

    2000-03-01

    The present study describes modifications to the basic comet protocol that increase productivity and efficiency without sacrificing assay reliability. A simple technique is described for rapidly preparing up to 96 comet assay samples simultaneously. The sample preparation technique allows thin layers of agarose-embedded cells to be prepared in multiple wells attached to a flexible film of Gelbond, which improves the ease of manipulating and processing samples. To evaluate the effect of these modifications on assay sensitivity, dose-response curves are presented for DNA damage induced by exposure of TK6 cells to low concentrations of hydrogen peroxide (0-10 microM) and for exposure of human lymphocytes to X-irradiation (0-100 cGy). The limit of detection of DNA damage induced by hydrogen peroxide in TK6 cells was observed to be 1 μM for all parameters (tail ratio, tail moment, tail length and comet length) while the limit of detection of DNA damage in human lymphocytes was 10 cGy for tail and comet length parameters, but 50 cGy for tail ratio and tail moment parameters. These results are similar to those previously reported using the conventional alkaline comet assay. The application of SYBR Gold for detection of DNA damage was compared to that of propidium iodide. Measurements of matching samples for tail length and comet length were similar using both stains. However, comets stained with SYBR Gold persisted longer and were much brighter than those obtained with propidium iodide. SYBR Gold was found to be ideal for measuring tail length and comet length but, under present assay conditions, impractical for measuring tail ratio or tail moment due to saturation of staining in the head region of the comets. PMID:10751727

  2. The Materials Acquisition Process at the University of Technology, Sydney: Equitable Transparent Allocation of Funds.

    ERIC Educational Resources Information Center

    O'Connor, Steve; Flynn, Ann; Lafferty, Susan

    1998-01-01

    Discusses the development of a library acquisition allocation formula at the University of Technology, Sydney. Covers the items included, consultative process adopted, details of the formulae derived and their implementation. (Author)

  3. The acquisition process of musical tonal schema: implications from connectionist modeling

    PubMed Central

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical ‘scale’ sensitivity early and ‘harmony’ sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music. PMID:26441725

  4. Xenbase; core features, data acquisition and data processing

    PubMed Central

    James-Zorn, Christina; Ponferrada, Virgillio G.; Burns, Kevin A.; Fortriede, Joshua D.; Lotay, Vaneet S.; Liu, Yu; Karpinka, J. Brad; Karimi, Kamran; Zorn, Aaron M.; Vize, Peter D.

    2015-01-01

    Xenbase, the Xenopus model organism database (www.xenbase.org), is a cloud-based, web accessible resource that integrates the diverse genomic and biological data from Xenopus research. Xenopus frogs are one of the major vertebrate animal models used for biomedical research, and Xenbase is the central repository for the enormous amount of data generated using this model tetrapod. The goal of Xenbase is to accelerate discovery by enabling investigators to make novel connections between molecular pathways in Xenopus and human disease. Our relational database and user-friendly interface make these data easy to query, and allows investigators to quickly interrogate and link different data types in ways that would otherwise be difficult, time consuming, or impossible. Xenbase also enhances the value of these data through high quality gene expression curation and data integration, by providing bioinformatics tools optimized for Xenopus experiments, and by linking Xenopus data to other model organisms and to human data. Xenbase draws in data via pipelines that download data, parse the content, and save them into appropriate files and database tables. Furthermore, Xenbase makes these data accessible to the broader biomedical community by continually providing annotated data updates to organizations such as NCBI, UniProtKB and Ensembl. Here we describe our bioinformatics, genome-browsing tools, data acquisition and sharing, our community submitted and literature curation pipelines, text-mining support, gene page features and the curation of gene nomenclature and gene models. PMID:26150211

  5. Xenbase: Core features, data acquisition, and data processing.

    PubMed

    James-Zorn, Christina; Ponferrada, Virgillio G; Burns, Kevin A; Fortriede, Joshua D; Lotay, Vaneet S; Liu, Yu; Brad Karpinka, J; Karimi, Kamran; Zorn, Aaron M; Vize, Peter D

    2015-08-01

    Xenbase, the Xenopus model organism database (www.xenbase.org), is a cloud-based, web-accessible resource that integrates the diverse genomic and biological data from Xenopus research. Xenopus frogs are one of the major vertebrate animal models used for biomedical research, and Xenbase is the central repository for the enormous amount of data generated using this model tetrapod. The goal of Xenbase is to accelerate discovery by enabling investigators to make novel connections between molecular pathways in Xenopus and human disease. Our relational database and user-friendly interface make these data easy to query and allows investigators to quickly interrogate and link different data types in ways that would otherwise be difficult, time consuming, or impossible. Xenbase also enhances the value of these data through high-quality gene expression curation and data integration, by providing bioinformatics tools optimized for Xenopus experiments, and by linking Xenopus data to other model organisms and to human data. Xenbase draws in data via pipelines that download data, parse the content, and save them into appropriate files and database tables. Furthermore, Xenbase makes these data accessible to the broader biomedical community by continually providing annotated data updates to organizations such as NCBI, UniProtKB, and Ensembl. Here, we describe our bioinformatics, genome-browsing tools, data acquisition and sharing, our community submitted and literature curation pipelines, text-mining support, gene page features, and the curation of gene nomenclature and gene models.

  7. Learning (Not) to Predict: Grammatical Gender Processing in Second Language Acquisition

    ERIC Educational Resources Information Center

    Hopp, Holger

    2016-01-01

    In two experiments, this article investigates the predictive processing of gender agreement in adult second language (L2) acquisition. We test (1) whether instruction on lexical gender can lead to target predictive agreement processing and (2) how variability in lexical gender representations moderates L2 gender agreement processing. In a…

  8. System safety management lessons learned from the US Army acquisition process

    SciTech Connect

    Piatt, J.A.

    1989-05-01

    The Assistant Secretary of the Army for Research, Development and Acquisition directed the Army Safety Center to provide an audit of the causes of accidents and safety-of-use restrictions on recently fielded systems by tracking residual hazards back through the acquisition process. The objective was to develop "lessons learned" that could be applied to the acquisition process to minimize mishaps in fielded systems. System safety management lessons learned are defined as Army practices or policies, derived from past successes and failures, that are expected to be effective in eliminating or reducing specific systemic causes of residual hazards. They are broadly applicable and supportive of the Army structure and acquisition objectives. Pacific Northwest Laboratory (PNL) was given the task of conducting an independent, objective appraisal of the Army's system safety program in the context of the Army materiel acquisition process by focusing on four fielded systems which are products of that process. These systems included the Apache helicopter, the Bradley Fighting Vehicle (BFV), the Tube-launched, Optically-tracked, Wire-guided (TOW) missile, and the High Mobility Multipurpose Wheeled Vehicle (HMMWV). The objective of this study was to develop system safety management lessons learned associated with the acquisition process. The first step was to identify residual hazards associated with the selected systems. Since it was impossible to track all residual hazards through the acquisition process, certain well-known, high-visibility hazards were selected for detailed tracking. These residual hazards illustrate a variety of systemic problems. Systemic or process causes were identified for each residual hazard and analyzed to determine why they exist. System safety management lessons learned were developed to address related systemic causal factors. 29 refs., 5 figs.

  9. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis, and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01° of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
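    The stage separation described above can be sketched as a producer-consumer pipeline: independent workers joined by bounded queues, so a slow stage backs up its own queue instead of stalling the acquisition front end. This is an illustrative sketch only; DDS-Suite's actual architecture, stage names, and data format are not given in the record.

```python
import queue
import threading

# Minimal sketch of stage decoupling: acquisition, processing, and analysis
# run as independent workers joined by bounded queues. (Illustrative only.)

raw_q = queue.Queue(maxsize=64)
processed_q = queue.Queue(maxsize=64)
results = []

def acquire(n_blocks):
    for i in range(n_blocks):
        raw_q.put([i] * 8)           # stand-in for a block of samples
    raw_q.put(None)                  # end-of-stream sentinel

def process():
    while (block := raw_q.get()) is not None:
        processed_q.put(sum(block))  # stand-in for filtering/FFT work
    processed_q.put(None)

def analyze():
    while (value := processed_q.get()) is not None:
        results.append(value)

threads = [threading.Thread(target=acquire, args=(10,)),
           threading.Thread(target=process),
           threading.Thread(target=analyze)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

    The bounded `maxsize` is the design point: a stalled analysis stage eventually blocks only the processing stage's `put`, leaving acquisition free until its own queue fills.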

  10. Cognitive processes during fear acquisition and extinction in animals and humans

    PubMed Central

    Hofmann, Stefan G.

    2007-01-01

    Anxiety disorders are highly prevalent. Fear conditioning and extinction learning in animals often serve as simple models of fear acquisition and exposure therapy of anxiety disorders in humans. This article reviews the empirical and theoretical literature on cognitive processes in fear acquisition, extinction, and exposure therapy. It is concluded that exposure therapy is a form of cognitive intervention that specifically changes the expectancy of harm. Implications for therapy research are discussed. PMID:17532105

  11. An Information Processing Approach to Skill Acquisition: Perception and Timing.

    ERIC Educational Resources Information Center

    Rothstein, Anne L.

    In order to understand learners and players in relation to environments typically found in sport, it is necessary to first understand the individual as an information processor who must sample information from the environment, interpret it, organize or select an appropriate motor response, and execute that response. One of the most difficult…

  12. Automated collection and processing of environmental samples

    DOEpatents

    Troyer, Gary L.; McNeece, Susan G.; Brayton, Darryl D.; Panesar, Amardip K.

    1997-01-01

    For monitoring an environmental parameter such as the level of nuclear radiation, at distributed sites, bar coded sample collectors are deployed and their codes are read using a portable data entry unit that also records the time of deployment. The time and collector identity are cross referenced in memory in the portable unit. Similarly, when later recovering the collector for testing, the code is again read and the time of collection is stored as indexed to the sample collector, or to a further bar code, for example as provided on a container for the sample. The identity of the operator can also be encoded and stored. After deploying and/or recovering the sample collectors, the data is transmitted to a base processor. The samples are tested, preferably using a test unit coupled to the base processor, and again the time is recorded. The base processor computes the level of radiation at the site during exposure of the sample collector, using the detected radiation level of the sample, the delay between recovery and testing, the duration of exposure and the half life of the isotopes collected. In one embodiment, an identity code and a site code are optically read by an image grabber coupled to the portable data entry unit.

  13. Sensor Data Acquisition and Processing Parameters for Human Activity Classification

    PubMed Central

    Bersch, Sebastian D.; Azzi, Djamel; Khusainov, Rinat; Achumba, Ifeyinwa E.; Ries, Jana

    2014-01-01

    It is known that parameter selection for data sampling frequency and segmentation techniques (including different methods and window sizes) has an impact on classification accuracy. For Ambient Assisted Living (AAL), no clear guidance for selecting these parameters exists; hence, wide variety and inconsistency are observed across today's literature. This paper presents an empirical investigation of different data sampling rates, segmentation techniques, and segmentation window sizes, and their effect on the accuracy of Activity of Daily Living (ADL) event classification and on computational load, for two different accelerometer sensor datasets. The study is conducted using an ANalysis Of VAriance (ANOVA) based on 32 different window sizes, three different segmentation algorithms (with and without overlap, totaling six different parameter combinations) and six sampling frequencies for nine common classification algorithms. The classification accuracy is based on a feature vector consisting of Root Mean Square (RMS), Mean, Signal Magnitude Area (SMA), Signal Vector Magnitude (here SMV), Energy, Entropy, FFTPeak, and Standard Deviation (STD). The results are presented alongside recommendations for parameter selection on the basis of the best-performing parameter combinations, identified by means of the corresponding Pareto curve. PMID:24599189
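    The windowing and feature extraction under study can be sketched as follows: a few of the listed features (RMS, Mean, SMA, SMV, Energy) computed over sliding windows with a configurable size and overlap. The function name and layout are illustrative, not the authors' code.

```python
import numpy as np

def window_features(signal_xyz, window, overlap=0.5):
    """Compute per-window features from tri-axial accelerometer data.

    signal_xyz : (N, 3) array of x/y/z acceleration samples
    window     : window size in samples
    overlap    : fractional overlap between consecutive windows
    """
    step = max(1, int(window * (1.0 - overlap)))
    feats = []
    for start in range(0, len(signal_xyz) - window + 1, step):
        w = signal_xyz[start:start + window]
        mag = np.linalg.norm(w, axis=1)            # per-sample vector magnitude
        feats.append([
            np.sqrt(np.mean(w ** 2)),              # RMS over all axes
            w.mean(),                              # Mean
            np.mean(np.sum(np.abs(w), axis=1)),    # Signal Magnitude Area (SMA)
            mag.mean(),                            # mean Signal Vector Magnitude (SMV)
            np.sum(w ** 2),                        # Energy
        ])
    return np.asarray(feats)
```

    Varying `window` and `overlap` here corresponds directly to the segmentation parameters whose effect the study quantifies.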

  14. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

    Jackson, G.P.; Elliott, A.

    1992-01-01

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  15. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

    Jackson, G.P.; Elliott, A.

    1992-12-31

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  16. Information Processing, Knowledge Acquisition and Learning: Developmental Perspectives.

    ERIC Educational Resources Information Center

    Hoyer, W. J.

    1980-01-01

    Several different conceptions of the relationship between learning and development are considered in this article. It is argued that dialectical and ecological developmental orientations might provide a useful basis for synthesizing the contrasting frameworks of the operant, information processing, learning theory, and knowledge acquisition…

  17. Semantic Context and Graphic Processing in the Acquisition of Reading.

    ERIC Educational Resources Information Center

    Thompson, G. B.

    1981-01-01

    Two experiments provided tests of predictions about children's use of semantic contextual information in reading, under conditions of minimal experience with graphic processes. Subjects, aged 6 1/2, 8, and 11, orally read passages of continuous text with normal and with low semantic constraints under various graphic conditions, including cursive…

  18. Executive and Phonological Processes in Second-Language Acquisition

    ERIC Educational Resources Information Center

    Engel de Abreu, Pascale M. J.; Gathercole, Susan E.

    2012-01-01

    This article reports a latent variable study exploring the specific links among executive processes of working memory, phonological short-term memory, phonological awareness, and proficiency in first (L1), second (L2), and third (L3) languages in 8- to 9-year-olds experiencing multilingual education. Children completed multiple L1-measures of…

  19. Whole-body MR angiography using variable density sampling and dual-injection bolus-chase acquisition.

    PubMed

    Du, Jiang; Korosec, Frank R; Wu, Yijing; Grist, Thomas M; Mistretta, Charles A

    2008-02-01

    Conventional bolus-chase acquisition generates peripheral runoff images using a single injection of the contrast material. Low spatial resolution, small slice coverage and venous contamination are major problems especially in the distal stations. A technique is presented herein in which whole-body magnetic resonance angiography is performed using a dual-contrast-injection four-station acquisition protocol. Bolus sharing was performed between two stations: the abdomen and calf stations share the first bolus injection, while the thorax and thigh stations share the second bolus injection. The combination of variable density sampling and elliptical centric acquisition order was applied to the abdomen and thorax stations. The scan time was extended to generate high spatial resolution arterial phase images with broad slice coverage for the calf and thigh stations. The feasibility of this technique was demonstrated using phantom and in vivo human volunteer studies.
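    The combination of variable-density sampling with a centric view order can be sketched for the ky-kz phase-encode plane as follows. The density function, radii, and names are illustrative assumptions; the paper's actual sampling design is not reproduced, and a circular rather than truly elliptical density is used for brevity.

```python
import numpy as np

def variable_density_order(ny, nz, full_radius=0.2, undersample=0.25, seed=0):
    """Sketch: variable-density selection of ky-kz phase encodes plus a
    centric acquisition order (k-space center acquired first).

    Points inside `full_radius` (normalized) are all kept; outside,
    points are kept with probability `undersample`. All parameters are
    illustrative.
    """
    rng = np.random.default_rng(seed)
    ky, kz = np.meshgrid(np.linspace(-1, 1, ny), np.linspace(-1, 1, nz),
                         indexing="ij")
    r = np.sqrt(ky ** 2 + kz ** 2)
    keep = (r <= full_radius) | (rng.random((ny, nz)) < undersample)
    order = np.argsort(r[keep])        # centric: smallest radius first
    return np.argwhere(keep)[order]    # (n_kept, 2) acquisition order
```

    Acquiring the contrast-defining center first is what lets the arterial phase be captured before venous contamination arrives, while the undersampled periphery extends coverage.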

  20. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  1. Data acquisition and online processing requirements for experimentation at the Superconducting Super Collider

    SciTech Connect

    Lankford, A.J.; Barsotti, E.; Gaines, I.

    1989-07-01

    Differences in scale between data acquisition and online processing requirements for detectors at the Superconducting Super Collider and systems for existing large detectors will require new architectures and technological advances in these systems. Emerging technologies will be employed for data transfer, processing, and recording. 9 refs., 3 figs.

  2. A dual process account of coarticulation in motor skill acquisition.

    PubMed

    Shah, Ashvin; Barto, Andrew G; Fagg, Andrew H

    2013-01-01

    Many tasks, such as typing a password, are decomposed into a sequence of subtasks that can be accomplished in many ways. Behavior that accomplishes subtasks in ways that are influenced by the overall task is often described as "skilled" and exhibits coarticulation. Many accounts of coarticulation use search methods that are informed by representations of the objectives that define skilled behavior. While they aid in describing the strategies the nervous system may follow, they are computationally complex and may be difficult to attribute to brain structures. Here, the authors present a biologically-inspired account whereby skilled behavior is developed through 2 simple processes: (a) a corrective process that ensures that each subtask is accomplished, but does not do so skillfully, and (b) a reinforcement learning process that finds better movements using trial-and-error search that is not informed by representations of any objectives. The authors implement their account as a computational model controlling a simulated two-armed kinematic "robot" that must hit a sequence of goals with its hands. Behavior displays coarticulation in terms of which hand was chosen, how the corresponding arm was used, and how the other arm was used, suggesting that the account can participate in the development of skilled behavior. PMID:24116847

  3. Phases of learning: How skill acquisition impacts cognitive processing.

    PubMed

    Tenison, Caitlin; Fincham, Jon M; Anderson, John R

    2016-06-01

    This fMRI study examines the changes in participants' information processing as they repeatedly solve the same mathematical problem. We show that the majority of practice-related speedup is produced by discrete changes in cognitive processing. Because the points at which these changes take place vary from problem to problem, and the underlying information processing steps vary in duration, the existence of such discrete changes can be hard to detect. Using two converging approaches, we establish the existence of three learning phases. When solving a problem in one of these learning phases, participants can go through three cognitive stages: Encoding, Solving, and Responding. Each cognitive stage is associated with a unique brain signature. Using a bottom-up approach combining multi-voxel pattern analysis and hidden semi-Markov modeling, we identify the duration of each stage on any particular trial from participants' brain activation patterns. For our top-down approach we developed an ACT-R model of these cognitive stages and simulated how they change over the course of learning. The Solving stage of the first learning phase is long and involves a sequence of arithmetic computations. Participants transition to the second learning phase when they can retrieve the answer, thereby drastically reducing the duration of the Solving stage. With continued practice, participants then transition to the third learning phase when they recognize the problem as a single unit and produce the answer as an automatic response. The duration of this third learning phase is dominated by the Responding stage.

  4. Is Children's Acquisition of the Passive a Staged Process? Evidence from Six- and Nine-Year-Olds' Production of Passives

    ERIC Educational Resources Information Center

    Messenger, Katherine; Branigan, Holly P.; McLean, Janet F.

    2012-01-01

    We report a syntactic priming experiment that examined whether children's acquisition of the passive is a staged process, with acquisition of constituent structure preceding acquisition of thematic role mappings. Six-year-olds and nine-year-olds described transitive actions after hearing active and passive prime descriptions involving the same or…

  5. High throughput sample processing and automated scoring.

    PubMed

    Brunborg, Gunnar; Jackson, Petra; Shaposhnikov, Sergey; Dahl, Hildegunn; Azqueta, Amaya; Collins, Andrew R; Gutzkow, Kristine B

    2014-01-01

    The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput (HT) modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to HT are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. HT methods save time and money but they are useful also for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The HT modifications now available vary largely in their versatility, capacity, complexity, and costs. The bottleneck for further increase of throughput appears to be the scoring. PMID:25389434

  6. Multi-channel high-speed CMOS image acquisition and pre-processing system

    NASA Astrophysics Data System (ADS)

    Sun, Chun-feng; Yuan, Feng; Ding, Zhen-liang

    2008-10-01

    A new multi-channel high-speed CMOS image acquisition and pre-processing system is designed to realize image acquisition, data transmission, timing control, and simple image processing with a high-speed CMOS image sensor. The modular structure, LVDS transmission, and ping-pong caching used in the image data acquisition sub-system ensure real-time data acquisition and transmission. Furthermore, a new adaptive-threshold histogram equalization algorithm, based on reassigning redundant gray levels, is incorporated in the image pre-processing module of the FPGA. An iterative method is used to set the threshold value, and redundant gray levels are redistributed rationally in proportion to the gray-level interval. Over-enhancement of the background is restrained and the risk of merging foreground details is reduced. Experiments show that the system can acquire, transmit, store, and pre-process image data at 590 MPixels/s, and supports the design and realization of subsequent systems.
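    As a baseline for the pre-processing module, classic histogram equalization can be sketched as follows. The paper's variant additionally redistributes under-used ("redundant") gray levels with an iteratively chosen threshold, which is not reproduced here; this is only the textbook algorithm it builds on.

```python
import numpy as np

def histogram_equalize(img):
    """Classic histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    # Map each gray level through the normalized cumulative histogram.
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]
```

    The over-enhancement the paper targets comes from this global remapping stretching sparse background levels; the adaptive-threshold redistribution is meant to restrain exactly that.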

  7. Acquisition times of carrier tracking sampled data phase-locked loops

    NASA Technical Reports Server (NTRS)

    Aguirre, S.

    1986-01-01

    Phase acquisition times of the type II and type III loops typical of the Advanced Receiver are studied by computer simulation when the loops are disturbed by Gaussian noise. Reliable estimates are obtained by running 5000 trials for each combination of loop signal-to-noise ratio (SNR) and frequency offset. The probabilities of acquisition are shown versus time from the start of acquisition for various loop SNRs and frequency offsets. For frequency offsets smaller than one-fourth of the loop bandwidth and for loop SNRs of 10 dB and higher, the loops acquire with probability 0.99 within 2.5/B sub L for type II loops and within 7/B sub L for type III loops.
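    The Monte Carlo procedure described (many noisy trials per SNR/offset combination, with the locked fraction giving the acquisition probability) can be sketched for a type II sampled-data loop as follows. The loop gains, noise model, and lock criterion are illustrative assumptions, not the Advanced Receiver's parameters.

```python
import numpy as np

def acquisition_prob(freq_offset, snr_db, n_trials=200, n_steps=2000,
                     k1=0.1, k2=0.005, seed=0):
    """Monte Carlo estimate of acquisition probability for a sampled-data
    type II PLL (proportional + integral branches). Illustrative only.
    """
    rng = np.random.default_rng(seed)
    sigma = 10 ** (-snr_db / 20)      # assumed phase-noise std from loop SNR
    locks = 0
    for _ in range(n_trials):
        theta_hat = integ = theta_in = 0.0
        errs = []
        for _ in range(n_steps):
            theta_in += freq_offset   # input phase ramps at the offset rate
            e = np.angle(np.exp(1j * (theta_in - theta_hat)))  # wrapped error
            e_noisy = e + rng.normal(0.0, sigma)
            integ += k2 * e_noisy     # integral branch absorbs the ramp
            theta_hat += k1 * e_noisy + integ
            errs.append(abs(e))
        # Assumed lock criterion: small mean error over the final samples.
        locks += np.mean(errs[-200:]) < 0.2
    return locks / n_trials
```

    Sweeping `freq_offset` and `snr_db` over a grid and recording the locked fraction versus time is the structure of the study's 5000-trial runs.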

  8. The acquisition of integrated science process skills in a web-based learning environment

    NASA Astrophysics Data System (ADS)

    Saat, Rohaida Mohd.

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among grade 5 children. Data were gathered primarily from children's conversations and teacher-student conversations. Analysis of the data revealed that the children acquired the skill in three phases: from the phase of recognition to the phase of familiarization and finally to the phase of automation. Nevertheless, the acquisition of the skill only involved the acquisition of certain subskills of the skill of controlling variables. This progression could be influenced by the web-based instructional material that provided declarative knowledge, concrete visualization and opportunities for practise.

  9. A Future Vision of a Data Acquisition: Distributed Sensing, Processing, and Health Monitoring

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Solano, Wanda; Thurman, Charles; Schmalzel, John

    2000-01-01

    This paper presents a vision of a highly enhanced data acquisition and health monitoring system at the NASA Stennis Space Center (SSC) rocket engine test facility. This vision includes the use of advanced processing capabilities in conjunction with highly autonomous distributed sensing and intelligence to monitor and evaluate the health of data in the context of its associated process. This method is expected to significantly reduce data acquisition costs and improve system reliability. A Universal Signal Conditioning Amplifier (USCA)-based system, under development at Kennedy Space Center, is being evaluated for adaptation to the SSC testing infrastructure. Kennedy's USCA architecture offers many advantages, including flexible and auto-configuring data acquisition with improved calibration and verifiability. Possible enhancements at SSC may include multiplexing the distributed USCAs to reduce per-channel cost, and the use of IEEE-485 to Allen-Bradley Control Net gateways for interfacing with the resident control systems.

  10. Quantitative modal determination of geological samples based on X-ray multielemental map acquisition.

    PubMed

    Cossio, Roberto; Borghi, Alessandro; Ruffini, Raffaella

    2002-04-01

    Multielemental X-ray maps collected by a remote scanning system of the electron beam are processed by a dedicated software program performing accurate modal determination of geological samples. The classification of the different mineral phases is based on elemental concentrations. The software program Petromod loads the maps into a database and computes a matrix consisting of numerical values proportional to the elemental concentrations. After an initial calibration, the program can calculate the chemical composition of a selected area on the basis of a fixed number of oxygens. In this way, it is possible to identify all the mineral phases occurring in the sample. Up to three elements can be selected to calculate the modal percentage of the identified mineral. An automated routine scans the whole set of maps and assigns each pixel that satisfies the imposed requirements to the selected phase. Repeating this procedure for every mineral phase occurring in the mapped area, a modal distribution of the rock-forming minerals can be obtained. The final output consists of a digitized image, which can be further analyzed by common image analysis software, and a table containing the calculated modal percentages. The method is here applied to a volcanic and a metamorphic rock sample. PMID:12533243
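    The per-pixel classification described (up to three elemental criteria per phase, pixels assigned to phases and tallied into modal percentages) can be sketched as follows. The function name and the rectangular (lo, hi) concentration criteria are illustrative assumptions, not Petromod's implementation.

```python
import numpy as np

def modal_percentages(maps, phase_rules):
    """Classify each pixel of a set of element maps into mineral phases
    and return modal percentages.

    maps        : dict of element name -> 2-D array of concentrations
    phase_rules : dict of phase name -> {element: (lo, hi)}, up to three
                  elements per phase as in the paper
    """
    shape = next(iter(maps.values())).shape
    assigned = np.full(shape, "unclassified", dtype=object)
    for phase, rules in phase_rules.items():
        mask = np.ones(shape, dtype=bool)
        for element, (lo, hi) in rules.items():
            mask &= (maps[element] >= lo) & (maps[element] <= hi)
        # Only pixels not claimed by an earlier phase are assigned.
        assigned[mask & (assigned == "unclassified")] = phase
    total = assigned.size
    return {p: float((assigned == p).sum()) * 100.0 / total
            for p in list(phase_rules) + ["unclassified"]}
```

    The returned table corresponds to the paper's modal-percentage output; the labeled pixel array plays the role of the digitized image passed on to image analysis software.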

  12. Characteristics of Marijuana Acquisition among a National Sample of Adolescent Users

    ERIC Educational Resources Information Center

    King, Keith A.; Merianos, Ashley L.; Vidourek, Rebecca A.

    2016-01-01

    Background: Because marijuana is becoming more accessible and perceived norms of use are becoming increasingly more favorable, research is needed to understand characteristics of marijuana acquisition among adolescents. Purpose: The study purpose was to examine whether sources and locations where adolescent users obtain and use marijuana differed…

  13. Effect of Intermittent Reinforcement on Acquisition and Retention in Delayed Matching-to-Sample in Pigeons

    ERIC Educational Resources Information Center

    Grant, Douglas S.

    2011-01-01

    Experiments 1 and 2 involved independent groups that received primary reinforcement after a correct match with a probability of 1.0, 0.50 or 0.25. Correct matches that did not produce primary reinforcement produced a conditioned reinforcer. Both experiments revealed little evidence that acquisition or retention was adversely affected by use of…

  14. Learning and Individual Differences: An Ability/Information-Processing Framework for Skill Acquisition. Final Report.

    ERIC Educational Resources Information Center

    Ackerman, Phillip L.

    A program of theoretical and empirical research focusing on the ability determinants of individual differences in skill acquisition is reviewed. An integrative framework for information-processing and cognitive ability determinants of skills is reviewed, along with principles for ability-skill relations. Experimental manipulations were used to…

  15. The Priority of Listening Comprehension over Speaking in the Language Acquisition Process

    ERIC Educational Resources Information Center

    Xu, Fang

    2011-01-01

    By elaborating on the definition of listening comprehension, the characteristics of spoken discourse, the relationship between short-term and long-term memory (STM and LTM), and Krashen's comprehensible input, the paper argues that giving priority to listening comprehension over speaking in the language acquisition process is necessary.

  16. Processes of Language Acquisition in Children with Autism: Evidence from Preferential Looking

    ERIC Educational Resources Information Center

    Swensen, Lauren D.; Kelley, Elizabeth; Fein, Deborah; Naigles, Letitia R.

    2007-01-01

    Two language acquisition processes (comprehension preceding production of word order, the noun bias) were examined in 2- and 3-year-old children (n=10) with autistic spectrum disorder and in typically developing 21-month-olds (n=13). Intermodal preferential looking was used to assess comprehension of subject-verb-object word order and the tendency…

  17. Individual Variation in Infant Speech Processing: Implications for Language Acquisition Theories

    ERIC Educational Resources Information Center

    Cristia, Alejandrina

    2009-01-01

    To what extent does language acquisition recruit domain-general processing mechanisms? In this dissertation, evidence concerning this question is garnered from the study of individual differences in infant speech perception and their predictive value with respect to language development in early childhood. In the first experiment, variation in the…

  18. Monitoring of HTS compound library quality via a high-resolution image acquisition and processing instrument.

    PubMed

    Baillargeon, Pierre; Scampavia, Louis; Einsteder, Ross; Hodder, Peter

    2011-06-01

    This report presents the high-resolution image acquisition and processing instrument for compound management applications (HIAPI-CM). The HIAPI-CM combines imaging spectroscopy and machine-vision analysis to perform rapid assessment of high-throughput screening (HTS) compound library quality. It has been customized to detect and classify typical artifacts found in HTS compound library microtiter plates (MTPs). These artifacts include (1) insufficient volume of liquid compound sample, (2) compound precipitation, and (3) colored compounds that interfere with HTS assay detection format readout. The HIAPI-CM is also configured to automatically query and compare its analysis results to data stored in a LIMS or corporate database, aiding in the detection of compound registration errors. To demonstrate its capabilities, several compound plates (n=5760 wells total) containing different artifacts were measured via automated HIAPI-CM analysis, and the results compared with those obtained by manual (visual) inspection. In all cases, the instrument demonstrated high fidelity (99.8% for empty wells; 100.1% for filled wells; 94.4% for partially filled wells; 94.0% for wells containing colored compounds), and in the case of precipitate detection, the HIAPI-CM results significantly exceeded the fidelity of visual observations (220.0%). As described, the HIAPI-CM allows for noninvasive, nondestructive MTP assessment with a diagnostic throughput of about 1 min per plate, reducing analytical expenses and improving the quality and stewardship of HTS compound libraries.
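
    The per-well classification logic can be illustrated with a toy decision rule. The feature names and thresholds below are invented for illustration; the actual HIAPI-CM derives its features from imaging spectroscopy and machine vision, not from these two scalars.

```python
# Toy per-well classifier (hypothetical features and thresholds):
# fill_fraction -- estimated liquid fill level of the well (0..1)
# colored       -- whether the well's spectrum departs from a clear-sample baseline

def classify_well(fill_fraction, colored):
    if fill_fraction < 0.05:
        return "empty"
    if fill_fraction < 0.60:
        return "partially filled"
    return "colored compound" if colored else "filled"

wells = [(0.00, False), (0.90, False), (0.30, False), (0.95, True)]
labels = [classify_well(f, c) for f, c in wells]
print(labels)  # ['empty', 'filled', 'partially filled', 'colored compound']
```

    A real pipeline would additionally compare each label against the LIMS record for that well to flag registration errors, as the abstract describes.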

  19. Transient Decline in Hippocampal Theta Activity during the Acquisition Process of the Negative Patterning Task

    PubMed Central

    Sakimoto, Yuya; Okada, Kana; Takeda, Kozue; Sakata, Shogo

    2013-01-01

    Hippocampal function is important in the acquisition of negative patterning but not of simple discrimination. This study examined rat hippocampal theta activity during the acquisition stages (early, middle, and late) of the negative patterning task (A+, B+, AB-). The results showed that hippocampal theta activity began to decline transiently (for 500 ms after non-reinforced stimulus presentation) during the late stage of learning in the negative patterning task. In addition, this transient decline in hippocampal theta activity in the late stage was lower in the negative patterning task than in the simple discrimination task. This transient decline during the late stage of task acquisition may be related to a learning process distinctive of the negative patterning task but not the simple discrimination task. We propose that the transient decline of hippocampal theta activity reflects inhibitory learning and/or response inhibition after the presentation of a compound stimulus specific to the negative patterning task. PMID:23936249

  20. Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process

    NASA Technical Reports Server (NTRS)

    Macko, Joseph A., Jr.

    2000-01-01

    The United States Army Materiel Command, Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. It has established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide the decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources, and prioritizes pollution prevention (P2) efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as an alternative to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) during the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process to evaluate new chemicals and materials prior to acceptance as an alternative. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC mission.

  1. Performance of a VME-based parallel processing LIDAR data acquisition system (summary)

    SciTech Connect

    Moore, K.; Buttler, B.; Caffrey, M.; Soriano, C.

    1995-05-01

    It may be possible to make accurate real-time, autonomous, 2- and 3-dimensional wind measurements remotely with an elastic backscatter Light Detection and Ranging (LIDAR) system by incorporating digital parallel processing hardware into the data acquisition system. In this paper, we report the performance of a commercially available digital parallel processing system in implementing the maximum correlation technique for wind sensing using actual LIDAR data. Timing and numerical accuracy are benchmarked against a standard microprocessor implementation.
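
    The maximum correlation technique referred to above can be sketched in a few lines: successive range-resolved backscatter profiles are cross-correlated, and the lag of the correlation peak gives the aerosol displacement between scans. This is an illustrative serial version (the paper's point is the parallel implementation); the profiles and scales are synthetic.

```python
import numpy as np

def wind_from_correlation(profile_t0, profile_t1, gate_m, dt_s):
    """Estimate along-beam wind speed from the lag that maximizes the
    cross-correlation of two backscatter profiles taken dt_s apart."""
    a = profile_t0 - profile_t0.mean()
    b = profile_t1 - profile_t1.mean()
    corr = np.correlate(b, a, mode="full")   # lags -(N-1) .. (N-1)
    lag = int(corr.argmax()) - (len(a) - 1)  # displacement in range gates
    return lag * gate_m / dt_s               # m/s

# Synthetic case: an aerosol feature advected by 3 range gates between scans
t0 = np.zeros(64); t0[20:24] = 1.0
t1 = np.zeros(64); t1[23:27] = 1.0
v = wind_from_correlation(t0, t1, gate_m=30.0, dt_s=10.0)
print(v)  # 9.0 (m/s)
```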

  2. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and their second, for a total of 10 months of instruction. The study aimed to characterize modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps right hemispheric recruitment was observed, with increasing left-lateralization, which is similar to other native signers and L2 learners of spoken language; however, specialization for sign language processing with activation in the inferior parietal lobule (i.e., angular gyrus), even for late learners, was observed. As such, the present study is the first to track L2 acquisition of sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing. PMID:26720258

  4. Multibeam Sonar Backscatter Data Acquisition and Processing: Guidelines and Recommendations from the GEOHAB Backscatter Working Group

    NASA Astrophysics Data System (ADS)

    Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.

    2015-12-01

    Backscatter data acquired with multibeam sonars are now commonly used for the remote geological interpretation of the seabed. The system hardware, software, and processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, calibration is poor or absent, processing methods are incompletely understood and documented, etc. A workshop organized at the GeoHab (a community of geoscientists and biologists around the topic of marine habitat mapping) annual meeting in 2013 was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the topics of acquisition, processing, and interpretation of data. The GeoHab Backscatter Working Group (BSWG) was subsequently created with the purpose of documenting and synthesizing the state of the art in sensors and techniques available today and proposing methods for best practice in the acquisition and processing of backscatter data. Two years later, the resulting document "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations" was completed [1]. The document provides: an introduction to backscatter measurements by seafloor-mapping sonars; a background on the physical principles of sonar backscatter; a discussion of users' needs from a wide spectrum of community end-users; a review of backscatter measurement; an analysis of best practices in data acquisition; a review of data processing principles with details on present software implementation; and finally a synthesis and key recommendations. This presentation reviews the BSWG mandate, structure, and development of this document. It details the various chapter contents, its recommendations to sonar manufacturers, operators, data-processing software developers, and end-users, and its implications for the marine geology community.
1: Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines

  5. [Software development of multi-element transient signal acquisition and processing with multi-channel ICP-AES].

    PubMed

    Zhang, Y; Zhuang, Z; Wang, X; Zhu, E; Liu, J

    2000-02-01

    Software for multi-element transient signal acquisition and processing with multi-channel ICP-AES was developed in this paper. It has been successfully applied to signal acquisition and processing in many transient introduction techniques on-line hyphenated with multi-channel ICP-AES.

  6. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    PubMed

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-01

    Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  7. How to crack nuts: acquisition process in captive chimpanzees (Pan troglodytes) observing a model.

    PubMed

    Hirata, Satoshi; Morimura, Naruki; Houki, Chiharu

    2009-10-01

    Stone tool use for nut cracking consists of placing a hard-shelled nut onto a stone anvil and then cracking the shell open by pounding it with a stone hammer to get to the kernel. We investigated the acquisition of tool use for nut cracking in a group of captive chimpanzees to clarify what kind of understanding of the tools and actions leads to the acquisition of this type of tool use in the presence of a skilled model. A human experimenter trained a male chimpanzee until he mastered the use of a hammer and anvil stone to crack open macadamia nuts. He was then put in a nut-cracking situation together with his group mates, who were naïve to this tool use; we did not have a control group without a model. The results showed that the process of acquisition could be broken down into several steps, including recognition of applying pressure to the nut, emergence of the use of a combination of three objects, emergence of the hitting action, using a tool for hitting, and hitting the nut. The chimpanzees recognized these different components separately and practiced them one after another. They gradually united these factors in their behavior, leading to their first success. Their behavior did not clearly improve immediately after observing successful nut cracking by a peer, but observation of a skilled group member seemed to have a gradual, long-term influence on the acquisition of nut cracking by naïve chimpanzees.

  8. Summary of the activities of the subgroup on data acquisition and processing

    SciTech Connect

    Connolly, P.L.; Doughty, D.C.; Elias, J.E.

    1981-01-01

    A data acquisition and handling subgroup consisting of approximately 20 members met during the 1981 ISABELLE summer study. Discussions were led by members of the BNL ISABELLE Data Acquisition Group (DAG) with lively participation from outside users. Particularly large contributions were made by representatives of BNL experiments 734, 735, and the MPS, as well as the Fermilab Colliding Detector Facility and the SLAC LASS Facility. In contrast to the 1978 study, the subgroup did not divide its activities into investigations of various individual detectors, but instead attempted to review the current state of the art in the data acquisition, trigger processing, and data handling fields. A series of meetings first reviewed individual pieces of the problem, including the status of the Fastbus Project, the Nevis trigger processor, the SLAC 168/E and 3081/E emulators, and efforts within DAG. Additional meetings dealt with questions involved in specifying and building complete data acquisition systems. For any given problem, a series of possible solutions was proposed by the members of the subgroup. In general, any given solution had both advantages and disadvantages, and there was never any consensus on which approach was best. However, there was agreement that certain problems could only be handled by systems of a given power or greater. What is given here is a review of the various solutions with their associated powers, costs, advantages, and disadvantages.

  9. Instrumental improvements and sample preparations that enable reproducible, reliable acquisition of mass spectra from whole bacterial cells

    PubMed Central

    Alusta, Pierre; Buzatu, Dan; Williams, Anna; Cooper, Willie-Mae; Tarasenko, Olga; Dorey, R Cameron; Hall, Reggie; Parker, W Ryan; Wilkes, Jon G

    2015-01-01

    Rationale: Rapid sub-species characterization of pathogens is required for timely responses in outbreak situations. Pyrolysis mass spectrometry (PyMS) has the potential to be used for this purpose. Methods: However, in order to make PyMS practical for traceback applications, certain improvements related to spectrum reproducibility and data acquisition speed were required. The main objectives of this study were to facilitate fast detection (<30 min to analyze 6 samples, including preparation) and sub-species-level bacterial characterization based on pattern recognition of mass spectral fingerprints acquired from whole cells volatilized and ionized at atmospheric pressure. An AccuTOF DART mass spectrometer was re-engineered to permit ionization of low-volatility bacteria by means of Plasma Jet Ionization (PJI), in which an electric discharge, and, by extension, a plasma beam, impinges on sample cells. Results: Instrumental improvements and spectral acquisition methodology are described. Performance of the re-engineered system was assessed using a small challenge set comprised of assorted bacterial isolates differing in identity by varying amounts. In general, the spectral patterns obtained allowed differentiation of all samples tested, including those of the same genus and species but different serotypes. Conclusions: Fluctuations of ±15% in bacterial cell concentrations did not substantially compromise replicate spectra reproducibility. © 2015 National Center for Toxicological Research. Rapid Communications in Mass Spectrometry published by John Wiley & Sons Ltd. PMID:26443394

  10. Valve For Extracting Samples From A Process Stream

    NASA Technical Reports Server (NTRS)

    Callahan, Dave

    1995-01-01

    Valve for extracting samples from process stream includes cylindrical body bolted to pipe that contains stream. Opening in valve body matched and sealed against opening in pipe. Used to sample process streams in variety of facilities, including cement plants, plants that manufacture and reprocess plastics, oil refineries, and pipelines.

  11. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering the fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  12. Data acquisition, processing and firing aid software for multichannel EMP simulation

    NASA Astrophysics Data System (ADS)

    Eumurian, Gregoire; Arbaud, Bruno

    1986-08-01

    Electromagnetic compatibility testing yields a large quantity of data for systematic analysis. An automated data acquisition system has been developed. It is based on standard EMP instrumentation which allows a pre-established program to be followed whilst orientating the measurements according to the results obtained. The system is controlled by a computer running interactive programs (multitask windows, scrollable menus, mouse, etc.) which handle the measurement channels, files, displays and process data in addition to providing an aid to firing.

  13. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    SciTech Connect

    Schickert, Martin

    2015-03-31

    Ultrasonic testing systems using transducer arrays and the SAFT (Synthetic Aperture Focusing Technique) reconstruction allow for imaging the internal structure of concrete elements. With one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting the detection and localization of objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized, which differ in terms of measuring and computational effort and in the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm, which is implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (FMC/TFM), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained on test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS, which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups, is used for the measurements. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results for the parameters image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of the application characteristics of the SAFT variants.
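
    The core of a single-channel SAFT reconstruction is a delay-and-sum over transducer positions: for each image pixel, the A-scan amplitude at the pulse-echo travel time to that pixel is summed coherently. The sketch below (synthetic data, monostatic geometry, an assumed wave speed) illustrates the principle only; the C-SAFT/FMC variants sum over transmitter-receiver pairs instead.

```python
import numpy as np

def saft_reconstruct(ascans, x_tx, t, c, xs, zs):
    """Single-channel (monostatic) SAFT delay-and-sum.
    ascans: (n_positions, n_samples) A-scans; x_tx: transducer x positions (m);
    t: sample times (s); c: wave speed (m/s); xs, zs: image grid coordinates (m)."""
    dt = t[1] - t[0]
    image = np.zeros((len(zs), len(xs)))
    for k, xk in enumerate(x_tx):
        for j, z in enumerate(zs):
            for i, x in enumerate(xs):
                r = np.hypot(x - xk, z)              # transducer-to-pixel distance
                idx = int(round(2.0 * r / c / dt))   # round-trip delay in samples
                if idx < ascans.shape[1]:
                    image[j, i] += ascans[k, idx]    # coherent summation
    return image

# Synthetic point scatterer at x = 0.05 m, z = 0.10 m; c is an assumed wave speed
c = 2500.0
t = np.arange(0.0, 200e-6, 0.5e-6)
x_tx = np.linspace(0.0, 0.10, 11)
ascans = np.zeros((len(x_tx), len(t)))
for k, xk in enumerate(x_tx):
    r = np.hypot(0.05 - xk, 0.10)
    ascans[k, int(round(2.0 * r / c / (t[1] - t[0])))] = 1.0   # ideal echo

xs = np.linspace(0.0, 0.10, 21)
zs = np.linspace(0.05, 0.15, 21)
img = saft_reconstruct(ascans, x_tx, t, c, xs, zs)
j, i = np.unravel_index(img.argmax(), img.shape)
print(xs[i], zs[j])  # the focus forms at the scatterer position
```

    With ideal point-scatterer data the summed image peaks at the scatterer position; real concrete data additionally require windowing, envelope detection, and attenuation handling.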

  14. Revised sampling campaigns to provide sludge for treatment process testing

    SciTech Connect

    PETERSEN, C.A.

    1999-02-18

    The purpose of this document is to review the impact on the sludge sampling campaigns planned for FY 1999 of the recent decision to delete any further sludge sampling in the K West Basin. Requirements for sludge sample material for sludge treatment process testing are reviewed. Options are discussed for obtaining the volume of sample material required, and an optimized plan for obtaining this sludge is summarized.

  15. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol, and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also serve to optimize operational time and the use of consumables.
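
    The trigger logic described above amounts to a small state machine driven by the outlet conductivity reading. A minimal sketch, with hypothetical thresholds and step names (the actual protocol and units are defined by the apparatus, not by this code):

```python
# Hypothetical thresholds; a real system would calibrate these per apparatus.
NEUTRAL_MAX = 10.0    # µS/cm: system flushed back to de-ionized-water baseline
CHARGED_MIN = 500.0   # µS/cm: strong acid or base present at the outlet

def next_step(state, outlet_uS):
    """Advance the desalting sequence when the outlet conductivity meets
    the condition for the current step; otherwise stay and keep flushing."""
    if state == "flushing" and outlet_uS < NEUTRAL_MAX:
        return "acidify"        # neutralized -> introduce acid
    if state == "acidify" and outlet_uS > CHARGED_MIN:
        return "flush_acid"     # acid has reached the outlet -> wash it out
    if state == "flush_acid" and outlet_uS < NEUTRAL_MAX:
        return "basify"         # neutral again -> introduce base
    if state == "basify" and outlet_uS > CHARGED_MIN:
        return "done"           # basic condition reached -> sequence complete
    return state

state = "flushing"
for reading in [800.0, 120.0, 6.0, 550.0, 700.0, 9.0, 650.0]:
    state = next_step(state, reading)
print(state)  # "done"
```

    Each transition fires only when the conductivity condition for the current step is met, which is what lets the measurement both sequence the protocol and bound the water and reagents consumed.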

  16. The Earthscope USArray Array Network Facility (ANF): Evolution of Data Acquisition, Processing, and Storage Systems

    NASA Astrophysics Data System (ADS)

    Davis, G. A.; Battistuz, B.; Foley, S.; Vernon, F. L.; Eakins, J. A.

    2009-12-01

    Since April 2004 the Earthscope USArray Transportable Array (TA) network has grown to over 400 broadband seismic stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. In total, over 1.7 terabytes per year of 24-bit, 40 samples-per-second seismic and state of health data is recorded from the stations. The ANF provides analysts access to real-time and archived data, as well as state-of-health data, metadata, and interactive tools for station engineers and the public via a website. Additional processing and recovery of missing data from on-site recorders (balers) at the stations is performed before the final data is transmitted to the IRIS Data Management Center (DMC). Assembly of the final data set requires additional storage and processing capabilities to combine the real-time data with baler data. The infrastructure supporting these diverse computational and storage needs currently consists of twelve virtualized Sun Solaris Zones executing on nine physical server systems. The servers are protected against failure by redundant power, storage, and networking connections. Storage needs are provided by a hybrid iSCSI and Fiber Channel Storage Area Network (SAN) with access to over 40 terabytes of RAID 5 and 6 storage. Processing tasks are assigned to systems based on parallelization and floating-point calculation needs. On-site buffering at the data-loggers provide protection in case of short-term network or hardware problems, while backup acquisition systems at the San Diego Supercomputer Center and the DMC protect against catastrophic failure of the primary site. Configuration management and monitoring of these systems is accomplished with open-source (Cfengine, Nagios, Solaris Community Software) and commercial tools (Intermapper). In the evolution from a single server to multiple virtualized server instances, Sun Cluster software was evaluated and found to be unstable in our environment. Shared filesystem

  17. Process to process communication over Fastbus in the data acquisition system of the ALEPH TPC

    SciTech Connect

    Lusiani, A. . Division PPE Scuola Normale Superiore, Pisa )

    1994-02-01

    The data acquisition system of the ALEPH TPC includes a VAX/VMS computer cluster and 36 intelligent Fastbus modules (ALEPH TPPs) running the OS9 multitasking real-time operating system. Dedicated software has been written in order to reliably exchange information over Fastbus between the VAX/VMS cluster and the 36 TPPs, to initialize and co-ordinate the microprocessors, and to monitor and debug their operation. The functionality and the performance of this software are presented together with an overview of the applications that rely on it.

  18. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.
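
    The fault trees mentioned above combine basic events through AND/OR gates, so the occurrence possibility of a top event can be estimated from basic-event probabilities. A minimal sketch, with an invented landfill example and the simplifying assumption of independent events (LOMA's actual inference is not reproduced here):

```python
def or_gate(*ps):
    """P(at least one of the independent events occurs)."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(*ps):
    """P(all of the independent events occur)."""
    r = 1.0
    for p in ps:
        r *= p
    return r

# Invented basic-event probabilities for a "leachate surface seepage" top event
p_liner_damage   = 0.05
p_pump_failure   = 0.10
p_heavy_rainfall = 0.30

# Seepage if the drainage system fails (pump OR liner) AND heavy rainfall occurs
p_drainage_fail = or_gate(p_pump_failure, p_liner_damage)
p_seepage = and_gate(p_drainage_fail, p_heavy_rainfall)
print(round(p_seepage, 4))  # 0.0435
```

    Walking such a tree also answers the knowledge engineer's questions directly: which events feed which failures, and whether one event appears under more than one top event.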

  19. APNEA list mode data acquisition and real-time event processing

    SciTech Connect

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) a list mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) event accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
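
    The post-processing of list mode data into event accumulations can be sketched as a histogram of event timestamps relative to the most recent trigger. The times and bin edges below are invented; the point is that any binning can be re-derived offline from the timestamped list.

```python
import bisect

def accumulate(events_us, triggers_us, edges_us):
    """Count events into time bins measured from the most recent trigger
    (all times in microseconds; inputs must be sorted ascending)."""
    counts = [0] * (len(edges_us) - 1)
    for t in events_us:
        i = bisect.bisect_right(triggers_us, t) - 1
        if i < 0:
            continue                         # event precedes the first trigger
        dt = t - triggers_us[i]
        b = bisect.bisect_right(edges_us, dt) - 1
        if 0 <= b < len(counts):
            counts[b] += 1                   # falls inside one of the bins
    return counts

triggers = [0.0, 1000.0]                     # repetitive interrogation triggers
events   = [3.0, 12.0, 40.0, 1005.0, 1900.0] # timestamped detector events
edges    = [0.0, 10.0, 50.0, 100.0]          # short bins, tens of microseconds
print(accumulate(events, triggers, edges))   # [2, 2, 0]
```

    Re-binning the same list with different edges is how the optimum time bins for TRU assay can be determined after the fact.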

  20. Phonological processing in deaf signers and the impact of age of first language acquisition.

    PubMed

    MacSweeney, Mairéad; Waters, Dafydd; Brammer, Michael J; Woll, Bencie; Goswami, Usha

    2008-04-15

    Just as words can rhyme, the signs of a signed language can share structural properties, such as location. Linguistic description at this level is termed phonology. We report that a left-lateralised fronto-parietal network is engaged during phonological similarity judgements made in both English (rhyme) and British Sign Language (BSL; location). Since these languages operate in different modalities, these data suggest that the neural network supporting phonological processing is, to some extent, supramodal. Activation within this network was however modulated by language (BSL/English), hearing status (deaf/hearing), and age of BSL acquisition (native/non-native). The influence of language and hearing status suggests an important role for the posterior portion of the left inferior frontal gyrus in speech-based phonological processing in deaf people. This, we suggest, is due to increased reliance on the articulatory component of speech when the auditory component is absent. With regard to age of first language acquisition, non-native signers activated the left inferior frontal gyrus more than native signers during the BSL task, and also during the task performed in English, which both groups acquired late. This is the first neuroimaging demonstration that age of first language acquisition has implications not only for the neural systems supporting the first language, but also for networks supporting languages learned subsequently.

  1. Automated system for acquisition and image processing for the control and monitoring of boned nopal

    NASA Astrophysics Data System (ADS)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of an image acquisition and processing system that controls the removal of thorns from nopal (Opuntia ficus indica) cladodes in an automated machine using pulses from an Nd:YAG laser. The areolas, the areas on the nopal's skin where thorns grow, are located by applying segmentation algorithms to images obtained with a CCD. Once the positions of the areolas are known, their coordinates are sent to a motor system that steers the laser to each areola and removes the thorns. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the tasks of acquisition, preprocessing, segmentation, recognition and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that steers the laser for thorn removal.
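    The segmentation-to-coordinate-table pipeline described above can be illustrated with a minimal sketch: threshold a grayscale image, label connected bright regions, and report each region's centroid. This is a generic illustration under invented data, not the authors' firmware.

```python
# Illustrative sketch (not the paper's DSP firmware): threshold segmentation
# of a grayscale image followed by centroid extraction, producing a table of
# (x, y) coordinates like the one sent to the galvo motor system.

def segment_centroids(image, threshold):
    """Label 4-connected bright regions and return their (x, y) centroids."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:  # flood fill one region
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy+1,cx),(cy-1,cx),(cy,cx+1),(cy,cx-1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and image[ny][nx] >= threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                ys = sum(p[0] for p in pixels) / len(pixels)
                xs = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((xs, ys))
    return centroids

img = [[0, 0, 0, 0, 0],
       [0, 9, 9, 0, 0],
       [0, 9, 9, 0, 8],
       [0, 0, 0, 0, 8]]
print(segment_centroids(img, 5))  # -> [(1.5, 1.5), (4.0, 2.5)]
```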

  2. Autonomous Closed-Loop Tasking, Acquisition, Processing, and Evaluation for Situational Awareness Feedback

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Dan; Cappelaere, Pat

    2016-01-01

    This presentation describes the closed-loop satellite autonomy methods used to connect users with the assets on Earth Orbiter-1 (EO-1) and similar satellites. The base layer is a distributed architecture based on the Goddard Mission Services Evolution Concept (GMSEC), so each asset remains under independent control. Situational awareness is provided by a middleware layer through a common Application Programmer Interface (API) to GMSEC components developed at GSFC. Users set up their own tasking requests and receive views into immediate past acquisitions in their area of interest, and into future feasibilities for acquisition, across all assets. Automated notifications via pub/sub feeds are returned to users containing published links to image footprints, algorithm results, and full data sets. Theme-based algorithms are available on demand for processing.

  3. On the Contrastive Analysis of Features in Second Language Acquisition: Uninterpretable Gender on Past Participles in English-French Processing

    ERIC Educational Resources Information Center

    Dekydtspotter, Laurent; Renaud, Claire

    2009-01-01

    Lardiere's discussion raises important questions about the use of features in second language (L2) acquisition. This response examines predictions for processing of a feature-valuing model vs. a frequency-sensitive, associative model in explaining the acquisition of French past participle agreement. Results from a reading-time experiment support…

  4. Sample Handling and Processing on Mars for Future Astrobiology Missions

    NASA Technical Reports Server (NTRS)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low-concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments is time-, resource- and manpower-consuming in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, the Automated Sample Processing System (ASPS), receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species and delivers sample to multiple instruments for analysis (including for non-organic soluble species).

  5. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  6. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    PubMed

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR.
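    As a concrete illustration of the sampling strategies surveyed above, the sketch below draws a nonuniform sampling schedule for an indirect dimension, biasing the density toward early evolution times where the decaying FID carries the most signal. It is a toy under stated assumptions (grid size, decay constant, exponential weighting), not any published schedule generator.

```python
# Toy NUS schedule generator: choose n_keep of n_total indirect-dimension
# increments at random, with probability weighted by an assumed exponential
# signal decay so early increments are sampled more densely.

import math
import random

def nus_schedule(n_total, n_keep, decay=2.0, seed=0):
    """Pick n_keep distinct increments from range(n_total), biased early."""
    rng = random.Random(seed)
    weights = [math.exp(-decay * i / n_total) for i in range(n_total)]
    total = sum(weights)
    chosen = set()
    while len(chosen) < n_keep:
        r = rng.random() * total
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                chosen.add(i)  # duplicates are simply ignored by the set
                break
    return sorted(chosen)

sched = nus_schedule(64, 16)
print(sched)  # 16 of 64 increments, denser at small indices
```

    Reconstructing a spectrum from such a schedule then requires one of the non-Fourier estimators the review goes on to discuss, since the DFT assumes uniform spacing.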

  7. Double Shell Tank (DST) Process Waste Sampling Subsystem Specification

    SciTech Connect

    RASMUSSEN, J.H.

    2000-05-03

    This specification establishes the performance requirements and provides references to the requisite codes and standards to be applied to the Double-Shell Tank (DST) Process Waste Sampling Subsystem which supports the first phase of Waste Feed Delivery.

  8. Users' perceptions of the impact of electronic aids to daily living throughout the acquisition process.

    PubMed

    Ripat, Jacquie; Strock, Anne

    2004-01-01

    This study investigated the experience of seven new users of a particular type of assistive technology through the stages of anticipating, acquiring, and using an electronic aid to daily living. A mixed methods research approach was used to explore each of these stages. The Psychosocial Impact of Assistive Devices Scale was used to measure the perceived impact of the new assistive technology on users' quality of life, and findings were further explored and developed through open-ended questioning of the participants. Results indicated that, prior to acquisition of the device, users predicted that the electronic aid to daily living would have a positive impact on their feelings of competence and confidence and that the device would enable them in a positive way. One month after acquiring the device, a reduced, yet still positive, impact was observed. By 3 and 6 months after acquisition, perceived impact had returned to the same high positive level as before acquisition. It is suggested that, prior to receiving the device, potential users have positive expectations for the device that are not based in experience. Early in the acquisition period, users adjust their expectations of the role of the assistive technology in their lives and strive to balance expectations with reality. Three to 6 months after acquiring an electronic aid to daily living, participants hold a highly positive view, grounded in experience and reality, of how the device impacts their lives. A model illustrating the electronic aids to daily living acquisition process is proposed, and suggestions for future study are provided.

  9. Differential sampling for fast frequency acquisition via adaptive extended least squares algorithm

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra

    1987-01-01

    This paper presents a differential signal model along with appropriate sampling techniques for least squares estimation of the frequency and frequency derivatives, and possibly the phase and amplitude, of a sinusoid received in the presence of noise. The proposed algorithm is recursive in measurements, and thus the computational requirement increases only linearly with the number of measurements. The dimension of the state vector in the proposed algorithm does not depend upon the number of measurements and is quite small, typically around four. This is an advantage when compared to previous algorithms, wherein the dimension of the state vector increases monotonically with the product of the frequency uncertainty and the observation period. Such a computational simplification may possibly result in some loss of optimality. However, by applying the sampling techniques of the paper, such a possible loss in optimality can be made small.
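    The flavor of the approach can be conveyed with a hedged sketch (not the paper's exact algorithm): form differential phase measurements z_k = angle(s_k · conj(s_{k-1})), which are approximately linear in frequency and frequency rate, and fit them with a recursive least squares update whose state dimension stays fixed (here two) no matter how many measurements arrive.

```python
# Hedged sketch of differential-phase frequency estimation via recursive
# least squares. State x = [omega, omega_dot]; each differential phase
# z_k ~ omega*dt + omega_dot*t_k*dt. One O(1) update per measurement.

import cmath
import math

def rls_freq(samples, dt):
    x = [0.0, 0.0]                      # [omega, omega_dot] estimate
    P = [[1e9, 0.0], [0.0, 1e9]]        # large initial covariance
    for k in range(1, len(samples)):
        z = cmath.phase(samples[k] * samples[k - 1].conjugate())
        t = k * dt
        h = [dt, t * dt]                # measurement row vector
        Ph = [P[0][0]*h[0] + P[0][1]*h[1], P[1][0]*h[0] + P[1][1]*h[1]]
        denom = 1.0 + h[0]*Ph[0] + h[1]*Ph[1]
        K = [Ph[0]/denom, Ph[1]/denom]  # gain
        innov = z - (h[0]*x[0] + h[1]*x[1])
        x = [x[0] + K[0]*innov, x[1] + K[1]*innov]
        P = [[P[0][0] - K[0]*Ph[0], P[0][1] - K[0]*Ph[1]],
             [P[1][0] - K[1]*Ph[0], P[1][1] - K[1]*Ph[1]]]
    return x

dt = 1e-3
true_omega = 2 * math.pi * 40.0         # 40 Hz test tone, noiseless
s = [cmath.exp(1j * true_omega * k * dt) for k in range(200)]
omega_hat, _ = rls_freq(s, dt)
print(omega_hat / (2 * math.pi))        # approximately 40 Hz
```

    Note the state stays two-dimensional regardless of the record length, which mirrors the fixed, small state dimension the paper emphasizes.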

  10. How to crack nuts: acquisition process in captive chimpanzees (Pan troglodytes) observing a model.

    PubMed

    Hirata, Satoshi; Morimura, Naruki; Houki, Chiharu

    2009-10-01

    Stone tool use for nut cracking consists of placing a hard-shelled nut onto a stone anvil and then cracking the shell open by pounding it with a stone hammer to get to the kernel. We investigated the acquisition of tool use for nut cracking in a group of captive chimpanzees to clarify what kind of understanding of the tools and actions will lead to the acquisition of this type of tool use in the presence of a skilled model. A human experimenter trained a male chimpanzee until he mastered the use of a hammer and anvil stone to crack open macadamia nuts. He was then put in a nut-cracking situation together with his group mates, who were naïve to this tool use; we did not have a control group without a model. The results showed that the process of acquisition could be broken down into several steps, including recognition of applying pressure to the nut, emergence of the use of a combination of three objects, emergence of the hitting action, using a tool for hitting, and hitting the nut. The chimpanzees recognized these different components separately and practiced them one after another. They gradually united these factors in their behavior, leading to their first success. Their behavior did not clearly improve immediately after observing successful nut cracking by a peer, but observation of a skilled group member seemed to have a gradual, long-term influence on the acquisition of nut cracking by naïve chimpanzees. PMID:19727866

  11. Processing strategies and software solutions for data-independent acquisition in mass spectrometry.

    PubMed

    Bilbao, Aivett; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Hopfgartner, Gérard; Müller, Markus; Lisacek, Frédérique

    2015-03-01

    Data-independent acquisition (DIA) offers several advantages over data-dependent acquisition (DDA) schemes for characterizing complex protein digests analyzed by LC-MS/MS. In contrast to the sequential detection, selection, and analysis of individual ions during DDA, DIA systematically parallelizes the fragmentation of all detectable ions within a wide m/z range regardless of intensity, thereby providing broader dynamic range of detected signals, improved reproducibility for identification, better sensitivity, and accuracy for quantification, and, potentially, enhanced proteome coverage. To fully exploit these advantages, composite or multiplexed fragment ion spectra generated by DIA require more elaborate processing algorithms compared to DDA. This review examines different DIA schemes and, in particular, discusses the concepts applied to and related to data processing. Available software implementations for identification and quantification are presented as comprehensively as possible and examples of software usage are cited. Processing workflows, including complete proprietary frameworks or combinations of modules from different open source data processing packages are described and compared in terms of software availability and usability, programming language, operating system support, input/output data formats, as well as the main principles employed in the algorithms used for identification and quantification. This comparative study concludes with further discussion of current limitations and expectable improvements in the short- and midterm future.

  13. Memory acquisition and retrieval impact different epigenetic processes that regulate gene expression

    PubMed Central

    2015-01-01

    Background A fundamental question in neuroscience is how memories are stored and retrieved in the brain. Long-term memory formation requires transcription, translation and epigenetic processes that control gene expression. Thus, characterizing genome-wide the transcriptional changes that occur after memory acquisition and retrieval is of broad interest and importance. Genome-wide technologies are commonly used to interrogate transcriptional changes in discovery-based approaches. Their ability to increase scientific insight beyond traditional candidate gene approaches, however, is usually hindered by batch effects and other sources of unwanted variation, which are particularly hard to control in the study of brain and behavior. Results We examined genome-wide gene expression after contextual conditioning in the mouse hippocampus, a brain region essential for learning and memory, at all the time-points in which inhibiting transcription has been shown to impair memory formation. We show that most of the variance in gene expression is not due to conditioning and that by removing unwanted variance through additional normalization we are able to provide novel biological insights. In particular, we show that genes downregulated by memory acquisition and retrieval impact different functions: chromatin assembly and RNA processing, respectively. Levels of histone 2A variant H2AB are reduced only following acquisition, a finding we confirmed using quantitative proteomics. On the other hand, splicing factor Rbfox1 and NMDA receptor-dependent microRNA miR-219 are only downregulated after retrieval, accompanied by an increase in protein levels of miR-219 target CAMKIIγ. Conclusions We provide a thorough characterization of coding and non-coding gene expression during long-term memory formation. We demonstrate that unwanted variance dominates the signal in transcriptional studies of learning and memory and introduce the removal of unwanted variance through normalization as a

  14. The birds and the beans: a low-fidelity simulator for chorionic villus sampling skill acquisition.

    PubMed

    Wax, Joseph R; Cartin, Angelina; Pinette, Michael G

    2012-08-01

    Because no simulation models are described for chorionic villus sampling (CVS), we sought to design and construct a CVS training simulator. Using materials available from our labor floor and local supermarket, we built and demonstrated a practical model for learning transabdominal and transcervical CVS. The simulator can be used to teach single- or dual-operator transabdominal CVS and traditional transcervical CVS. Aspirated "villi" immediately inform the teacher and learner of successful procedures. No image degradation or sonographically visible tracks resulted from use, permitting more than one trainee to benefit from a model. This model for transabdominal and transcervical CVS provides realistic imaging, tactile sensations, and immediate feedback.

  15. Acquisition of a High Resolution Field Emission Scanning Electron Microscope for the Analysis of Returned Samples

    NASA Technical Reports Server (NTRS)

    Nittler, Larry R.

    2003-01-01

    This grant furnished funds to purchase a state-of-the-art scanning electron microscope (SEM) to support our analytical facilities for extraterrestrial samples. After evaluating several instruments, we purchased a JEOL 6500F thermal field emission SEM with the following analytical accessories: EDAX energy-dispersive x-ray analysis system with fully automated control of instrument and sample stage; EDAX LEXS wavelength-dispersive x-ray spectrometer for high sensitivity light-element analysis; EDAX/TSL electron backscatter diffraction (EBSD) system with software for phase identification and crystal orientation mapping; Robinson backscatter electron detector; and an in situ micro-manipulator (Kleindiek). The total price was $550,000 (with $150,000 of the purchase supported by Carnegie institution matching funds). The microscope was delivered in October 2002, and most of the analytical accessories were installed by January 2003. With the exception of the wavelength spectrometer (which has been undergoing design changes) everything is working well and the SEM is in routine use in our laboratory.

  16. Problem solving in nursing practice: application, process, skill acquisition and measurement.

    PubMed

    Roberts, J D; While, A E; Fitzpatrick, J M

    1993-06-01

    This paper analyses the role of problem solving in nursing practice including the process, acquisition and measurement of problem-solving skills. It is argued that while problem-solving ability is acknowledged as critical if today's nurse practitioner is to maintain effective clinical practice, to date it retains a marginal place in nurse education curricula. Further, it has attracted limited empirical study. Such an omission, it is argued, requires urgent redress if the nursing profession is to meet effectively the challenges of the next decade and beyond.

  17. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    Engh, G.J. van den; Stokdijk, W.

    1992-09-22

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate. 17 figs.

  18. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    van den Engh, Gerrit J.; Stokdijk, Willem

    1992-01-01

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate.
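    The scheme in the two patent records above can be sketched conceptually: each channel digitizes into its own FIFO, the trigger circuit tags every entry with an event ID, and the bus controller pops the oldest entry from every FIFO, checking that the IDs agree before assembling the event. The class below is an illustrative software analogue, not the patented hardware.

```python
# Conceptual software analogue of digitally synchronized parallel pulse
# processing: per-channel FIFOs, trigger-generated event IDs, and an
# error-detection check when the bus controller assembles an event.

from collections import deque

class Cytometer:
    def __init__(self, n_channels):
        self.fifos = [deque() for _ in range(n_channels)]
        self.event_id = 0

    def trigger(self, pulse_heights):
        """Trigger circuit: digitize one pulse per channel, tag with an ID."""
        self.event_id += 1
        for fifo, height in zip(self.fifos, pulse_heights):
            fifo.append((self.event_id, height))

    def read_event(self):
        """Bus controller: pop the oldest entry per FIFO; error-check IDs."""
        entries = [fifo.popleft() for fifo in self.fifos]
        ids = {eid for eid, _ in entries}
        if len(ids) != 1:
            raise RuntimeError(f"channel desynchronization detected: {ids}")
        return [height for _, height in entries]

cyto = Cytometer(4)
cyto.trigger([10, 22, 5, 17])
cyto.trigger([11, 20, 6, 15])
print(cyto.read_event())  # -> [10, 22, 5, 17]
```

    Buffering per channel is what lets digitization run at high speed while readout proceeds asynchronously; the ID check catches the desynchronization errors the patent's error detection circuit guards against.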

  19. Acquisition and Processing of Multi-Fold GPR Data for Characterization of Shallow Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Bradford, J. H.

    2004-05-01

    Most ground-penetrating radar (GPR) data are acquired with a constant transmitter-receiver offset, and investigators often apply little or no processing in generating a subsurface image. This mode of operation can provide useful information, but does not take full advantage of the information the GPR signal can carry. In continuous multi-offset (CMO) mode, one acquires several traces with varying source-receiver separations at each point along the survey. CMO acquisition is analogous to common-midpoint acquisition in exploration seismology and gives rise to improved subsurface characterization through three key features: 1) Processes such as stacking and velocity filtering significantly attenuate coherent and random noise, resulting in subsurface images that are easier to interpret; 2) CMO data enable measurement of vertical and lateral velocity variations, which leads to improved understanding of material distribution and more accurate depth estimates; and 3) CMO data enable observation of reflected wave behaviour (i.e., variations in amplitude and spectrum) at a common reflection point for various travel paths through the subsurface; quantification of these variations can be a valuable tool in material property characterization. Although there are a few examples in the literature, investigators rarely acquire CMO GPR data. This is, in large part, due to the fact that CMO acquisition with a single channel system is labor intensive and time consuming. At present, no multi-channel GPR systems designed for CMO acquisition are commercially available. Over the past 8 years I have designed, conducted, and processed numerous 2D and 3D CMO GPR surveys using a single channel GPR system. I have developed field procedures that enable a three-man crew to acquire CMO GPR data at a rate comparable to a similar scale multi-channel seismic reflection survey. Additionally, many recent advances in signal processing developed in the oil and gas industry have yet to see significant
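    The velocity analysis and stacking mentioned above rest on the hyperbolic moveout relation t(x)² = t₀² + x²/v²: correcting each offset trace back to its zero-offset time flattens the reflection so traces can be stacked. The sketch below illustrates only this relation, with an assumed radar velocity, and is not taken from the author's processing flow.

```python
# Normal-moveout (NMO) sketch for a common-midpoint radar gather:
# t(x)^2 = t0^2 + (x/v)^2. When the trial velocity v is correct, every
# offset's picked arrival maps back to the same zero-offset time t0.

import math

def nmo_time(t0, offset, v):
    """Two-way reflection traveltime at a given source-receiver offset."""
    return math.sqrt(t0**2 + (offset / v) ** 2)

def nmo_correct(t_picked, offset, v):
    """Map a picked arrival at this offset back to zero-offset time t0."""
    return math.sqrt(max(t_picked**2 - (offset / v) ** 2, 0.0))

v = 0.1    # m/ns, an assumed wet-soil radar velocity
t0 = 50.0  # ns, zero-offset two-way time
offsets = [0.0, 1.0, 2.0, 3.0]  # m
picks = [nmo_time(t0, x, v) for x in offsets]
print([round(nmo_correct(t, x, v), 3) for t, x in zip(picks, offsets)])
# all offsets map back to t0 = 50 ns when v is correct
```

    Scanning trial velocities for the one that best flattens (and hence best stacks) the event is how CMO data yield the velocity information single-offset profiles cannot.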

  20. Sampling state and process variables on coral reefs.

    PubMed

    Green, Roger H; McArdle, Brian A; van Woesik, Robert

    2011-07-01

    Contemporary coral reefs are forced to survive through and recover from disturbances at a variety of spatial and temporal scales. Understanding disturbances in the context of ecological processes may lead to accurate predictive models of population trajectories. Most coral-reef studies and monitoring programs examine state variables, which include the percentage coverage of major benthic organisms, but few studies examine the key ecological processes that drive the state variables. Here we outline a sampling strategy that captures both state and process variables, at a spatial scale of tens of kilometers. Specifically, we are interested in (1) examining spatial and temporal patterns in coral population size-frequency distributions, (2) determining major population processes, including rates of recruitment and mortality, and (3) examining relationships between processes and state variables. Our effective sampling units are randomly selected 75 × 25 m stations, spaced approximately 250-500 m apart, representing a 10³ m spatial scale. Stations are nested within sites, spaced approximately 2 km apart, representing a 10⁴ m spatial scale. Three randomly selected 16 m² quadrats placed in each station and marked for relocation are used to assess processes across time, while random belt-transects, re-randomized at each sampling event, are used to sample state variables. Both quadrats and belt-transects are effectively sub-samples from which we will derive estimates of means for each station at each sampling event. This nested sampling strategy allows us to determine critical stages in populations, examine population performance, and compare processes through disturbance events and across regions.

  1. Liquid crystal materials and structures for image processing and 3D shape acquisition

    NASA Astrophysics Data System (ADS)

    Garbat, K.; Garbat, P.; Jaroszewicz, L.

    2012-03-01

    Image processing supported by liquid crystal devices has been used in numerous imaging applications, including polarization imaging, digital holography and programmable imaging. Liquid crystals have been extensively studied and are massively used in display and optical processing technology. We present here the main relevant parameters of liquid crystals for image processing and 3D shape acquisition, and we compare the main liquid crystal options which can be used, with their respective advantages. We compare the performance of several types of liquid crystal materials: nematic mixtures with high and medium optical and dielectric anisotropies and relatively low rotational viscosities, which may operate in TN mode in mono- and dual-frequency addressing systems.

  2. Preliminary study of the EChO data sampling and processing

    NASA Astrophysics Data System (ADS)

    Farina, M.; Di Giorgio, A. M.; Focardi, M.; Pace, E.; Micela, G.; Galli, E.; Giusi, G.; Liu, S. J.; Pezzuto, S.

    2014-08-01

    The EChO Payload is an integrated spectrometer with six different channels covering the spectral range from the visible up to the thermal infrared. A common Instrument Control Unit (ICU) implements all the instrument control and health monitoring functionalities as well as all the onboard science data processing. To implement an efficient design of the ICU onboard software, separate analyses of the unit requirements are needed for commanding and housekeeping collection as well as for data acquisition, sampling and compression. In this work we present the results of the analysis carried out to optimize the EChO data acquisition and processing chain. The HgCdTe detectors used for the EChO mission allow for non-destructive readout modes, in which the accumulated charge may be read without being cleared. These modes can reduce the equivalent readout noise, and the gain in signal-to-noise ratio can be computed using well-known relations based on fundamental principles. In particular, we considered a multiaccumulation approach based on non-destructive reading of detector samples taken at equal time intervals. All detectors are periodically reset after a certain number of samples have been acquired, and the length of the reset interval, as well as the number of samples and the sampling rate, can be adapted to the brightness of the considered source. The estimation of the best set of parameters for signal-to-noise ratio optimization, and of the best sampling technique, has been done by also taking into account the need to mitigate the expected radiation effects on the acquired data. Cosmic rays can indeed be one of the major sources of data loss for a space observatory, and the studies made for the JWST mission allowed us to evaluate the actual need to implement a dedicated deglitching procedure on board EChO.
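    The multiaccumulation idea above can be illustrated with a minimal sketch: because the detector is read non-destructively at equal intervals, the charge ramp between resets can be fitted with a least-squares slope, which averages down readout noise relative to a single destructive read. The numbers below are invented for the example; this is not the EChO flight algorithm.

```python
# Up-the-ramp illustration: fit a least-squares slope to equally spaced
# non-destructive reads of the accumulating charge to estimate signal rate.

def fit_slope(times, samples):
    """Ordinary least-squares slope of charge vs. time (the signal rate)."""
    n = len(times)
    mt = sum(times) / n
    ms = sum(samples) / n
    num = sum((t - mt) * (s - ms) for t, s in zip(times, samples))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

rate = 120.0  # e-/s, assumed source flux
dt = 0.5      # s between non-destructive reads
times = [k * dt for k in range(8)]             # 8 reads before reset
reads = [rate * t for t in times]              # noiseless charge ramp
print(fit_slope(times, reads))                 # recovers 120.0 e-/s
```

    Adapting the number of reads and the reset interval to source brightness, as the abstract describes, then amounts to trading ramp length (noise averaging, cosmic-ray exposure) against saturation.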

  3. An Integrated Data Acquisition / User Request/ Processing / Delivery System for Airborne Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Chapman, B.; Chu, A.; Tung, W.

    2003-12-01

    Airborne science data has historically played an important role in the development of the scientific underpinnings for spaceborne missions. When the science community determines the need for new types of spaceborne measurements, airborne campaigns are often crucial in risk mitigation for these future missions. However, full exploitation of the acquired data may be difficult due to its experimental and transitory nature. Externally to the project, most problematic (in particular, for those not involved in requesting the data acquisitions) may be the difficulty in searching for, requesting, and receiving the data, or even knowing the data exist. This can result in a rather small, insular community of users for these data sets. Internally, the difficulty for the project is in maintaining a robust processing and archival system during periods of changing mission priorities and evolving technologies. The NASA/JPL Airborne Synthetic Aperture Radar (AIRSAR) has acquired data for a large and varied community of scientists and engineers for 15 years. AIRSAR is presently supporting current NASA Earth Science Enterprise experiments, such as the Soil Moisture EXperiment (SMEX) and the Cold Land Processes experiment (CLPX), as well as experiments conducted as many as 10 years ago. During that time, its processing, data ordering, and data delivery system has undergone evolutionary change as the cost and capability of resources have improved. AIRSAR now has a fully integrated data acquisition/user request/processing/delivery system through which most components of the data fulfillment process communicate via shared information within a database. The integration of these functions has reduced errors and increased throughput of processed data to customers.

  4. A multiple process solution to the logical problem of language acquisition*

    PubMed Central

    MACWHINNEY, BRIAN

    2006-01-01

    Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750

  5. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.
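
    As a rough illustration of the band-summing step such a system performs after narrow-band spectral analysis, the sketch below builds base-2 1/3-octave bands centred on 1 kHz and sums a narrow-band spectrum into band levels. The band definition is the textbook convention and the numbers are illustrative; this is not the LeRC implementation.

```python
import numpy as np

def third_octave_bands(fmin=100.0, fmax=100e3):
    """Base-2 1/3-octave band centre and edge frequencies, centred on
    1 kHz (the usual acoustics convention). Returns (fc, lo, hi)."""
    n = np.arange(-30, 31)                       # band indices around 1 kHz
    fc = 1000.0 * 2.0 ** (n / 3.0)
    fc = fc[(fc >= fmin) & (fc <= fmax)]
    return fc, fc * 2 ** (-1 / 6), fc * 2 ** (1 / 6)

def narrowband_to_third_octave(freqs, psd, df):
    """Sum narrow-band power (PSD * df) into 1/3-octave band levels, dB."""
    fc, lo, hi = third_octave_bands(freqs.min(), freqs.max())
    levels = []
    for a, b in zip(lo, hi):
        m = (freqs >= a) & (freqs < b)
        levels.append(10 * np.log10(np.sum(psd[m] * df) + 1e-300))
    return fc, np.array(levels)

# white unit PSD: band levels rise ~1 dB per band as bandwidth grows
freqs = np.arange(100.0, 100e3, 1.0)
fc, lev = narrowband_to_third_octave(freqs, np.ones_like(freqs), 1.0)
print(f"{len(fc)} bands from {fc[0]:.0f} Hz to {fc[-1]:.0f} Hz")
```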

  6. A multiple process solution to the logical problem of language acquisition.

    PubMed

    MacWhinney, Brian

    2004-11-01

    Many researchers believe that there is a logical problem at the centre of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load.

  7. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    SciTech Connect

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.

    2011-12-05

    Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of the opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok to be tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. This task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison to the Isolok sampling data. The Isolok sampler is an air powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1. The image on the left shows the Isolok's spool extended into the process line and the image on the right shows the sampler retracted and then dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and low location within the mixing tank. 
Data from the two locations

  8. Apparatus and process for collection of gas and vapor samples

    DOEpatents

    Jackson, Dennis G.; Peterson, Kurt D.; Riha, Brian D.

    2008-04-01

    A gas sampling apparatus and process is provided in which a standard crimping tool is modified by an attached collar. The collar permits operation of the crimping tool while also facilitating the introduction of a supply of gas to be introduced into a storage vial. The introduced gas supply is used to purge ambient air from a collection chamber and an interior of the sample vial. Upon completion of the purging operation, the vial is sealed using the crimping tool.

  9. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for physical sciences is its rationalization of correlations in spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range in subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations are necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas. 
Insight is gained by applying the proposed sampling procedure to realistic examples related

  10. A review of breast tomosynthesis. Part I. The image acquisition process

    SciTech Connect

    Sechopoulos, Ioannis

    2013-01-15

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have become possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to replace mammography for breast cancer screening in the future. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper reviews the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process.

  11. Squeezing through the Now-or-Never bottleneck: Reconnecting language processing, acquisition, change, and structure.

    PubMed

    Chater, Nick; Christiansen, Morten H

    2016-01-01

    If human language must be squeezed through a narrow cognitive bottleneck, what are the implications for language processing, acquisition, change, and structure? In our target article, we suggested that the implications are far-reaching and form the basis of an integrated account of many apparently unconnected aspects of language and language processing, as well as suggesting revision of many existing theoretical accounts. With some exceptions, commentators were generally supportive both of the existence of the bottleneck and its potential implications. Many commentators suggested additional theoretical and linguistic nuances and extensions, links with prior work, and relevant computational and neuroscientific considerations; some argued for related but distinct viewpoints; a few, though, felt traditional perspectives were being abandoned too readily. Our response attempts to build on the many suggestions raised by the commentators and to engage constructively with challenges to our approach.

  12. Squeezing through the Now-or-Never bottleneck: Reconnecting language processing, acquisition, change, and structure.

    PubMed

    Chater, Nick; Christiansen, Morten H

    2016-01-01

    If human language must be squeezed through a narrow cognitive bottleneck, what are the implications for language processing, acquisition, change, and structure? In our target article, we suggested that the implications are far-reaching and form the basis of an integrated account of many apparently unconnected aspects of language and language processing, as well as suggesting revision of many existing theoretical accounts. With some exceptions, commentators were generally supportive both of the existence of the bottleneck and its potential implications. Many commentators suggested additional theoretical and linguistic nuances and extensions, links with prior work, and relevant computational and neuroscientific considerations; some argued for related but distinct viewpoints; a few, though, felt traditional perspectives were being abandoned too readily. Our response attempts to build on the many suggestions raised by the commentators and to engage constructively with challenges to our approach. PMID:27561252

  13. Acquisition of material properties in production for sheet metal forming processes

    SciTech Connect

    Heingärtner, Jörg; Hora, Pavel; Neumann, Anja; Hortig, Dirk; Rencki, Yasar

    2013-12-16

    In past work, a measurement system for the in-line acquisition of material properties was developed at IVP. This system is based on the non-destructive eddy-current principle. Using this system, 100% control of the material properties of the processed material is possible. The system can be used for ferromagnetic materials such as standard steels, as well as paramagnetic materials such as aluminum and stainless steel. Used as an in-line measurement system, it can be configured as a stand-alone system to control material properties and sort out inapplicable material, or as part of a control system of the forming process. In both cases, the acquired data can be used as input data for numerical simulations, e.g. stochastic simulations based on real-world data.

  14. Real-time multi-camera video acquisition and processing platform for ADAS

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio

    2016-04-01

    The paper presents the design of a real-time and low-cost embedded system for image acquisition and processing in Advanced Driver Assistance Systems (ADAS). The system adopts a multi-camera architecture to provide a panoramic view of the objects surrounding the vehicle. Fish-eye lenses are used to achieve a large Field of View (FOV). Since they introduce radial distortion of the images projected on the sensors, a real-time algorithm for their correction is also implemented in a pre-processor. An FPGA-based hardware implementation, re-using IP macrocells for several ADAS algorithms, allows for real-time processing of input streams from VGA automotive CMOS cameras.
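
    The radial-distortion correction mentioned above can be sketched with a simple polynomial lens model and inverse mapping; the coefficients and the nearest-neighbour resampling here are illustrative assumptions, not the FPGA pre-processor described in the paper.

```python
import numpy as np

def undistort(img, k1, k2=0.0):
    """Correct radial lens distortion with a polynomial model.

    Inverse mapping: for each output (undistorted) pixel, compute the
    distorted source coordinate via r_d = r_u * (1 + k1*r_u^2 + k2*r_u^4)
    and sample the input image (nearest neighbour). k1 and k2 are
    illustrative coefficients, with radii normalised to the half-diagonal.
    """
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    norm = np.hypot(cx, cy)                       # normalise radii to ~[0, 1]
    yy, xx = np.mgrid[0:h, 0:w]
    ru = np.hypot(xx - cx, yy - cy) / norm
    scale = 1.0 + k1 * ru**2 + k2 * ru**4         # forward distortion factor
    xs = np.clip(np.round(cx + (xx - cx) * scale).astype(int), 0, w - 1)
    ys = np.clip(np.round(cy + (yy - cy) * scale).astype(int), 0, h - 1)
    return img[ys, xs]

# identity check: with zero coefficients the image is returned unchanged
img = np.arange(64, dtype=float).reshape(8, 8)
out = undistort(img, 0.05)
```

    A hardware pre-processor would typically precompute the (xs, ys) lookup table once per lens, which is what makes per-frame correction cheap enough for real time.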

  15. The Influence of Working Memory and Phonological Processing on English Language Learner Children's Bilingual Reading and Language Acquisition

    ERIC Educational Resources Information Center

    Swanson, H. Lee; Orosco, Michael J.; Lussier, Cathy M.; Gerber, Michael M.; Guzman-Orth, Danielle A.

    2011-01-01

    In this study, we explored whether the contribution of working memory (WM) to children's (N = 471) 2nd language (L2) reading and language acquisition was best accounted for by processing efficiency at a phonological level and/or by executive processes independent of phonological processing. Elementary school children (Grades 1, 2, & 3) whose 1st…

  16. Relationships among process skills development, knowledge acquisition, and gender in microcomputer-based chemistry laboratories

    NASA Astrophysics Data System (ADS)

    Krieger, Carla Repsher

    This study investigated how instruction in MBL environments can be designed to facilitate process skills development and knowledge acquisition among high school chemistry students. Ninety-eight college preparatory chemistry students in six intact classes were randomly assigned to one of three treatment groups: MBL with enhanced instruction in Macroscopic knowledge, MBL with enhanced instruction in Microscopic knowledge, and MBL with enhanced instruction in Symbolic knowledge. Each treatment group completed a total of four MBL titrations involving acids and bases. After the first and third titrations, the Macroscopic, Microscopic and Symbolic groups received enhanced instruction in the Macroscopic, Microscopic and Symbolic modes, respectively. During each titration, participants used audiotapes to record their verbal interactions. The study also explored the effects of three potential covariates (age, mathematics background, and computer usage) on the relationships among the independent variables (type of enhanced instruction and gender) and the dependent variables (science process skills and knowledge acquisition). Process skills were measured via gain scores on a standardized test. Analysis of Covariance eliminated age, mathematics background, and computer usage as covariates in this study. Analysis of Variance identified no significant effects on process skills attributable to treatment or gender. Knowledge acquisition was assessed via protocol analysis of statements made by the participants during the four titrations. Statements were categorized as procedural, observational, conceptual/analytical, or miscellaneous. Statement category percentages were analyzed for trends across treatments, genders, and experiments. Instruction emphasizing the Macroscopic mode may have increased percentages of observational and miscellaneous statements and decreased percentages of procedural and conceptual/analytical statements. 
Instruction emphasizing the Symbolic mode may have

  17. Skills Acquisition in Plantain Flour Processing Enterprises: A Validation of Training Modules for Senior Secondary Schools

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi; Nlebem, Bernard S.

    2013-01-01

    This study was to validate training modules that can help provide requisite skills for Senior Secondary school students in plantain flour processing enterprises for self-employment and to enable them pass their examination. The study covered Rivers State. Purposive sampling technique was used to select a sample size of 205. Two sets of structured…

  18. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small; finding links between samples is therefore more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses gas chromatography with flame ionization detection (GC-FID), a DB-5 column and four internal standards, and was developed to increase the recovered amount of impurities while minimizing the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. These data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention-time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
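
    A minimal sketch of the core of such a comparison, assuming peak lists rather than raw chromatograms: retention times are shifted using a shared internal-standard peak, peaks are matched within a tolerance, and the normalised areas are scored with Pearson's r. The helper and all numbers are hypothetical; the reported system performs these steps with VBA modules inside Excel.

```python
import numpy as np

def align_and_correlate(rt_a, area_a, rt_b, area_b, is_rt_a, is_rt_b, tol=0.05):
    """Compare two impurity profiles after retention-time shift correction.

    Retention times of sample B are shifted by the offset of a shared
    internal-standard peak, peaks are matched within `tol` minutes, and
    the matched (normalised) peak areas are scored with Pearson's r.
    """
    rt_b = np.asarray(rt_b, float) + (is_rt_a - is_rt_b)   # shift correction
    a, b = [], []
    for t, x in zip(rt_a, area_a):
        j = np.argmin(np.abs(rt_b - t))
        if abs(rt_b[j] - t) <= tol:                        # matched peak pair
            a.append(x)
            b.append(area_b[j])
    a, b = np.asarray(a, float), np.asarray(b, float)
    a, b = a / a.sum(), b / b.sum()                        # response normalisation
    return float(np.corrcoef(a, b)[0, 1])

# two hypothetical profiles from a common origin, B shifted by +0.10 min
rt_a = [2.10, 3.45, 5.02, 7.80]; area_a = [120, 45, 300, 80]
rt_b = [2.20, 3.55, 5.12, 7.90]; area_b = [118, 47, 295, 83]
r = align_and_correlate(rt_a, area_a, rt_b, area_b, is_rt_a=5.02, is_rt_b=5.12)
print(f"Pearson r = {r:.4f}")   # near 1 for samples with a common origin
```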

  19. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small; finding links between samples is therefore more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses gas chromatography with flame ionization detection (GC-FID), a DB-5 column and four internal standards, and was developed to increase the recovered amount of impurities while minimizing the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. These data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention-time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical

  20. Relative efficiency of Gaussian stochastic process sampling procedures

    NASA Astrophysics Data System (ADS)

    Cameron, Chris

    2003-12-01

    Various methods for sampling stationary, Gaussian stochastic processes are investigated and compared with an emphasis on applications to processes with power law energy spectra. Several approaches are considered, including a Riemann summation using left endpoints, the use of random wave numbers to sample the spectrum in proportion to the energy it contains, and a combination of the two. The Fourier-wavelet method of Elliott et al. is investigated and compared with other methods, all of which are evaluated in terms of their ability to sample the stochastic process over a large number of decades for a given computational cost. The Fourier-wavelet method has an accuracy that increases linearly with the computational complexity, while the accuracy of the other methods grows logarithmically. For the Kolmogorov spectrum, a hybrid quadrature method is as efficient as the Fourier-wavelet method, if no more than eight decades of accuracy are required. The effectiveness of this hybrid method wanes when one samples fields whose energy spectrum decays more rapidly near the origin. The Fourier-wavelet method has roughly the same behavior independently of the exponent of the power law. The Fourier-wavelet method returns samples which are Gaussian over the range of values where the structure function is well approximated. By contrast, (multi-point) Gaussianity may be lost at the smaller length scales when one uses methods with random wave numbers.
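
    The simplest of the compared approaches, a left-endpoint Riemann summation over the spectrum, can be sketched as follows. The toy exponential spectrum and all numbers are illustrative; this is not the Fourier-wavelet scheme of Elliott et al.

```python
import numpy as np

def sample_gp(x, spectrum, k_max, n_modes, rng):
    """Left-endpoint Riemann-sum spectral sampling of a stationary
    Gaussian process with one-sided energy spectrum E(k).

    u(x) = sum_j sqrt(E(k_j) dk) * (xi_j cos(k_j x) + eta_j sin(k_j x)),
    with xi_j, eta_j independent standard normals, so Var[u]
    approximates the integral of E over [0, k_max].
    """
    dk = k_max / n_modes
    k = np.arange(n_modes) * dk                   # left endpoints
    amp = np.sqrt(spectrum(k) * dk)
    xi, eta = rng.normal(size=(2, n_modes))
    return np.sum(amp * (xi * np.cos(k * x) + eta * np.sin(k * x)))

rng = np.random.default_rng(1)
E = lambda k: np.exp(-k)                          # toy spectrum, Var ~ 1 - e^-20
u = np.array([sample_gp(0.7, E, 20.0, 512, rng) for _ in range(20000)])
print(f"sample variance {u.var():.3f} (target {1 - np.exp(-20):.3f})")
```

    For a power-law spectrum the difficulty the paper addresses becomes visible here: a uniform wavenumber grid spends most of its modes where the spectrum carries little energy, which is what motivates energy-proportional random wavenumbers and the Fourier-wavelet construction.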

  1. Cognitive processes during fear acquisition and extinction in animals and humans: implications for exposure therapy of anxiety disorders.

    PubMed

    Hofmann, Stefan G

    2008-02-01

    Anxiety disorders are highly prevalent. Fear conditioning and extinction learning in animals often serve as simple models of fear acquisition and exposure therapy of anxiety disorders in humans. This article reviews the empirical and theoretical literature on cognitive processes in fear acquisition, extinction, and exposure therapy. It is concluded that exposure therapy is a form of cognitive intervention that specifically changes the expectancy of harm. Implications for therapy research are discussed.

  2. Challenging genosensors in food samples: The case of gluten determination in highly processed samples.

    PubMed

    Martín-Fernández, Begoña; de-los-Santos-Álvarez, Noemí; Martín-Clemente, Juan Pedro; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2016-01-01

    Electrochemical genosensors have undergone enormous development in the last decades, but only very few have achieved quantification of target content in highly processed food samples. The detection of allergens, and particularly gluten, is challenging because legislation establishes a threshold of 20 ppm for labeling as gluten-free, but most genosensors express their results as DNA concentration or DNA copies. This paper describes the first attempt to correlate the genosensor response with the wheat content in real samples, even in the case of highly processed food samples. A sandwich-based format was used, comprising a capture probe immobilized onto a screen-printed gold electrode and a signaling probe functionalized with fluorescein isothiocyanate (FITC), both hybridizing with the target. The hybridization event was electrochemically monitored by adding an anti-FITC peroxidase (antiFITC-HRP) and its substrate, tetramethylbenzidine. Binary model mixtures, as a reference material, and real samples have been analyzed. DNA from food was extracted and a fragment encoding the immunodominant peptide of α2-gliadin amplified by a tailored PCR. The sensor was able to selectively distinguish cereals toxic to celiac patients, such as different varieties of wheat, barley, rye and oats, from non-toxic plants. As little as 0.001% (10 mg/kg) of wheat flour in an inert matrix was reliably detected, directly competing with the current method of choice for DNA detection, real-time PCR. A good correlation with the official immunoassay was found in highly processed food samples.

  3. Mars Science Laboratory CHIMRA: A Device for Processing Powdered Martian Samples

    NASA Technical Reports Server (NTRS)

    Sunshine, Daniel

    2010-01-01

    The CHIMRA is an extraterrestrial sample acquisition and processing device for the Mars Science Laboratory that emphasizes robustness and adaptability through design configuration. This work reviews the guidelines utilized to invent the initial CHIMRA and the strategy employed in advancing the design; these principles will be discussed in relation to both the final CHIMRA design and similar future devices. The computational synthesis necessary to mature a boxed-in impact-generating mechanism will be presented alongside a detailed mechanism description. Results from the development testing required to advance the design for a highly-loaded, long-life and high-speed bearing application will be presented. Lessons learned during the assembly and testing of this subsystem as well as results and lessons from the sample-handling development test program will be reviewed.

  4. A feasibility study of a PET/MRI insert detector using strip-line and waveform sampling data acquisition

    NASA Astrophysics Data System (ADS)

    Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Wyrwicz, A. M.; Li, L.; Kao, C.-M.

    2015-06-01

    We are developing a time-of-flight Positron Emission Tomography (PET) detector by using silicon photo-multipliers (SiPM) on a strip-line and high speed waveform sampling data acquisition. In this design, multiple SiPMs are connected on a single strip-line and signal waveforms on the strip-line are sampled at two ends of the strip to reduce readout channels while fully exploiting the fast time response of SiPMs. In addition to the deposited energy and time information, the position of the hit SiPM along the strip-line is determined by the arrival time difference of the waveform. Due to the insensitivity of the SiPMs to magnetic fields and the compact front-end electronics, the detector approach is highly attractive for developing a PET insert system for a magnetic resonance imaging (MRI) scanner to provide simultaneous PET/MR imaging. To investigate the feasibility, experimental tests using prototype detector modules have been conducted inside a 9.4 T small animal MRI scanner (Bruker BioSpec 94/30 imaging spectrometer). On the prototype strip-line board, 16 SiPMs (5.2 mm pitch) are installed on two strip-lines and coupled to 2×8 LYSO scintillators (5.0×5.0×10.0 mm3 with 5.2 mm pitch). The outputs of the strip-line boards are connected to a Domino-Ring-Sampler (DRS4) evaluation board for waveform sampling. Preliminary experimental results show that the effect of interference on the MRI image due to the PET detector is negligible and that PET detector performance is comparable with the results measured outside the MRI scanner.
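
    The position recovery from the two-ended strip-line readout reduces to simple arithmetic on the arrival times. The sketch below uses the 5.2 mm pitch and 16-SiPM strip from the prototype, but the propagation speed and resulting line length are assumptions, so the numbers are illustrative only.

```python
def strip_line_hit(t1_ns, t2_ns, length_mm, v_mm_per_ns):
    """Recover hit position and event time from the two strip-line ends.

    A pulse launched at position x travels x/v to end 1 and (L - x)/v to
    end 2, so the arrival-time difference encodes the position and the
    sum encodes the event time:
        x  = (L - v*(t2 - t1)) / 2
        t0 = (t1 + t2 - L/v) / 2
    """
    dt = t2_ns - t1_ns
    x = (length_mm - v_mm_per_ns * dt) / 2.0
    t0 = (t1_ns + t2_ns - length_mm / v_mm_per_ns) / 2.0
    return x, t0

# pulse at x = 10.4 mm (third 5.2 mm pitch position) on an assumed
# 83.2 mm line (16 x 5.2 mm), propagation speed ~150 mm/ns (assumed)
L, v = 83.2, 150.0
t1 = 5.0 + 10.4 / v
t2 = 5.0 + (L - 10.4) / v
x, t0 = strip_line_hit(t1, t2, L, v)
print(f"x = {x:.1f} mm, t0 = {t0:.3f} ns")
```

    In practice the timing precision of the waveform sampler sets how finely the time difference, and hence the SiPM position along the strip, can be resolved.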

  5. A feasibility study of a PET/MRI insert detector using strip-line and waveform sampling data acquisition

    PubMed Central

    Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Wyrwicz, Alice M.; Li, Limin; Kao, C.-M.

    2014-01-01

    We are developing a time-of-flight Positron Emission Tomography (PET) detector by using silicon photo-multipliers (SiPM) on a strip-line and high speed waveform sampling data acquisition. In this design, multiple SiPMs are connected on a single strip-line and signal waveforms on the strip-line are sampled at two ends of the strip to reduce readout channels while fully exploiting the fast time response of SiPMs. In addition to the deposited energy and time information, the position of the hit SiPM along the strip-line is determined by the arrival time difference of the waveform. Due to the insensitivity of the SiPMs to magnetic fields and the compact front-end electronics, the detector approach is highly attractive for developing a PET insert system for a magnetic resonance imaging (MRI) scanner to provide simultaneous PET/MR imaging. To investigate the feasibility, experimental tests using prototype detector modules have been conducted inside a 9.4 Tesla small animal MRI scanner (Bruker BioSpec 94/30 imaging spectrometer). On the prototype strip-line board, 16 SiPMs (5.2 mm pitch) are installed on two strip-lines and coupled to 2 × 8 LYSO scintillators (5.0 × 5.0 × 10.0 mm3 with 5.2 mm pitch). The outputs of the strip-line boards are connected to a Domino-Ring-Sampler (DRS4) evaluation board for waveform sampling. Preliminary experimental results show that the effect of interference on the MRI image due to the PET detector is negligible and that PET detector performance is comparable with the results measured outside the MRI scanner. PMID:25937685

  6. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    SciTech Connect

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  7. Data acquisition and processing system of the electron cyclotron emission imaging system of the KSTAR tokamak

    SciTech Connect

    Kim, J. B.; Lee, W.; Yun, G. S.; Park, H. K.; Domier, C. W.; Luhmann, N. C. Jr.

    2010-10-15

    A new innovative electron cyclotron emission imaging (ECEI) diagnostic system for the Korean Superconducting Tokamak Advanced Research (KSTAR) device produces a large amount of data. The design of the data acquisition and processing system for the ECEI diagnostic must therefore accommodate this large data production and flow. The system design is based on a layered structure scalable to future extensions that accommodate increasing data demands. A software architecture that allows web-based monitoring of operation status, remote experiments, and data analysis is discussed. The operating software will help machine operators and users validate the acquired data promptly, prepare the next discharge, and enhance experiment performance and data analysis in a distributed environment.

  8. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from microchannel plate (MCP) based time-of-flight detectors and ionization-based energy detectors such as silicon diodes and gas ionization chambers (GIC) is introduced. The system is based on commercial digitizers and custom software. It forms part of a ToF-ERDA spectrometer, which has to be able to detect recoil atoms of many different species and energies. Compared to the currently used analogue electronics, the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software and digital pulse-processing algorithms suited to this application are described in detail.
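
    A digital timing pickoff of the kind such waveform-sampling systems use can be illustrated with a leading-edge discriminator refined by linear interpolation. This is a minimal sketch; the record does not specify the authors' actual pulse-processing algorithm, and the function name and fraction are assumptions.

```python
def crossing_time(samples, dt_ns, frac=0.3):
    """Leading-edge timing on a digitised pulse: find where the waveform
    first crosses a constant fraction of its peak amplitude, refining the
    sample index by linear interpolation between the two bracketing
    samples. Returns the crossing time in ns, or None if never crossed."""
    peak = max(samples)
    thr = frac * peak
    for i in range(1, len(samples)):
        if samples[i - 1] < thr <= samples[i]:
            # interpolate linearly between samples i-1 and i
            f = (thr - samples[i - 1]) / (samples[i] - samples[i - 1])
            return (i - 1 + f) * dt_ns
    return None
```

    For samples [0, 2, 6, 10, 8] at 1 ns spacing, the 30% threshold of 3.0 is crossed a quarter of the way between the second and third samples, giving 1.25 ns.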

  9. Horizon Acquisition for Attitude Determination Using Image Processing Algorithms- Results of HORACE on REXUS 16

    NASA Astrophysics Data System (ADS)

    Barf, J.; Rapp, T.; Bergmann, M.; Geiger, S.; Scharf, A.; Wolz, F.

    2015-09-01

    The aim of the Horizon Acquisition Experiment (HORACE) was to prove a new concept for a two-axis horizon sensor using algorithms processing ordinary images, which is also operable at the high spinning rates occurring during emergencies. The difficulty of coping with image distortions, which is avoided by conventional horizon sensors, was introduced on purpose, as we envision a system capable of using any optical data. During the flight on REXUS 16, which provided a suitable platform similar to the future application scenario, a malfunction of the payload cameras caused severe degradation of the collected scientific data. Nevertheless, with the aid of simulations we could show that the concept is accurate (±0.6°), fast (~100 ms/frame) and robust enough for coarse attitude determination during emergencies, and also applicable to small satellites. Besides, technical knowledge regarding the design of REXUS experiments, including the detection of interferences between SATA and GPS, was gained.
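
    The core of such an image-based horizon sensor can be sketched as an edge search plus a least-squares line fit: the line's slope gives roll and its offset from the image centre gives pitch. Everything below (function name, thresholding scheme, sign conventions) is hypothetical, assuming a bright sky above a dark ground, the horizon visible in every column, and a pinhole camera with a known focal length in pixels.

```python
import math

def horizon_attitude(image, threshold, focal_px):
    """Estimate (roll_deg, pitch_deg) from a horizon image given as a list
    of rows of brightness values. For every column, find the first row
    darker than `threshold` (the horizon edge), then least-squares fit
    row = a*col + b. Roll is the line slope; pitch comes from the line's
    vertical offset at the image centre scaled by the focal length."""
    rows, cols = len(image), len(image[0])
    xs, ys = [], []
    for c in range(cols):
        for r in range(rows):
            if image[r][c] < threshold:
                xs.append(c)
                ys.append(r)
                break
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    roll = math.degrees(math.atan(a))
    centre_offset = (a * (cols - 1) / 2 + b) - (rows - 1) / 2
    pitch = math.degrees(math.atan(centre_offset / focal_px))
    return roll, pitch
```

    A flat horizon exactly at the image centre row yields zero roll and zero pitch; shifting the edge down or tilting it changes pitch and roll accordingly.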

  10. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    SciTech Connect

    Castillo, S; Castillo, R; Castillo, E; Pan, T; Ibbott, G; Balter, P; Hobbs, B; Dai, J; Guerrero, T

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase-sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over the available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly, by 27% and 28% for the first and second cohorts, respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross-scan average at the 0.05 significance level. Artifact presence for both clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p-value<0.032 clinical; ρ=0.296, p-value<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p-value<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase-sorting method.
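
    Optimizing an artifact metric over candidate images with a shortest-path algorithm can be illustrated on a layered graph: one candidate image per couch position is chosen so that the summed mismatch between neighbouring positions is minimal. On such a layered DAG the Dijkstra search mentioned in the abstract reduces to plain dynamic programming; the cost values, structure, and function name below are hypothetical.

```python
def best_image_sequence(cost):
    """Pick one candidate image per couch position minimizing the summed
    transition cost between neighbours. `cost[k][i][j]` is the mismatch
    (e.g. 1 - correlation) between candidate i at position k and candidate
    j at position k+1. Returns (total_cost, chosen index per position)."""
    dist = [0.0] * len(cost[0])          # best cost ending at each candidate
    back = []                            # back-pointers per layer
    for layer in cost:
        n_next = len(layer[0])
        new = [float("inf")] * n_next
        ptr = [0] * n_next
        for j in range(n_next):
            for i in range(len(layer)):
                d = dist[i] + layer[i][j]
                if d < new[j]:
                    new[j], ptr[j] = d, i
        dist = new
        back.append(ptr)
    j = min(range(len(dist)), key=dist.__getitem__)
    path = [j]
    for ptr in reversed(back):           # walk the back-pointers home
        path.append(ptr[path[-1]])
    path.reverse()
    return dist[j], path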

  11. Automation of sample plan creation for process model calibration

    NASA Astrophysics Data System (ADS)

    Oberschmidt, James; Abdo, Amr; Desouky, Tamer; Al-Imam, Mohamed; Krasnoperova, Azalia; Viswanathan, Ramya

    2010-04-01

    The process of preparing a sample plan for optical and resist model calibration has always been tedious, not only because it must accurately represent full-chip designs with countless combinations of widths, spaces and environments, but also because of the constraints imposed by metrology, which may limit the number of structures to be measured. There are further limits on the types of these structures, mainly due to the variation in measurement accuracy across different geometries; for instance, pitch measurements are normally more accurate than corner-rounding measurements, so only certain geometrical shapes are usually considered for a sample plan. In addition, the time factor becomes crucial as we migrate from one technology node to another, owing to the growing number of development and production nodes, and the process becomes more complicated still if process-window-aware models are to be developed in a reasonable time frame; reliable methods of choosing sample plans that also help reduce cycle time are therefore needed. In this context, an automated flow is proposed for sample plan creation. Once the illumination and film stack are defined, all errors in the input data are fixed and sites are centered. Then, bad sites are excluded. Afterwards, the clean data are reduced based on geometrical resemblance. An editable database of measurement-reliable and critical structures is also provided, and their percentage in the final sample plan, as well as the total number of 1D/2D samples, can be predefined. The flow has the advantage of eliminating manual selection or filtering techniques, it provides powerful tools for customizing the final plan, and the time needed to generate these plans is greatly reduced.
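
    The "reduction based on geometrical resemblance" step can be illustrated with a greedy dedup over (width, space) descriptors: a structure is dropped when a kept representative already matches it within a tolerance. The function name, descriptor choice, and tolerance are hypothetical; real flows would use richer geometric signatures.

```python
def reduce_by_resemblance(sites, tol):
    """Collapse geometrically similar structures. `sites` is a list of
    (width, space) tuples; two sites whose descriptors agree within `tol`
    in both dimensions are treated as one, keeping the first seen."""
    kept = []
    for w, s in sites:
        if all(abs(w - kw) > tol or abs(s - ks) > tol for kw, ks in kept):
            kept.append((w, s))
    return kept
```

    A 1-nm neighbour of an already-kept 100/100 structure is dropped, while a clearly different 200/100 structure survives.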

  12. How does the interaction between spelling and motor processes build up during writing acquisition?

    PubMed

    Kandel, Sonia; Perret, Cyril

    2015-03-01

    How do we recall a word's spelling? How do we produce the movements to form the letters of a word? Writing involves several processing levels. Surprisingly, researchers have focused either on spelling or on motor production. However, these processes interact and cannot be studied separately. Spelling processes cascade into movement production. For example, in French, producing the letters PAR in the orthographically irregular word PARFUM (perfume) delays motor production with respect to the same letters in the regular word PARDON (pardon). Orthographic regularity refers to the possibility of spelling a word correctly by applying the most frequent sound-letter conversion rules. The present study examined how the interaction between spelling and motor processing builds up during writing acquisition. French 8-10 year old children participated in the experiment; this is the age at which handwriting skills start to become automatic. The children wrote regular and irregular words that could be frequent or infrequent. They wrote on a digitizer so we could collect data on latency, movement duration and fluency. The results revealed that the interaction between spelling and motor processing was already present at age 8 and became more adult-like at ages 9 and 10. Before starting to write, processing irregular words took longer than processing regular words. This processing load spread into movement production: it increased writing duration and rendered the movements more dysfluent. Word frequency affected latencies and cascaded into production; it modulated writing duration but not movement fluency. Writing infrequent words took longer than writing frequent words. The data suggest that orthographic regularity has a stronger impact on writing than word frequency, and the two factors do not cascade to the same extent.

  13. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates data acquisition and processing, and displays selected wind-profile parameters such as Doppler shift, power distribution, wind directions and velocities. The Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from the wind measurements. A general overview of the system and the DAPS, as well as the coherent Doppler lidar system, is presented in this paper.
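
    The real-time removal of aircraft effects amounts to subtracting the projection of the platform velocity onto the beam from the measured line-of-sight speed. A minimal sketch, with the function name and vector convention assumed (for a monostatic coherent lidar the Doppler shift is 2·v_los/λ):

```python
def los_wind_speed(doppler_hz, wavelength_m, aircraft_velocity, beam_unit):
    """Remove aircraft-induced Doppler from a coherent-lidar measurement.
    The raw shift is 2*(v_wind_los + v_aircraft_los)/wavelength, so the
    wind's line-of-sight speed is the measured range rate minus the
    projection of the aircraft velocity (m/s, 3-vector) onto the beam
    direction (unit 3-vector)."""
    v_ac_los = sum(a * b for a, b in zip(aircraft_velocity, beam_unit))
    return doppler_hz * wavelength_m / 2.0 - v_ac_los
```

    At the 2-micron wavelength, a 15 MHz measured shift with 5 m/s of aircraft motion along the beam leaves 10 m/s of wind.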

  14. Survival-time statistics for sample space reducing stochastic processes

    NASA Astrophysics Data System (ADS)

    Yadav, Avinash Chand

    2016-04-01

    Stochastic processes wherein the size of the state space changes as a function of time offer models for the emergence of scale-invariant features observed in complex systems. I consider such a sample-space reducing (SSR) stochastic process that results in a random sequence of strictly decreasing integers {x(t)}, 0 ≤ t ≤ τ, with boundary conditions x(0) = N and x(τ) = 1. This model is shown to be exactly solvable: P_N(τ), the probability that the process survives for time τ, is analytically evaluated. In the limit of large N, the asymptotic form of this probability distribution is Gaussian, with mean and variance both varying logarithmically with system size: ⟨τ⟩ ~ ln N and σ_τ² ~ ln N. A correspondence can be made between survival-time statistics in the SSR process and record statistics of independent and identically distributed random variables.
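
    The SSR process is easy to simulate: from x, jump to an integer drawn uniformly from {1, ..., x-1} until 1 is reached. For this uniform-jump variant the exact mean survival time is the harmonic number H_{N-1} ≈ ln N + 0.577, matching the logarithmic growth stated above. A minimal sketch (function names are ours):

```python
import random

def ssr_survival_time(n, rng):
    """One realisation of the sample-space reducing process: starting from
    x = n, draw the next state uniformly from {1, ..., x-1} until x = 1.
    Returns the number of jumps (the survival time tau)."""
    x, tau = n, 0
    while x > 1:
        x = rng.randint(1, x - 1)  # inclusive bounds
        tau += 1
    return tau

def mean_survival(n, trials, seed=1):
    """Monte Carlo estimate of <tau> over `trials` realisations."""
    rng = random.Random(seed)
    return sum(ssr_survival_time(n, rng) for _ in range(trials)) / trials
```

    For N = 1000 the estimate should sit near H_999 ≈ 7.48, i.e. close to ln 1000 + 0.577.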

  15. Properties of large-scale melt-processed YBCO samples

    NASA Astrophysics Data System (ADS)

    Gauss, S.; Elschner, S.; Bestgen, H.

    Magnetic bearings and superconducting permanent magnets are some of the first possible applications of bulk high-Tc superconductors. Large samples were prepared by a new melt process starting from reacted YBCO 123 and 211 powders. The addition of PtO₂ to the mixture led to reduced 211 inclusion size and better homogeneity. Simultaneously, the density of microcracks dividing the a-b basal plane was reduced. For testing the overall magnetic properties of these samples, magnetization and levitation force measurements were performed. In comparison to samples without PtO₂ addition, a strong increase in the magnetization M and in the repulsion force from a magnet was observed. The maximum in the field dependence of M increased to more than 1000 G. According to the time dependence of the trapped field after a field-cooling experiment, an acceptable flux creep at 77 K for long-term application was achieved.

  16. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
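
    The idea of prescribing the sampling density per category while keeping the grid-box average unbiased can be sketched with a stratified estimator. The interface below is invented, and two hypothetical categories stand in for the paper's eight; the per-category means are weighted by the true category probabilities, so the prescribed effort fractions only steer where the variance is spent.

```python
def importance_estimate(rng, categories, n_samples):
    """Estimate a grid-box-averaged process rate. Each entry of
    `categories` holds:
      'p'    - the category's true probability within the grid box,
      'q'    - the prescribed fraction of sample points spent there,
      'draw' - function(rng) returning one process-rate sample inside it.
    Category means are weighted by p, so the estimate stays unbiased no
    matter how the effort fractions q are chosen."""
    total = 0.0
    for cat in categories:
        n_k = max(1, round(cat["q"] * n_samples))   # points for this category
        mean_k = sum(cat["draw"](rng) for _ in range(n_k)) / n_k
        total += cat["p"] * mean_k
    return total
```

    Drawing more points from, say, the rain-evaporation region then simply means raising that category's `q` without touching `p`.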

  17. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  18. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking-water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less well known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to outdated methodology and a poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at different expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  19. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking-water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less well known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to outdated methodology and a poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at different expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  20. Large sample hydrology in NZ: Spatial organisation in process diagnostics

    NASA Astrophysics Data System (ADS)

    McMillan, H. K.; Woods, R. A.; Clark, M. P.

    2013-12-01

    A key question in hydrology is how to predict the dominant runoff generation processes in any given catchment. This knowledge is vital for a range of applications in forecasting hydrological response and related processes such as nutrient and sediment transport. A step towards this goal is to map dominant processes in locations where data is available. In this presentation, we use data from 900 flow gauging stations and 680 rain gauges in New Zealand, to assess hydrological processes. These catchments range in character from rolling pasture, to alluvial plains, to temperate rainforest, to volcanic areas. By taking advantage of so many flow regimes, we harness the benefits of large-sample and comparative hydrology to study patterns and spatial organisation in runoff processes, and their relationship to physical catchment characteristics. The approach we use to assess hydrological processes is based on the concept of diagnostic signatures. Diagnostic signatures in hydrology are targeted analyses of measured data which allow us to investigate specific aspects of catchment response. We apply signatures which target the water balance, the flood response and the recession behaviour. We explore the organisation, similarity and diversity in hydrological processes across the New Zealand landscape, and how these patterns change with scale. We discuss our findings in the context of the strong hydro-climatic gradients in New Zealand, and consider the implications for hydrological model building on a national scale.
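
    Two of the signature families mentioned, the water balance and the recession behaviour, can be illustrated with toy implementations. The exact signatures used in the study are not specified here, so these are generic textbook forms with invented names:

```python
def runoff_ratio(flow_mm, rain_mm):
    """Water-balance signature: fraction of rainfall leaving as streamflow
    over the record (both series in the same depth units, e.g. mm)."""
    return sum(flow_mm) / sum(rain_mm)

def recession_constant(flow):
    """Recession signature: assuming Q(t+1) = k * Q(t) on falling limbs,
    return the median step ratio over strictly decreasing steps as a
    robust estimate of k."""
    ratios = sorted(q2 / q1 for q1, q2 in zip(flow, flow[1:]) if q2 < q1)
    mid = len(ratios) // 2
    if len(ratios) % 2:
        return ratios[mid]
    return 0.5 * (ratios[mid - 1] + ratios[mid])
```

    A catchment whose flow halves each step has k = 0.5; mapping such values across many gauges is what exposes the spatial organisation discussed above.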

  1. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    DOE PAGES

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but is not repairable in case of malfunction (its modules are not manufactured anymore). The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given. The results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  2. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    SciTech Connect

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but is not repairable in case of malfunction (its modules are not manufactured anymore). The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given. The results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  3. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview of the development of a data acquisition and processing system is presented for a pulsed, 2-micron coherent Doppler Lidar system located in NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system as well as the control software is reviewed and its requirements and unique features are discussed.

  4. The Earth Microbiome Project: Meeting report of the "1st EMP meeting on sample selection and acquisition" at Argonne National Laboratory October 6th 2010.

    PubMed

    Gilbert, Jack A; Meyer, Folker; Jansson, Janet; Gordon, Jeff; Pace, Norman; Tiedje, James; Ley, Ruth; Fierer, Noah; Field, Dawn; Kyrpides, Nikos; Glöckner, Frank-Oliver; Klenk, Hans-Peter; Wommack, K Eric; Glass, Elizabeth; Docherty, Kathryn; Gallery, Rachel; Stevens, Rick; Knight, Rob

    2010-12-25

    This report details the outcome of the first meeting of the Earth Microbiome Project to discuss sample selection and acquisition. The meeting, held at the Argonne National Laboratory on Wednesday, October 6th, 2010, focused on how to prioritize environmental samples for sequencing and metagenomic analysis as part of the global effort of the EMP to systematically determine the functional and phylogenetic diversity of microbial communities across the world.

  5. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion-barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards of overthrust exploration. However, the seismic data recorded in many overthrust areas are disappointingly poor. Challenges such as rough topography, complex subsurface structure, the presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results showing various swath geometries will be presented. We will also show recent examples of processing methods which have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  6. Processing of syllable stress is functionally different from phoneme processing and does not profit from literacy acquisition.

    PubMed

    Schild, Ulrike; Becker, Angelika B C; Friedrich, Claudia K

    2014-01-01

    Speech is characterized by phonemes and prosody. Neurocognitive evidence supports the separate processing of each type of information. Therefore, one might suggest individual development of both pathways. In this study, we examine literacy acquisition in middle childhood. Children become aware of the phonemes in speech at that time and refine phoneme processing when they acquire an alphabetic writing system. We test whether an enhanced sensitivity to phonemes in middle childhood extends to other aspects of the speech signal, such as prosody. To investigate prosodic processing, we used stress priming. Spoken stressed and unstressed syllables (primes) preceded spoken German words with stress on the first syllable (targets). We orthogonally varied stress overlap and phoneme overlap between the primes and onsets of the targets. Lexical decisions and Event-Related Potentials (ERPs) for the targets were obtained for pre-reading preschoolers, reading pupils and adults. The behavioral and ERP results were largely comparable across all groups. The fastest responses were observed when the first syllable of the target word shared stress and phonemes with the preceding prime. ERP stress priming and ERP phoneme priming started 200 ms after the target word onset. Bilateral ERP stress priming was characterized by enhanced ERP amplitudes for stress overlap. Left-lateralized ERP phoneme priming replicates previously observed reduced ERP amplitudes for phoneme overlap. Groups differed in the strength of the behavioral phoneme priming and in the late ERP phoneme priming effect. The present results show that enhanced phonological processing in middle childhood is restricted to phonemes and does not extend to prosody. These results are indicative of two parallel processing systems for phonemes and prosody that might follow different developmental trajectories in middle childhood as a function of alphabetic literacy. PMID:24917838

  7. A generic model for data acquisition: Connectionist methods of information processing

    NASA Astrophysics Data System (ADS)

    Ehrlich, Jacques

    1993-06-01

    EDDAKS (Event Driven Data Acquisition Kernel System), intended for the quality control of products created in industrial production processes, is proposed. It is capable of acquiring information about discrete-event systems by synchronizing to them via events. EDDAKS consists of EdObjects, forming a hierarchy, which react to EdEvents and perform processing operations on messages. The hierarchy of EdObjects consists (from the bottom up) of the Sensor, the Phase, the Extracter, the Dynamic Spreadsheet, and EDDAKS itself. The first three levels contribute to building the internal representation: a state vector characterizing a product in the course of production. The Dynamic Spreadsheet is a parameterizable processing structure used to perform calculations on a set of internal representations in order to deliver the external representation to the user. A system intended for quality control of the products delivered by a concrete production plant was generated by EDDAKS and used to validate the approach. Processing methods using the multilayer perceptron model were also considered, and two contributions aimed at improving the performance of this network are proposed. One consists of implanting a conjugate-gradient method; its effectiveness depends on the determination of an optimum gradient step, which is efficiently calculated by a linear search using a secant algorithm. The other is intended to reduce the connectivity of the network by adapting it to the problem to be solved. It consists of identifying links having little or no activity and destroying them; this activity is determined by evaluating the covariance between each of the inputs of a cell and its output. An experiment in which nonlinear prediction is applied to a civil engineering problem is described.
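
    The covariance-based pruning idea can be sketched for a single unit as follows. The function name and threshold rule are assumptions, and the original applies this inside a multilayer perceptron during training; here a batch of input vectors and matching scalar outputs stands in for one cell's activity.

```python
def prune_by_covariance(inputs, outputs, weights, threshold):
    """Zero the weights of low-activity connections of one unit: for every
    input line, estimate the covariance between that input and the unit's
    output over a batch, and drop the connection when |cov| falls below
    `threshold`. `inputs` is a list of input vectors, `outputs` the
    matching scalar outputs, `weights` one weight per input line."""
    n = len(inputs)
    my = sum(outputs) / n
    pruned = list(weights)
    for j in range(len(weights)):
        mx = sum(x[j] for x in inputs) / n
        cov = sum((x[j] - mx) * (y - my) for x, y in zip(inputs, outputs)) / n
        if abs(cov) < threshold:
            pruned[j] = 0.0   # link shows little or no activity
    return pruned
```

    An input line that tracks the output keeps its weight, while a constant (hence zero-covariance) line is severed.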

  8. Experiment kits for processing biological samples inflight on SLS-2

    NASA Technical Reports Server (NTRS)

    Savage, P. D.; Hinds, W. E.; Jaquez, R.; Evans, J.; Dubrovin, L.

    1995-01-01

    This paper describes the development of an innovative, modular approach to packaging the instruments used to obtain and preserve the inflight rodent tissue and blood samples associated with hematology experiments on the Spacelab Life Sciences-2 (SLS-2) mission. The design approach organized the multitude of instruments into 5- x 6- x 1-in. kits, each used for a particular experiment. Each kit contained the syringes, vials, microscope slides, etc., necessary for processing and storing blood and tissue samples for one rat on a particular day. A total of 1245 components, packaged into 128 kits and stowed in 17 Zero® boxes, were required. Crewmembers found the design easy to use and laid out in a logical, simple configuration which minimized the chance of error during the complex procedures in flight. This paper also summarizes the inflight performance of the kits on SLS-2.

  9. Developmental trends in auditory processing can provide early predictions of language acquisition in young infants.

    PubMed

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R; Shao, Jie; Lozoff, Betsy

    2013-03-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with both Auditory Brainstem Response (ABR) and language assessments. At 6 weeks and/or 9 months of age, the infants underwent ABR testing using both a standard hearing screening protocol with 30 dB clicks and a second protocol using click pairs separated by 8, 16, and 64-ms intervals presented at 80 dB. We evaluated the effects of interval duration on ABR latency and amplitude elicited by the second click. At 9 months, language development was assessed via parent report on the Chinese Communicative Development Inventory - Putonghua version (CCDI-P). Wave V latency z-scores of the 64-ms condition at 6 weeks showed strong direct relationships with Wave V latency in the same condition at 9 months. More importantly, shorter Wave V latencies at 9 months showed strong relationships with the CCDI-P composite consisting of phrases understood, gestures, and words produced. Likewise, infants who had greater decreases in Wave V latencies from 6 weeks to 9 months had higher CCDI-P composite scores. Females had higher language development scores and shorter Wave V latencies at both ages than males. Interestingly, when the ABR Wave V latencies at both ages were taken into account, the direct effects of gender on language disappeared. In conclusion, these results support the importance of low-level auditory processing capabilities for early language acquisition in a population of typically developing young infants. Moreover, the auditory brainstem response in this paradigm shows promise as an electrophysiological marker to predict individual differences in language development in young children. PMID:23432827

  10. Adaptive Sampling for Learning Gaussian Processes Using Mobile Sensor Networks

    PubMed Central

    Xu, Yunfei; Choi, Jongeun

    2011-01-01

    This paper presents a novel class of self-organizing sensing agents that adaptively learn an anisotropic, spatio-temporal Gaussian process using noisy measurements and move in order to improve the quality of the estimated covariance function. This approach is based on a class of anisotropic covariance functions of Gaussian processes introduced to model a broad range of spatio-temporal physical phenomena. The covariance function is assumed to be unknown a priori. Hence, it is estimated by the maximum a posteriori probability (MAP) estimator. The prediction of the field of interest is then obtained based on the MAP estimate of the covariance function. An optimal sampling strategy is proposed to minimize the information-theoretic cost function of the Fisher Information Matrix. Simulation results demonstrate the effectiveness and the adaptability of the proposed scheme. PMID:22163785
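
A minimal sketch of the covariance-function estimation step, assuming a squared-exponential anisotropic family and a flat hyperparameter prior (under which the MAP estimate reduces to maximum likelihood); the names and the grid-search strategy are illustrative, not the paper's implementation:

```python
import numpy as np

def sq_exp_cov(X1, X2, lengthscales, sigma_f):
    """Anisotropic squared-exponential covariance (a stand-in for the
    paper's spatio-temporal covariance family)."""
    d = (X1[:, None, :] - X2[None, :, :]) / lengthscales
    return sigma_f**2 * np.exp(-0.5 * np.sum(d**2, axis=-1))

def log_marginal_likelihood(X, y, lengthscales, sigma_f, sigma_n):
    """Gaussian-process log marginal likelihood via a Cholesky solve."""
    K = sq_exp_cov(X, X, lengthscales, sigma_f) + sigma_n**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(X) * np.log(2 * np.pi))

def map_estimate(X, y, candidate_ls, sigma_f=1.0, sigma_n=0.1):
    """Grid-search estimate of the length-scale; with a flat prior the
    MAP estimate coincides with the maximum-likelihood one."""
    scores = [log_marginal_likelihood(X, y, np.full(X.shape[1], ls),
                                      sigma_f, sigma_n)
              for ls in candidate_ls]
    return candidate_ls[int(np.argmax(scores))]
```

The prediction of the field would then use the covariance evaluated at the selected hyperparameters.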

  11. An underground tale: contribution of microbial activity to plant iron acquisition via ecological processes

    PubMed Central

    Jin, Chong Wei; Ye, Yi Quan; Zheng, Shao Jian

    2014-01-01

    Background Iron (Fe) deficiency in crops is a worldwide agricultural problem. Plants have evolved several strategies to enhance Fe acquisition, but increasing evidence has shown that the intrinsic plant-based strategies alone are insufficient to avoid Fe deficiency in Fe-limited soils. Soil micro-organisms also play a critical role in plant Fe acquisition; however, the mechanisms behind their promotion of Fe acquisition remain largely unknown. Scope This review focuses on the possible mechanisms underlying the promotion of plant Fe acquisition by soil micro-organisms. Conclusions Fe-deficiency-induced root exudates alter the microbial community in the rhizosphere by modifying the physicochemical properties of soil, and/or by their antimicrobial and/or growth-promoting effects. The altered microbial community may in turn benefit plant Fe acquisition via production of siderophores and protons, both of which improve Fe bioavailability in soil, and via hormone generation that triggers the enhancement of Fe uptake capacity in plants. In addition, symbiotic interactions between micro-organisms and host plants could also enhance plant Fe acquisition, possibly including: rhizobium nodulation enhancing plant Fe uptake capacity and mycorrhizal fungal infection enhancing root length and the nutrient acquisition area of the root system, as well as increasing the production of Fe3+ chelators and protons. PMID:24265348

  12. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  13. The Effects of Word Exposure Frequency and Elaboration of Word Processing on Incidental L2 Vocabulary Acquisition through Reading

    ERIC Educational Resources Information Center

    Eckerth, Johannes; Tavakoli, Parveneh

    2012-01-01

    Research on incidental second language (L2) vocabulary acquisition through reading has claimed that repeated encounters with unfamiliar words and the relative elaboration of processing these words facilitate word learning. However, so far both variables have been investigated in isolation. To help close this research gap, the current study…

  14. The Comparative Effects of Processing Instruction and Dictogloss on the Acquisition of the English Passive by Speakers of Turkish

    ERIC Educational Resources Information Center

    Uludag, Onur; Vanpatten, Bill

    2012-01-01

    The current study presents the results of an experiment investigating the effects of processing instruction (PI) and dictogloss (DG) on the acquisition of the English passive voice. Sixty speakers of Turkish studying English at university level were assigned to three groups: one receiving PI, the other receiving DG and the third serving as a…

  15. Combining Contextual and Morphemic Cues Is Beneficial during Incidental Vocabulary Acquisition: Semantic Transparency in Novel Compound Word Processing

    ERIC Educational Resources Information Center

    Brusnighan, Stephen M.; Folk, Jocelyn R.

    2012-01-01

    In two studies, we investigated how skilled readers use contextual and morphemic information in the process of incidental vocabulary acquisition during reading. In Experiment 1, we monitored skilled readers' eye movements while they silently read sentence pairs containing novel and known English compound words that were either semantically…

  16. The Acceleration of Spoken-Word Processing in Children's Native-Language Acquisition: An ERP Cohort Study

    ERIC Educational Resources Information Center

    Ojima, Shiro; Matsuba-Kurita, Hiroko; Nakamura, Naoko; Hagiwara, Hiroko

    2011-01-01

    Healthy adults can identify spoken words at a remarkable speed, by incrementally analyzing word-onset information. It is currently unknown how this adult-level speed of spoken-word processing emerges during children's native-language acquisition. In a picture-word mismatch paradigm, we manipulated the semantic congruency between picture contexts…

  17. Uav Photogrammetry with Oblique Images: First Analysis on Data Acquisition and Processing

    NASA Astrophysics Data System (ADS)

    Aicardi, I.; Chiabrando, F.; Grasso, N.; Lingua, A. M.; Noardo, F.; Spanò, A.

    2016-06-01

    In recent years, many studies have revealed the advantages of using airborne oblique images to obtain improved 3D city models (e.g. including façades and building footprints). The data have usually been acquired by expensive airborne cameras installed on traditional aerial platforms. The purpose of this paper is to evaluate the possibility of acquiring and using oblique images, obtained by a UAV (Unmanned Aerial Vehicle) carrying traditional COTS (Commercial Off-the-Shelf) digital cameras (more compact and lighter than the devices generally used), for the 3D reconstruction of a historical building and the realization of a high-level-of-detail architectural survey. The critical issues of acquisition from a common UAV (flight planning strategies, ground control point and check point distribution and measurement, etc.) are described. Another important aspect considered was whether such systems can serve as low-cost methods for obtaining complete information from an aerial point of view in emergency situations or, as in the present paper, in the cultural heritage field. The data processing was realized using an SfM-based approach for point cloud generation: different dense image-matching algorithms implemented in several commercial and open-source software packages were tested. The achieved results are analysed and the discrepancies from reference LiDAR data are computed for a final evaluation. The system was tested on the S. Maria Chapel, part of the Novalesa Abbey (Italy).
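
The LiDAR-discrepancy evaluation mentioned above is commonly computed as cloud-to-cloud nearest-neighbour distances; a brute-force numpy sketch (not the authors' tooling):

```python
import numpy as np

def cloud_discrepancy(cloud, reference):
    """Nearest-neighbour distance from each evaluated point to a
    reference cloud (e.g. the LiDAR survey), a common cloud-to-cloud
    comparison. O(N*M) pairwise distances -- fine for small clouds;
    large clouds would use a spatial index instead."""
    diff = cloud[:, None, :] - reference[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1)).min(axis=1)
    return d.mean(), d.max()
```

Summary statistics (mean, max, or RMS) of these distances are then reported as the discrepancy of the photogrammetric model.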

  18. Micro-MRI-based image acquisition and processing system for assessing the response to therapeutic intervention

    NASA Astrophysics Data System (ADS)

    Vasilić, B.; Ladinsky, G. A.; Saha, P. K.; Wehrli, F. W.

    2006-03-01

    Osteoporosis is the cause of over 1.5 million bone fractures annually. Most of these fractures occur in sites rich in trabecular bone, a complex network of bony struts and plates found throughout the skeleton. The three-dimensional structure of the trabecular bone network significantly determines mechanical strength and thus fracture resistance. Here we present a data acquisition and processing system that allows efficient noninvasive assessment of trabecular bone structure through a "virtual bone biopsy". High-resolution MR images are acquired from which the trabecular bone network is extracted by estimating the partial bone occupancy of each voxel. A heuristic voxel subdivision increases the effective resolution of the bone volume fraction map and serves as a basis for subsequent analysis of topological and orientational parameters. Semi-automated registration and segmentation ensure selection of the same anatomical location in subjects imaged at different time points during treatment. It is shown, with excerpts from an ongoing clinical study of early post-menopausal women, that a significant reduction in network connectivity occurs in the control group while structural integrity is maintained in the hormone replacement group. The system described should be suited for large-scale studies designed to evaluate the efficacy of therapeutic intervention in subjects with metabolic bone disease.
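
The per-voxel partial bone occupancy estimate can be sketched as linear mixing between reference intensities; the calibration values and function name here are hypothetical, not taken from the paper:

```python
import numpy as np

def bone_volume_fraction(image, i_bone, i_marrow):
    """Map voxel intensities to partial bone occupancy (bone volume
    fraction) by linear mixing between a pure-bone and a pure-marrow
    reference intensity. In MRI, bone appears dark relative to marrow,
    so occupancy rises as intensity falls. Calibration values are
    assumed known for this sketch."""
    bvf = (i_marrow - image.astype(float)) / (i_marrow - i_bone)
    return np.clip(bvf, 0.0, 1.0)  # occupancy is a fraction in [0, 1]
```

Topological and orientational parameters would then be computed on this fractional-occupancy map rather than on a hard segmentation.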

  19. Meteoceanographic premises for structural design purposes in the Adriatic Sea: Acquisition and processing of data

    SciTech Connect

    Rampolli, M.; Biancardi, A.; Filippi, G. De

    1996-12-31

    In 1993 the leading international standards (ISO, API RP 2A) for the design of offshore structures drastically changed the procedure for the definition of hydrodynamic forces. In particular, oil companies are required to have a detailed knowledge of the weather in the areas where they operate if they want to maintain the previous results. Alternatively, more conservative hydrodynamic forces must be considered in the design phase. Such an increase, estimated at 20-30% of the total hydrodynamic force, means heavier platform structures in new projects and more critical elements to be inspected in existing platforms. In 1992, in order to have more reliable and safe transports to and from the platforms, Agip installed a meteo-marine sensor network in the Adriatic Sea, on 13 of the over 80 producing platforms. Data collected are sent to shore via radio, and operators can use real-time data or a 12-hour wave forecast obtained by a statistical forecasting model. Taking advantage of these existing instruments, a project was undertaken in 1993 with the purpose of determining the extreme environmental parameters to be used by structural engineers. The network has been upgraded in order to achieve directional information on the waves and to permit short-term analysis. This paper describes the data acquisition system, the data processing, and the achieved results.

  20. Immunological processes underlying the slow acquisition of humoral immunity to malaria.

    PubMed

    Ryg-Cornejo, Victoria; Ly, Ann; Hansen, Diana S

    2016-02-01

    Malaria is one of the most serious infectious diseases with ~250 million clinical cases annually. Most cases of severe disease are caused by Plasmodium falciparum. The blood stage of Plasmodium parasite is entirely responsible for malaria-associated pathology. Disease syndromes range from fever to more severe complications, including respiratory distress, metabolic acidosis, renal failure, pulmonary oedema and cerebral malaria. The most susceptible population to severe malaria is children under the age of 5, with low levels of immunity. It is only after many years of repeated exposure, that individuals living in endemic areas develop clinical immunity. This form of protection does not result in sterilizing immunity but prevents clinical episodes by substantially reducing parasite burden. Naturally acquired immunity predominantly targets blood-stage parasites and it is known to require antibody responses. A large body of epidemiological evidence suggests that antibodies to Plasmodium antigens are inefficiently generated and rapidly lost in the absence of ongoing exposure, which suggests a defect in the development of B cell immunological memory. This review summarizes the main findings to date contributing to our understanding on cellular processes underlying the slow acquisition of humoral immunity to malaria. Some of the key outstanding questions in the field are discussed.

  1. Professional identity acquisition process model in interprofessional education using structural equation modelling: 10-year initiative survey.

    PubMed

    Kururi, Nana; Tozato, Fusae; Lee, Bumsuk; Kazama, Hiroko; Katsuyama, Shiori; Takahashi, Maiko; Abe, Yumiko; Matsui, Hiroki; Tokita, Yoshiharu; Saitoh, Takayuki; Kanaizumi, Shiomi; Makino, Takatoshi; Shinozaki, Hiromitsu; Yamaji, Takehiko; Watanabe, Hideomi

    2016-01-01

    The mandatory interprofessional education (IPE) programme at Gunma University, Japan, was initiated in 1999. A questionnaire of 10 items to assess the students' understanding of the IPE training programme has been distributed since then, and the factor analysis of the responses revealed that it was categorised into four subscales, i.e. "professional identity", "structure and function of training facilities", "teamwork and collaboration", and "role and responsibilities", and suggested that these may take into account the development of IPE programme with clinical training. The purpose of this study was to examine the professional identity acquisition process (PIAP) model in IPE using structural equation modelling (SEM). Overall, 1,581 respondents of a possible 1,809 students from the departments of nursing, laboratory sciences, physical therapy, and occupational therapy completed the questionnaire. The SEM technique was utilised to construct a PIAP model on the relationships among four factors. The original PIAP model showed that "professional identity" was predicted by two factors, namely "role and responsibilities" and "teamwork and collaboration". These two factors were predicted by the factor "structure and function of training facilities". The same structure was observed in nursing and physical therapy students' PIAP models, but it was not completely the same in laboratory sciences and occupational therapy students' PIAP models. A parallel but not isolated curriculum on expertise unique to the profession, which may help to understand their professional identity in combination with learning the collaboration, may be necessary. PMID:26930464

  3. Sensor Acquisition for Water Utilities: Survey, Down Selection Process, and Technology List

    SciTech Connect

    Alai, M; Glascoe, L; Love, A; Johnson, M; Einfeld, W

    2005-06-29

    The early detection of the biological and chemical contamination of water distribution systems is a necessary capability for securing the nation's water supply. Current and emerging early-detection technology capabilities and shortcomings need to be identified and assessed to provide government agencies and water utilities with an improved methodology for assessing the value of installing these technologies. The Department of Homeland Security (DHS) has tasked a multi-laboratory team to evaluate current and future needs to protect the nation's water distribution infrastructure by supporting an objective evaluation of current and new technologies. The LLNL deliverable from this Operational Technology Demonstration (OTD) was to assist the development of a technology acquisition process for a water distribution early warning system. The technology survey includes a review of previous sensor surveys and current test programs and a compiled database of relevant technologies. In the survey paper we discuss previous efforts by governmental agencies, research organizations, and private companies. We provide a survey of previous sensor studies with regard to the use of Early Warning Systems (EWS) that includes earlier surveys, testing programs, and response studies. The list of sensor technologies was ultimately developed to assist in the recommendation of candidate technologies for laboratory and field testing. A set of recommendations for future sensor selection efforts has been appended to this document, as has a down selection example for a hypothetical water utility.

  4. Parameter identification of process simulation models as a means for knowledge acquisition and technology transfer

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Ifanti, Konstantina

    2012-12-01

    Process simulation models are usually empirical, therefore there is an inherent difficulty in serving as carriers for knowledge acquisition and technology transfer, since their parameters have no physical meaning to facilitate verification of the dependence on the production conditions; in such a case, a 'black box' regression model or a neural network might be used to simply connect input-output characteristics. In several cases, scientific/mechanismic models may be proved valid, in which case parameter identification is required to find out the independent/explanatory variables and parameters, which each parameter depends on. This is a difficult task, since the phenomenological level at which each parameter is defined is different. In this paper, we have developed a methodological framework under the form of an algorithmic procedure to solve this problem. The main parts of this procedure are: (i) stratification of relevant knowledge in discrete layers immediately adjacent to the layer that the initial model under investigation belongs to, (ii) design of the ontology corresponding to these layers, (iii) elimination of the less relevant parts of the ontology by thinning, (iv) retrieval of the stronger interrelations between the remaining nodes within the revised ontological network, and (v) parameter identification taking into account the most influential interrelations revealed in (iv). The functionality of this methodology is demonstrated by quoting two representative case examples on wastewater treatment.

  5. The OSIRIS-REx Mission Sample Site Selection Process

    NASA Astrophysics Data System (ADS)

    Beshore, Edward C.; Lauretta, Dante

    2014-11-01

    In September of 2016, the OSIRIS-REx (Origins, Spectral Interpretation, Resource Identification, Security, REgolith eXplorer) spacecraft will depart for asteroid (101955) Bennu, and in doing so, will turn an important corner in the exploration of the solar system. After arriving at Bennu in the fall of 2018, OSIRIS-REx will undertake a program of observations designed to select a site suitable for retrieving a sample that will be returned to the Earth in 2023. The third mission in NASA’s New Frontiers program, OSIRIS-REx will return over 60 grams from Bennu’s surface.OSIRIS-REx is unique because the science team will have an operational role to play in preparing data products needed to select a sample site. These include products used to ensure flight system safety — topographic maps and shape models, temperature measurements, maps of hazards — as well as assessments of sampleability and science value. The timing and production of these will be presented, as will the high-level decision-making tools and processes for the interim and final site selection processes.

  6. Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.

    PubMed

    Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar

    2016-02-01

    In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real-time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [(18)F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼ 500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5 × higher for the LS- and ML-based CPEEA
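
A minimal sketch of the COG-based position/energy estimation and a simple coincidence search, under assumed data layouts (the actual DAPA operates on dSiPM detector raw data with far richer structure):

```python
import numpy as np

def cog_position_energy(pixel_counts, pixel_x, pixel_y):
    """Center-of-gravity (Anger-logic) estimate of a scintillation
    event's position on a detector tile; energy is taken here as the
    total photon count across pixels."""
    energy = float(pixel_counts.sum())
    x = float((pixel_counts * pixel_x).sum()) / energy
    y = float((pixel_counts * pixel_y).sum()) / energy
    return x, y, energy

def find_coincidences(singles_a, singles_b, window_ps=500.0):
    """Pair singles from two detectors whose timestamps (in ps) fall
    within the coincidence window. Assumes sorted timestamp lists;
    a two-pointer sweep keeps this linear in the number of singles."""
    pairs, j = [], 0
    for i, ta in enumerate(singles_a):
        while j < len(singles_b) and singles_b[j] < ta - window_ps:
            j += 1
        if j < len(singles_b) and abs(singles_b[j] - ta) <= window_ps:
            pairs.append((i, j))
    return pairs
```

The LS- and ML-based CPEEAs replace the weighted average with a fit against per-crystal light-distribution models, which is where their extra sensitivity comes from.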

  7. Wearable system for acquisition, processing and storage of the signal from amperometric glucose sensors.

    PubMed

    Fabietti, P G; Massi Benedetti, M; Bronzo, F; Reboldi, G P; Sarti, E; Brunetti, P

    1991-03-01

    A wearable device for the acquisition, processing and storage of the signal from needle-type glucose sensors has been designed and developed as part of a project aimed at developing a portable artificial pancreas. The device is essential to assess the operational characteristics of miniaturized sensors in vivo. It can be connected to sensors operating at a constant potential of 0.65 V and generating currents on the order of 10^-9 A. It is screened and equipped with filters that permit data recording and processing even in the presence of electrical noise. It can operate with sensors with different characteristics (1-200 nA full scale). The device has been designed to be worn by patients, so its weight and size have been kept to a minimum (250 g; 8.5 x 14.5 x 3.5 cm). It is powered by rechargeable Ni/Cd batteries allowing continuous operation for 72 h. The electronics consists of an analog card with operational amplifiers, and a digital one with a microprocessor (Intel 80C196, MCS-96 class, with internal 16-bit CPU supporting programs written in either C or Assembler language), a 32 Kb EPROM, and an 8 Kb RAM where the data are stored. The microprocessor can run at either 5 or 10 MHz and features on-chip peripherals: an analog/digital (A/D) converter, a serial port (used to transfer data to a Personal Computer at the end of the 72 h), high-speed input-output (I/O) units, and two timers. The device is programmed and prepared to operate by means of a second hand-held unit equipped with an LCD display and a 16-key numeric pad. (ABSTRACT TRUNCATED AT 250 WORDS)
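
The ranging described (1-200 nA full scale) implies a simple scaling from raw A/D code to sensor current; a sketch assuming a 10-bit converter (the converter's resolution is not stated in the abstract, so that parameter is hypothetical):

```python
def adc_to_current_nA(code, full_scale_nA, n_bits=10):
    """Convert a raw ADC code to sensor current in nA for the selected
    range (1-200 nA full scale, per the device description). Assumes a
    linear transimpedance front end spanning the full ADC range;
    n_bits=10 is an illustrative assumption."""
    return code / float(2 ** n_bits - 1) * full_scale_nA
```

Selecting a smaller full-scale range trades measurable span for finer current resolution per ADC step.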

  8. The influence of the microscope lamp filament colour temperature on the process of digital images of histological slides acquisition standardization

    PubMed Central

    2014-01-01

    Background The aim of this study is to compare digital images of tissue biopsies captured with an optical microscope using the bright-field technique under various light conditions. The range of colour variation in tissue samples immunohistochemically stained with 3,3'-Diaminobenzidine and Haematoxylin is immense and comes from various sources. One of them is an inadequate setting of the camera's white balance relative to the microscope's light colour temperature. Although this type of error can be easily handled during the stage of image acquisition, it can also be eliminated with the use of colour adjustment algorithms. The examination of the dependence of colour variation on the microscope's light temperature and the settings of the camera was performed as introductory research for the process of automatic colour standardization. Methods Six fields of view with empty space among the tissue samples were selected for analysis. Each field of view was acquired 225 times with various microscope light temperature and camera white balance settings. Fourteen randomly chosen images were corrected and compared with the reference image by the following methods: Mean Square Error, Structural SIMilarity, and visual assessment by a viewer. Results For two types of backgrounds and two types of objects, the statistical image descriptors - range, median, mean and its standard deviation of chromaticity on the a and b channels from the CIELab colour space, the luminance L, and the local colour variability for objects' specific areas - were calculated. The results were averaged over 6 images acquired under the same light conditions and camera settings for each sample.
Conclusions The analysis of the results leads to the following conclusions: (1) the images collected with the white balance setting adjusted to the light colour temperature cluster in a certain area of chromatic space, (2) the process of white balance correction for images collected with white balance camera settings not matched to the light temperature
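
The Mean Square Error and Structural SIMilarity comparisons named in the Methods can be sketched as follows (the SSIM here is a single-window simplification of the windowed reference formulation):

```python
import numpy as np

def mse(img_a, img_b):
    """Mean squared error between two images of equal shape."""
    a = img_a.astype(float)
    b = img_b.astype(float)
    return float(np.mean((a - b) ** 2))

def global_ssim(img_a, img_b, data_range=255.0):
    """Single-window structural similarity index. The reference SSIM
    averages this statistic over local windows; computing it once over
    the whole image is a simplification for illustration."""
    a = img_a.astype(float)
    b = img_b.astype(float)
    c1 = (0.01 * data_range) ** 2  # standard stabilizing constants
    c2 = (0.03 * data_range) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
           ((mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2))
```

Identical images give an MSE of 0 and an SSIM of 1; colour-correction quality is judged by how close the corrected image comes to those values against the reference.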

  9. Fast acquisition of high resolution 4-D amide-amide NOESY with diagonal suppression, sparse sampling and FFT-CLEAN.

    PubMed

    Werner-Allen, Jon W; Coggins, Brian E; Zhou, Pei

    2010-05-01

    Amide-amide NOESY provides important distance constraints for calculating global folds of large proteins, especially integral membrane proteins with beta-barrel folds. Here, we describe a diagonal-suppressed 4-D NH-NH TROSY-NOESY-TROSY (ds-TNT) experiment for NMR studies of large proteins. The ds-TNT experiment employs a spin state selective transfer scheme that suppresses diagonal signals while providing TROSY optimization in all four dimensions. Active suppression of the strong diagonal peaks greatly reduces the dynamic range of observable signals, making this experiment particularly suitable for use with sparse sampling techniques. To demonstrate the utility of this method, we collected a high resolution 4-D ds-TNT spectrum of a 23kDa protein using randomized concentric shell sampling (RCSS), and we used FFT-CLEAN processing for further reduction of aliasing artifacts - the first application of these techniques to a NOESY experiment. A comparison of peak parameters in the high resolution 4-D dataset with those from a conventionally-sampled 3-D control spectrum shows an accurate reproduction of NOE crosspeaks in addition to a significant reduction in resonance overlap, which largely eliminates assignment ambiguity. Likewise, a comparison of 4-D peak intensities and volumes before and after application of the CLEAN procedure demonstrates that the reduction of aliasing artifacts by CLEAN does not systematically distort NMR signals.
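
The CLEAN procedure referenced above removes sparse-sampling aliasing artifacts by iterative point-spread-function subtraction; a one-dimensional Högbom-style sketch with illustrative parameters (the actual FFT-CLEAN operates on multi-dimensional spectra):

```python
import numpy as np

def clean_1d(dirty, psf, gain=0.1, n_iter=500, threshold=1e-3):
    """Hogbom-style CLEAN: repeatedly locate the strongest residual
    peak, subtract a gain-scaled copy of the point-spread function
    (the source of sampling aliasing artifacts) centred there, and
    accumulate the removed flux as clean components."""
    residual = dirty.astype(float).copy()
    components = np.zeros_like(residual)
    center = len(psf) // 2
    for _ in range(n_iter):
        k = int(np.argmax(np.abs(residual)))
        peak = residual[k]
        if abs(peak) < threshold:
            break  # residual is down to the noise/threshold floor
        components[k] += gain * peak
        # Subtract the PSF centred on the peak, clipped to the array.
        lo = max(0, k - center)
        hi = min(len(residual), k - center + len(psf))
        residual[lo:hi] -= gain * peak * psf[lo - (k - center): hi - (k - center)]
    return components, residual
```

The small loop gain (here 0.1) trades speed for stability: each pass removes only a fraction of the peak, so overlapping artifact patterns are unwound gradually rather than over-subtracted.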

  10. SAMPLING DEVICE FOR pH MEASUREMENT IN PROCESS STREAMS

    DOEpatents

    Michelson, C.E.; Carson, W.N. Jr.

    1958-11-01

    A pH cell is presented for monitoring the hydrogen ion concentration of a fluid in a process stream. The cell is made of glass with a side entry arm just above a reservoir in which the ends of a glass electrode and a reference electrode are situated. The glass electrode contains the usual internal solution which is connected to a lead. The reference electrode is formed of saturated calomel having a salt bridge in its bottom portion fabricated of a porous glass to insure low electrolyte flow. A flush tube leads into the cell through which buffer and flush solutions are introduced. A ground wire twists about both electrode ends to insure constant electrical grounding of the sample. The electrode leads are electrically connected to a pH meter of any standard type.

  11. Proceedings of the XIIIth IAGA Workshop on Geomagnetic Observatory Instruments, Data Acquisition, and Processing

    USGS Publications Warehouse

    Love, Jeffrey J.

    2009-01-01

    The thirteenth biennial International Association of Geomagnetism and Aeronomy (IAGA) Workshop on Geomagnetic Observatory Instruments, Data Acquisition and Processing was held in the United States for the first time on June 9-18, 2008. Hosted by the U.S. Geological Survey's (USGS) Geomagnetism Program, the workshop's measurement session was held at the Boulder Observatory and the scientific session was held on the campus of the Colorado School of Mines in Golden, Colorado. More than 100 participants came from 36 countries and 6 continents. Preparation for the workshop began when the USGS Geomagnetism Program agreed, at the close of the twelfth workshop in Belsk, Poland, in 2006, to host the next workshop. Working under the leadership of Alan Berarducci, who served as the chairman of the local organizing committee, and Tim White, who served as co-chairman, preparations began in 2007. The Boulder Observatory was extensively renovated and additional observation piers were installed. Meeting space on the Colorado School of Mines campus was arranged, and considerable planning was devoted to managing the many large and small issues that accompany an international meeting. Without the devoted efforts of both Alan and Tim, other Geomagnetism Program staff, and our partners at the Colorado School of Mines, the workshop simply would not have occurred. We express our thanks to Jill McCarthy, the USGS Central Region Geologic Hazards Team Chief Scientist; Carol A. Finn, the Group Leader of the USGS Geomagnetism Program; the USGS International Office; and Melody Francisco of the Office of Special Programs and Continuing Education of the Colorado School of Mines. We also thank the student employees who worked for the Geomagnetism Program over the years leading up to the workshop. For preparation of the proceedings, thanks go to Eddie and Tim. And, finally, we thank our sponsors, the USGS, IAGA, and the Colorado School of Mines.

  12. A real time dynamic data acquisition and processing system for velocity, density, and total temperature fluctuation measurements

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1991-01-01

    The real-time Dynamic Data Acquisition and Processing System (DDAPS), which provides the capability for simultaneous measurement of velocity, density, and total temperature fluctuations, is described. The system of hardware and software is described in the context of the wind tunnel environment. The DDAPS replaces both a recording mechanism and a separate data processing system. DDAPS receives input from hot-wire anemometers. Amplifiers and filters condition the signals with computer-controlled modules. The analog signals are simultaneously digitized and digitally recorded on disk. Automatic acquisition collects the necessary calibration and environment data. Hot-wire sensitivities are generated and applied to the hot-wire data to compute fluctuations. Raw and processed data are presented on demand. The interface to DDAPS is described along with its internal mechanisms. A summary of operations relevant to the use of the DDAPS is also provided.

  13. Microfabricated Devices for Sample Extraction, Concentrations, and Related Sample Processing Technologies

    SciTech Connect

    Chen, Gang; Lin, Yuehe

    2006-12-01

    This is an invited book chapter. As with other analytical techniques, sample pretreatment, sample extraction, sample introduction, and related techniques are of extreme importance for micro-electro-mechanical systems (MEMS). Bio-MEMS devices and systems start with a sampling step. The biological sample then usually undergoes some kind of sample preparation step before the actual analysis. These steps may involve extracting the target sample from its matrix, removing interferences from the sample, derivatizing the sample to detectable species, or performing a sample preconcentration step. The integration of components for sample pretreatment into microfluidic devices represents one of the remaining bottlenecks toward achieving true miniaturized total analysis systems (µTAS). This chapter provides a thorough state-of-the-art review of the developments in this field to date.

  14. Sampling Designs in Qualitative Research: Making the Sampling Process More Public

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Leech, Nancy L.

    2007-01-01

    The purpose of this paper is to provide a typology of sampling designs for qualitative researchers. We introduce the following sampling strategies: (a) parallel sampling designs, which represent a body of sampling strategies that facilitate credible comparisons of two or more different subgroups that are extracted from the same levels of study;…

  15. Multispectral integral imaging acquisition and processing using a monochrome camera and a liquid crystal tunable filter.

    PubMed

    Latorre-Carmona, Pedro; Sánchez-Ortiga, Emilio; Xiao, Xiao; Pla, Filiberto; Martínez-Corral, Manuel; Navarro, Héctor; Saavedra, Genaro; Javidi, Bahram

    2012-11-01

    This paper presents an acquisition system and a procedure to capture 3D scenes in different spectral bands. The acquisition system is formed by a monochrome camera and a Liquid Crystal Tunable Filter (LCTF) that allows images to be acquired at different spectral bands in the [480, 680] nm wavelength interval. The Synthetic Aperture Integral Imaging acquisition technique is used to obtain the elemental images for each wavelength. These elemental images are used to computationally obtain the reconstruction planes of the 3D scene at different depth planes. The 3D profile of the acquired scene is also obtained by minimizing the variance of the contribution of the elemental images at each image pixel. Experimental results show the viability of recovering the 3D multispectral information of the scene. Integration of 3D and multispectral information could have important benefits in different areas, including skin cancer detection, remote sensing and pattern recognition, among others.
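    The computational depth-plane reconstruction mentioned above can be sketched as follows. This is a minimal toy with hypothetical geometry and names (not the authors' implementation): each elemental image is shifted by the disparity its camera position implies for a candidate depth, then the stack is averaged, so features at the matching depth reinforce while features at other depths spread out.

```python
import numpy as np

def reconstruct_plane(elemental, positions, depth, pitch_scale):
    """Back-shift each elemental image for the candidate depth and average."""
    acc = np.zeros_like(next(iter(elemental.values())), dtype=float)
    for idx, img in elemental.items():
        shift = int(round(pitch_scale * positions[idx] / depth))
        acc += np.roll(img, -shift, axis=1)   # undo the parallax shift
    return acc / len(elemental)

# Toy scene: a single point at depth 2, seen by three cameras on a rail.
W = 32
positions = {0: 0.0, 1: 1.0, 2: 2.0}
elemental = {}
for idx, xcam in positions.items():
    img = np.zeros((1, W))
    img[0, 10 + int(round(4.0 * xcam / 2.0))] = 1.0  # disparity for depth 2
    elemental[idx] = img

in_focus = reconstruct_plane(elemental, positions, depth=2.0, pitch_scale=4.0)
out_focus = reconstruct_plane(elemental, positions, depth=4.0, pitch_scale=4.0)
```

    Comparing per-pixel variance across the shifted elemental images (low at the correct depth, high elsewhere) is the idea behind the depth-profile estimation the abstract describes.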

  16. The history of imitation in learning theory: the language acquisition process.

    PubMed

    Kymissis, E; Poulson, C L

    1990-09-01

    The concept of imitation has undergone different analyses in the hands of different learning theorists throughout the history of psychology. From Thorndike's connectionism to Pavlov's classical conditioning, Hull's monistic theory, Mowrer's two-factor theory, and Skinner's operant theory, there have been several divergent accounts of the conditions that produce imitation and the conditions under which imitation itself may facilitate language acquisition. In tracing the roots of the concept of imitation in the history of learning theory, the authors conclude that generalized imitation, as defined and analyzed by operant learning theorists, is a sufficiently robust formulation of learned imitation to facilitate a behavior-analytic account of first-language acquisition.

  17. Recent Results of the Investigation of a Microfluidic Sampling Chip and Sampling System for Hot Cell Aqueous Processing Streams

    SciTech Connect

    Julia Tripp; Jack Law; Tara Smith

    2013-10-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next-generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially, sampling technologies were evaluated, and microfluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed, and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse, thus increasing sampling reproducibility and accuracy. The microfluidic-based robotic sampling system’s mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of microfluidic sampling chips.

  18. Optionality in Second Language Acquisition: A Generative, Processing-Oriented Account

    ERIC Educational Resources Information Center

    Truscott, John

    2006-01-01

    The simultaneous presence in a learner's grammar of two features that should be mutually exclusive (optionality) typifies second language acquisition. But generative approaches have no good means of accommodating the phenomenon. The paper proposes one approach, based on Truscott and Sharwood Smith's (2004) MOGUL framework. In this framework,…

  19. A Tale of Two Career Paths: The Process of Status Acquisition by a New Organizational Unit.

    ERIC Educational Resources Information Center

    Briody, Elizabeth K.; And Others

    1995-01-01

    Interviews with 39 sales/service employees of General Motors' new Telemarketing Assistance Group identified factors influencing the acquisition of status in organizations: reorganization, managerial decision making, employee interpretations and reactions, and community consensus. The status of organizational units was related to career mobility.…

  20. The Representation and Processing of Familiar Faces in Dyslexia: Differences in Age of Acquisition Effects

    ERIC Educational Resources Information Center

    Smith-Spark, James H.; Moore, Viv

    2009-01-01

    Two under-explored areas of developmental dyslexia research, face naming and age of acquisition (AoA), were investigated. Eighteen dyslexic and 18 non-dyslexic university students named the faces of 50 well-known celebrities, matched for facial distinctiveness and familiarity. Twenty-five of the famous people were learned early in life, while the…

  1. Directed Blogging with Community College ESL Students: Its Effects on Awareness of Language Acquisition Processes

    ERIC Educational Resources Information Center

    Johnson, Cathy

    2012-01-01

    English as a Second Language (ESL) students often have problems progressing in their acquisition of the language and frequently do not know how to solve this dilemma. Many of them think of their second language studies as just another school subject that they must pass in order to move on to the next level, so few of them realize the metacognitive…

  2. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
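    The focal-plane processing work above cites the familiar Laplacian pyramid. A minimal 1-D sketch of that decomposition follows: each level stores the detail lost when the signal is low-passed and decimated, so the original is rebuilt exactly from the band-pass levels plus the coarsest residual. (Simple pair-averaging stands in here for the usual Gaussian filter.)

```python
import numpy as np

def down(x):
    return 0.5 * (x[::2] + x[1::2])       # low-pass and decimate by 2

def up(x, n):
    return np.repeat(x, 2)[:n]            # nearest-neighbour expand

def laplacian_pyramid(signal, levels):
    pyr, cur = [], signal.astype(float)
    for _ in range(levels):
        nxt = down(cur)
        pyr.append(cur - up(nxt, len(cur)))  # detail (band-pass) level
        cur = nxt
    pyr.append(cur)                          # coarsest residual
    return pyr

def reconstruct(pyr):
    cur = pyr[-1]
    for lap in reversed(pyr[:-1]):
        cur = lap + up(cur, len(lap))
    return cur

x = np.sin(np.linspace(0.0, 4.0 * np.pi, 64))
pyr = laplacian_pyramid(x, levels=3)
x_rec = reconstruct(pyr)
```

    Reconstruction is exact by construction, which is why the pyramid is attractive for combining image gathering with coding: only the detail levels need to be quantized or transmitted.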

  3. The application of a new sampling theorem for non-bandlimited signals on the sphere: Improving the recovery of crossing fibers for low b-value acquisitions.

    PubMed

    Deslauriers-Gauthier, Samuel; Marziliano, Pina; Paquette, Michael; Descoteaux, Maxime

    2016-05-01

    Recent developments in sampling theory now allow the sampling and reconstruction of certain non-bandlimited functions on the sphere, namely a sum of weighted Diracs. Because the signal acquired in diffusion Magnetic Resonance Imaging (dMRI) can be modeled as the convolution between a sampling kernel and two-dimensional Diracs defined on the sphere, these advances have great potential in dMRI. In this work, we introduce a local reconstruction method for dMRI based on a new sampling theorem for non-bandlimited signals on the sphere. This new algorithm, named Spherical Finite Rate of Innovation (SFRI), is able to recover fibers crossing at very narrow angles with little dependence on the b-value. Because of its parametric formulation, SFRI can distinguish crossing fibers even when using a DTI-like acquisition (≈32 directions). This opens new perspectives for diffusion acquisitions with low b-values and few gradient directions, and for tractography studies. We evaluate the angular resolution of SFRI using state-of-the-art synthetic data and compare its performance using in vivo data. Our results show that, at low b-values, SFRI recovers crossing fibers not identified by constrained spherical deconvolution. We also show that low b-value results obtained using SFRI are similar to those obtained with constrained spherical deconvolution at a higher b-value.

  4. Rapid parameter optimization of low signal-to-noise samples in NMR spectroscopy using rapid CPMG pulsing during acquisition: application to recycle delays.

    PubMed

    Farooq, Hashim; Courtier-Murias, Denis; Soong, Ronald; Masoom, Hussain; Maas, Werner; Fey, Michael; Kumar, Rajeev; Monette, Martine; Stronks, Henry; Simpson, Myrna J; Simpson, André J

    2013-03-01

    A method is presented that combines Carr-Purcell-Meiboom-Gill (CPMG) during acquisition with either selective or nonselective excitation to produce a considerable intensity enhancement and a simultaneous loss in chemical shift information. A range of parameters can theoretically be optimized very rapidly on the basis of the signal from the entire sample (hard excitation) or spectral subregion (soft excitation) and should prove useful for biological, environmental, and polymer samples that often exhibit highly dispersed and broad spectral profiles. To demonstrate the concept, we focus on the application of our method to T(1) determination, specifically for the slowest relaxing components in a sample, which ultimately determines the optimal recycle delay in quantitative NMR. The traditional inversion recovery (IR) pulse program is combined with a CPMG sequence during acquisition. The slowest relaxing components are selected with a shaped pulse, and then, low-power CPMG echoes are applied during acquisition with intervals shorter than chemical shift evolution (RCPMG) thus producing a single peak with an SNR commensurate with the sum of the signal integrals in the selected region. A traditional (13)C IR experiment is compared with the selective (13)C IR-RCPMG sequence and yields the same T(1) values for samples of lysozyme and riverine dissolved organic matter within error. For lysozyme, the RCPMG approach is ~70 times faster, and in the case of dissolved organic matter is over 600 times faster. This approach can be adapted for the optimization of a host of parameters where chemical shift information is not necessary, such as cross-polarization/mixing times and pulse lengths.
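    The T(1) determination discussed above rests on the standard inversion-recovery relaxation model. The sketch below is an illustration of that model only (not the RCPMG pulse sequence): the recovering signal follows M(t) = M0 * (1 - 2 * exp(-t / T1)), and a simple grid search over candidate T1 values fits simulated recovery-delay data.

```python
import numpy as np

def ir_model(t, m0, t1):
    """Inversion-recovery signal M(t) = M0 * (1 - 2*exp(-t/T1))."""
    return m0 * (1.0 - 2.0 * np.exp(-t / t1))

t = np.linspace(0.05, 10.0, 40)          # recovery delays, s
data = ir_model(t, m0=1.0, t1=2.0)       # noiseless synthetic measurements

candidates = np.linspace(0.1, 5.0, 491)  # T1 grid with 0.01 s spacing
sse = [float(np.sum((ir_model(t, 1.0, c) - data) ** 2)) for c in candidates]
t1_est = float(candidates[int(np.argmin(sse))])
```

    The slowest-relaxing T1 found this way sets the quantitative recycle delay (commonly taken as about 5*T1 for full recovery).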

  5. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

    Introduction Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely the operator’s (acquisition) impact on the results obtained from image analysis and processing, has been shown on a few examples. Material and method The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200,000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. Results For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient’s back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects - error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18%.

  6. DIADEM--a system for the interactive data acquisition and processing in an analytical laboratory.

    PubMed

    Peters, F; Teschner, W

    1979-09-01

    A conversational program for the acquisition of experimental data in a multi-user, multi-instrument computer system is described. It assists the researcher when recording on-time data. Due to the simple structure of the dialogue, no special knowledge of computer handling is required by the experimenter. Whereas the experimental methods are versatile, a uniform concept of the dialogue and the file structure is realized. PMID:487779

  7. The health hazard assessment process in support of joint weapon system acquisitions.

    PubMed

    Kluchinsky, Timothy A; Jokel, Charles R; Cambre, John V; Goddard, Donald E; Batts, Robert W

    2013-01-01

    Since 1981, the Army's HHA Program has provided an invaluable service to combat developers and materiel program managers by providing recommendations designed to eliminate or control health hazards associated with materiel and weapon systems. The program has consistently strived to improve its services by providing more meaningful and efficient assistance to the acquisition community. In the uncertain fiscal times ahead, the Army's HHA Program will continue to provide valuable and cost-effective solutions to mitigate the health risks of weapons systems.

  8. Hippocampal Context Processing during Acquisition of a Predictive Learning Task Is Associated with Renewal in Extinction Recall.

    PubMed

    Lissek, Silke; Glaubitz, Benjamin; Schmidt-Wilcke, Tobias; Tegenthoff, Martin

    2016-05-01

    Renewal is defined as the recovery of an extinguished response if extinction and retrieval contexts differ. The context dependency of extinction, as demonstrated by renewal, has important implications for extinction-based therapies. Persons showing renewal (REN) exhibit higher hippocampal activation during extinction in associative learning than those without renewal (NOREN), demonstrating hippocampal context processing, and recruit ventromedial pFC in retrieval. Apart from these findings, brain processes generating renewal remain largely unknown. Conceivably, processing differences in task-relevant brain regions that ultimately lead to renewal may occur already in initial acquisition of associations. Therefore, in two fMRI studies, we investigated overall brain activation and hippocampal activation in REN and NOREN during acquisition of an associative learning task in response to presentation of a context alone or combined with a cue. Results of two studies demonstrated significant activation differences between the groups: In Study 1, a support vector machine classifier correctly assigned participants' brain activation patterns to REN and NOREN groups, respectively. In Study 2, REN and NOREN showed similar hippocampal involvement during context-only presentation, suggesting processing of novelty, whereas overall hippocampal activation to the context-cue compound, suggesting compound encoding, was higher in REN. Positive correlations between hippocampal activation and renewal level indicated more prominent hippocampal processing in REN. Results suggest that hippocampal processing of the context-cue compound rather than of context only during initial learning is related to a subsequent renewal effect. Presumably, REN participants use distinct encoding strategies during acquisition of context-related tasks, which reflect in their brain activation patterns and contribute to a renewal effect. PMID:26807840

  9. Optimization of enrichment processes of pentachlorophenol (PCP) from water samples.

    PubMed

    Li, Ping; Liu, Jun-xin

    2004-01-01

    The method of enriching PCP (pentachlorophenol) from the aquatic environment by solid phase extraction (SPE) was studied. Several factors affecting the recoveries of PCP, including sample pH, eluting solvent, eluting volume and flow rate of water sample, were optimized by orthogonal array design (OAD). The optimized results were sample pH 4; eluting solvent, 100% methanol; eluting solvent volume, 2 ml; and flow rate of water sample, 4 ml/min. A comparison is made between the SPE and liquid-liquid extraction (LLE) methods. The recoveries of PCP were in the range of 87.6%-133.6% and 79%-120.3% for SPE and LLE, respectively. Important advantages of SPE compared with LLE include the short extraction time and reduced consumption of organic solvents. SPE can replace LLE for isolating and concentrating PCP from water samples.
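    The orthogonal array design used above can be illustrated with a toy example. The sketch below uses the standard L9(3^4) array to screen four factors (pH, eluting solvent, eluting volume, flow rate) at three levels each in only nine runs; per-level mean responses then identify the best level of every factor. The effect sizes and baseline recovery are invented for illustration, not taken from the paper.

```python
import numpy as np

# Standard L9(3^4) orthogonal array: each column is balanced across levels.
L9 = [
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
]

effects = [
    [0.0, 5.0, 2.0],   # sample pH          (best: level 1) - hypothetical
    [3.0, 0.0, 1.0],   # eluting solvent    (best: level 0) - hypothetical
    [0.0, 4.0, 1.0],   # eluting volume     (best: level 1) - hypothetical
    [2.0, 0.0, 0.0],   # sample flow rate   (best: level 0) - hypothetical
]

def recovery(run):
    # additive response model: baseline recovery plus per-factor effects
    return 80.0 + sum(effects[f][lvl] for f, lvl in enumerate(run))

results = [recovery(run) for run in L9]

# Main-effects analysis: mean response at each level of each factor.
best = []
for f in range(4):
    level_means = [
        np.mean([r for run, r in zip(L9, results) if run[f] == lvl])
        for lvl in range(3)
    ]
    best.append(int(np.argmax(level_means)))
```

    Because the array is balanced, each level mean averages over the same mix of the other factors' levels, so the per-factor rankings come out clean from just nine experiments instead of the 81 a full factorial would need.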

  10. Onboard infrared signal processing system for asteroid sample return mission HAYABUSA2

    NASA Astrophysics Data System (ADS)

    Otake, Hisashi; Okada, Tatsuaki; Funase, Ryu; Hihara, Hiroki; Sano, Junpei; Iwase, Kaori; Kawakami, Satoko; Takada, Jun; Masuda, Tetsuya

    2014-09-01

    An onboard signal processing system for infrared sensors has been developed for HAYABUSA2, planned for launch in 2014, for the exploration of the C-class near-Earth asteroid 162173 (1999JU3). An optical navigation camera with telephoto lens (ONC-T), a thermal-infrared imager (TIR), and a near infrared spectrometer (NIRS3) have been developed for the observation of geology, thermo-physical properties, and organic or hydrated materials on the asteroid. ONC-T and TIR are used for those scientific purposes as well as assessment of landing site selection and safe descent operation onto the asteroid surface for sample acquisition. NIRS3 is used to characterize the mineralogy of the asteroid surface by observing the 3-micron band, where the particular diagnostic absorption features due to hydrated minerals appear. Since the processing cycles of these sensors are independent, data processing, formatting, and recording proceed in parallel. In order to provide these functions within the resource limitations of a deep space mission, an automatic packet routing function is realized in a one-chip router based on the SpaceWire standard. Thanks to the SpaceWire upper-layer protocol (remote memory access protocol: RMAP), the variable-length file system operation function can be delegated to the data recorder from the CPU module of the digital electronics of the sensor system. As a consequence, the infrared spectrometer data from NIRS3 are recorded in parallel with the infrared image sensor data. A high-speed image compression algorithm has also been developed for both lossless and lossy image compression, in order to eliminate additional hardware resources while maintaining JPEG2000-equivalent image quality.

  11. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 4 2014-04-01 2014-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... appropriate samples of in-process materials of each batch. Such control procedures shall be established...

  12. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 4 2013-04-01 2013-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... appropriate samples of in-process materials of each batch. Such control procedures shall be established...

  13. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 4 2012-04-01 2012-04-01 false Sampling and testing of in-process materials and... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... appropriate samples of in-process materials of each batch. Such control procedures shall be established...

  14. Mechanical Abrasion as a Low Cost Technique for Contamination-Free Sample Acquisition from a Category IVA Clean Platform

    NASA Technical Reports Server (NTRS)

    Dolgin, B.; Yarbrough, C.; Carson, J.; Troy, R.

    2000-01-01

    The proposed Mars Sample Transfer Chain Architecture provides Planetary Protection Officers with the clean samples that are required for the eventual release from confinement of the returned Martian samples. At the same time, no absolute cleanliness and sterility requirement is placed on any part of the Lander (including the deep drill), Mars Ascent Vehicle (MAV), any part of the Orbiting Sample container (OS), Rover mobility platform, any part of the Minicorer, Robotic arm (including instrument sensors), or most of the caching equipment on the Rover. The removal of the strict requirements in excess of Category IVa cleanliness (Pathfinder clean) is expected to lead to significant cost savings. The proposed architecture assumes that cross-contamination renders all surfaces in the vicinity of the rover(s) and the lander(s) contaminated. Thus, no accessible surface of Martian rocks and soil is free of Earth contamination. As a result, only subsurface samples (either rock or soil) can be and will be collected for eventual return to Earth. Uncontaminated samples can be collected from a Category IVa clean platform. Both subsurface soil and rock samples can be maintained clean if they are collected by devices that are self-contained and clean and sterile inside only. The top layer of the sample is removed in a manner that does not contaminate the collection tools. A biobarrier (e.g., aluminum foil) covering the moving parts of these devices may be used as the only self-removing bio-blanket that is required. The samples never leave the collection tools. The lids are placed on these tools inside the collection device. These single-use tools, with the lid and the sample inside, are brought to Earth in the OS. The lids have to be designed to be impenetrable to Earth organisms. The latter is a well-established art.

  15. Fast nearly ML estimation of Doppler frequency in GNSS signal acquisition process.

    PubMed

    Tang, Xinhua; Falletti, Emanuela; Lo Presti, Letizia

    2013-04-29

    It is known that signal acquisition in Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on the simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario where there is no noise and other nuisances, the frequency estimation error can be theoretically reduced to zero. On the other hand, in the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB) which is derived as benchmark. For comparison, a least-square (LS) method is proposed. It is shown that the proposed solution achieves the same performance of LS, but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both time and frequency domains.
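    The refinement idea above can be illustrated generically. The sketch below is not the paper's closed-form algorithm; it shows a common stand-in, parabolic interpolation of the CAF magnitude across the coarse-acquisition peak bin and its two neighbours, using the |sinc|-shaped frequency profile that coherent integration produces. The Doppler value, integration time, and grid spacing are invented for illustration.

```python
import numpy as np

def caf_freq_profile(f_trial, f_true, t_coh):
    # magnitude of the CAF along frequency: |sinc((f - f_true) * T_coh)|
    return np.abs(np.sinc((f_trial - f_true) * t_coh))

f_true = 1712.0                          # hypothetical true Doppler, Hz
t_coh = 1e-3                             # 1 ms coherent integration
step = 250.0                             # coarse frequency bin width, Hz
bins = np.arange(-5000.0, 5000.0, step)
mags = caf_freq_profile(bins, f_true, t_coh)

k = int(np.argmax(mags))                 # coarse acquisition result
ym, y0, yp = mags[k - 1], mags[k], mags[k + 1]
delta = 0.5 * (ym - yp) / (ym - 2.0 * y0 + yp)  # parabola vertex, in bins
f_refined = bins[k] + delta * step
```

    The coarse estimate can be off by up to half a bin (125 Hz here); the three-point refinement pulls the error down to a few hertz at essentially no extra search cost, which is the same trade the paper's CAF-based algorithms exploit.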

  16. Fast Nearly ML Estimation of Doppler Frequency in GNSS Signal Acquisition Process

    PubMed Central

    Tang, Xinhua; Falletti, Emanuela; Presti, Letizia Lo

    2013-01-01

    It is known that signal acquisition in Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on the simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario where there is no noise and other nuisances, the frequency estimation error can be theoretically reduced to zero. On the other hand, in the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB) which is derived as benchmark. For comparison, a least-square (LS) method is proposed. It is shown that the proposed solution achieves the same performance of LS, but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both time and frequency domains. PMID:23628761

  17. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    NASA Astrophysics Data System (ADS)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

    The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  18. Isolation process of industrially useful Clostridium bifermentans from natural samples.

    PubMed

    Myszka, Kamila; Leja, Katarzyna; Olejnik-Schmidt, Agnieszka K; Czaczyk, Katarzyna

    2012-05-01

    A selective isolation procedure of clostridial strains from natural samples able to convert glycerol to 1,3-propanediol (1,3-PD) and organic acids was investigated. The modified PY medium of high concentration of NaHCO(3) was shown to be highly selective for Clostridium bifermentans. Obtained isolates produced mainly 1,3-PD, lactic, acetic, and formic acids from glycerol.

  19. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

    A pulsed 2-micron coherent Doppler lidar system from NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the project Doppler Aerosol Wind Lidar (DAWN) Air. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software that was employed during the GRIP campaign is introduced with its unique features.

  20. Data acquisition techniques for exploiting the uniqueness of the time-of-flight mass spectrometer: Application to sampling pulsed gas systems

    NASA Technical Reports Server (NTRS)

    Lincoln, K. A.

    1980-01-01

    Mass spectra are produced in most mass spectrometers by sweeping some parameter within the instrument as the sampled gases flow into the ion source. Any fluctuation in the gas during the sweep (mass scan) of the instrument therefore causes the output spectrum to be skewed in its mass peak intensities. The time-of-flight mass spectrometer (TOFMS), with its fast, repetitive mode of operation, produces spectra without sweeping or varying instrument parameters, and because all ion species are ejected from the ion source simultaneously, the spectra are inherently not skewed despite rapidly changing gas pressure or composition in the source. Methods of exploiting this feature by utilizing fast digital data acquisition systems, such as commercially available transient recorders and signal averagers, are described. Applications of this technique are presented, including TOFMS sampling of vapors produced by both pulsed and continuous laser heating of materials.
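    The skewing effect described above can be shown numerically. The sketch below uses a hypothetical two-species gas pulse: a swept-mass instrument samples each mass at a different instant during the transient, so its reported abundance ratio is biased, whereas averaging complete snapshot (TOF-style) spectra preserves the true time-averaged ratio.

```python
import numpy as np

# Two species whose partial pressures change during a transient gas pulse.
t = np.linspace(0.0, 1.0, 1000)
species_a = np.exp(-3.0 * t)        # decaying component
species_b = 1.0 - np.exp(-3.0 * t)  # rising component

# Scanning instrument: mass A is read early in the sweep, mass B late,
# so the reported ratio is skewed by the changing composition.
scan_ratio = species_a[100] / species_b[900]

# TOF-style instrument: every shot records all masses simultaneously;
# averaging many complete spectra preserves the time-averaged ratio.
tof_ratio = species_a.mean() / species_b.mean()

# Analytic time-averaged ratio over the pulse, for comparison.
ia = (1.0 - np.exp(-3.0)) / 3.0     # integral of species_a over [0, 1]
true_ratio = ia / (1.0 - ia)
print(abs(tof_ratio - true_ratio) < abs(scan_ratio - true_ratio))
```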

  1. THE BLANCO COSMOLOGY SURVEY: DATA ACQUISITION, PROCESSING, CALIBRATION, QUALITY DIAGNOSTICS, AND DATA RELEASE

    SciTech Connect

    Desai, S.; Mohr, J. J.; Semler, D. R.; Liu, J.; Bazin, G.; Zenteno, A.; Armstrong, R.; Bertin, E.; Allam, S. S.; Buckley-Geer, E. J.; Lin, H.; Tucker, D.; Barkhouse, W. A.; Cooper, M. C.; Hansen, S. M.; High, F. W.; Lin, Y.-T.; Ngeow, C.-C.; Rest, A.; Song, J.

    2012-09-20

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, -55°) and (23 hr, -55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out point-spread function-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 mas. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from the Two Micron All Sky Survey, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematic floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier spread_model produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1 + z) = 0.054 with an outlier fraction η < 5% to z ~ 1. We highlight some selected science results to date and provide a full description of the released data products.
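    The quoted photo-z performance figures (scatter δz/(1 + z) = 0.054 with outlier fraction η < 5%) can be reproduced on synthetic data. The sketch below assumes a Gaussian error core plus a small catastrophic-outlier population and a |δz| > 0.15 outlier cut; these are illustrative choices, not the paper's exact definitions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z_spec = rng.uniform(0.1, 1.0, n)

# Simulated photo-z's: Gaussian core of width 0.054*(1+z) plus ~3% of
# catastrophic outliers shifted well away from the truth.
z_phot = z_spec + 0.054 * (1 + z_spec) * rng.standard_normal(n)
catastrophic = rng.random(n) < 0.03
z_phot[catastrophic] += 0.5

dz = (z_phot - z_spec) / (1 + z_spec)
is_outlier = np.abs(dz) > 0.15        # assumed outlier cut (hypothetical)
scatter = dz[~is_outlier].std()       # core scatter, outliers clipped
eta = is_outlier.mean()               # outlier fraction
print(round(scatter, 3), round(eta, 3))
```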

  2. The Blanco Cosmology Survey: Data Acquisition, Processing, Calibration, Quality Diagnostics and Data Release

    SciTech Connect

    Desai, S.; Armstrong, R.; Mohr, J.J.; Semler, D.R.; Liu, J.; Bertin, E.; Allam, S.S.; Barkhouse, W.A.; Bazin, G.; Buckley-Geer, E.J.; Cooper, M.C.; /UC, Irvine /Lick Observ. /UC, Santa Cruz

    2012-04-01

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, -55°) and (23 hr, -55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out PSF-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 milliarcsec. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from 2MASS, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematics floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1 + z) = 0.054 with an outlier fraction η < 5% to z ~ 1. We highlight some selected science results to date and provide a full description of the released data products.

  3. Impact of low intensity summer rainfall on E. coli-discharge event dynamics with reference to sample acquisition and storage.

    PubMed

    Oliver, David M; Porter, Kenneth D H; Heathwaite, A Louise; Zhang, Ting; Quilliam, Richard S

    2015-07-01

    Understanding the role of different rainfall scenarios on faecal indicator organism (FIO) dynamics under variable field conditions is important to strengthen the evidence base on which regulators and land managers can base informed decisions regarding diffuse microbial pollution risks. We sought to investigate the impact of low intensity summer rainfall on Escherichia coli-discharge (Q) patterns at the headwater catchment scale in order to provide new empirical data on FIO concentrations observed during baseflow conditions. In addition, we evaluated the potential impact of using automatic samplers to collect and store freshwater samples for subsequent microbial analysis during summer storm sampling campaigns. The temporal variation of E. coli concentrations with Q was captured during six events throughout a relatively dry summer in central Scotland. The relationship between E. coli concentration and Q was complex with no discernible patterns of cell emergence with Q that were repeated across all events. On several occasions, an order of magnitude increase in E. coli concentrations occurred even with slight increases in Q, but responses were not consistent and highlighted the challenges of attempting to characterise temporal responses of E. coli concentrations relative to Q during low intensity rainfall. Cross-comparison of E. coli concentrations determined in water samples using simultaneous manual grab and automated sample collection was undertaken with no difference in concentrations observed between methods. However, the duration of sample storage within the autosampler unit was found to be more problematic in terms of impacting on the representativeness of microbial water quality, with unrefrigerated autosamplers exhibiting significantly different concentrations of E. coli relative to initial samples after 12-h storage. The findings from this study provide important empirical contributions to the growing evidence base in the field of catchment microbial

  4. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to find the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. The statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  5. Communication Barriers in Quality Process: Sakarya University Sample

    ERIC Educational Resources Information Center

    Yalcin, Mehmet Ali

    2012-01-01

    Communication has an important role in life and especially in education. Nowadays, many people use technology for communication. When technology is used in education and other activities, there may be some communication barriers. The quality process also has an important role in higher education institutes. If a higher education…

  6. Sample interval modulation for the simultaneous acquisition of displacement vector data in magnetic resonance elastography: theory and application.

    PubMed

    Klatt, Dieter; Yasar, Temel K; Royston, Thomas J; Magin, Richard L

    2013-12-21

    SampLe Interval Modulation-magnetic resonance elastography (SLIM-MRE) is introduced for simultaneously encoding all three displacement projections of a monofrequency vibration into the MR signal phase. In SLIM-MRE, the individual displacement components are observed using different sample intervals. In doing so, the components are modulated with different apparent frequencies in the MR signal phase expressed as a harmonic function of the start time of the motion encoding gradients and can thus be decomposed by applying a Fourier transform to the sampled multidirectional MR phases. In this work, the theoretical foundations of SLIM-MRE are presented and the new idea is implemented using a high field (11.7 T) vertical bore magnetic resonance imaging system on an inhomogeneous agarose gel phantom sample. The local frequency estimation-derived stiffness values were the same within the error margins for both the new SLIM-MRE method and for conventional MRE, while the number of temporally-resolved MRE experiments needed for each study was reduced from three to one. In this work, we present for the first time, monofrequency displacement data along three sensitization directions that were acquired simultaneously and stored in the same k-space.
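    The multiplexing principle behind SLIM-MRE, encoding each displacement component at a distinct apparent frequency in the phase signal and separating them with a Fourier transform, can be sketched as a toy one-dimensional model. The amplitudes, frequency indices, and sample count below are arbitrary illustrations, not an MRE pulse sequence.

```python
import numpy as np

# Three displacement amplitudes (the unknowns), each encoded at its own
# apparent frequency in the MR phase as a function of MEG start time.
u = np.array([1.0, 0.5, 0.25])   # x, y, z displacement amplitudes
k = np.array([1, 2, 3])          # apparent frequency index per component
n = 8                            # phase samples over one modulation period
s = np.arange(n) / n             # normalized MEG start times

# Measured phase: a superposition of the three modulated components.
phase = sum(u[i] * np.cos(2 * np.pi * k[i] * s) for i in range(3))

# A single Fourier transform separates the components into distinct bins.
spec = np.fft.rfft(phase) / n * 2
u_rec = np.abs(spec[k])          # recovers [1.0, 0.5, 0.25]
print(u_rec)
```

The key point, as in the abstract, is that one temporally resolved acquisition carries all three directions, which the Fourier decomposition then disentangles.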

  7. The World of Hidden Biases: From Collection to Sample Processing

    NASA Astrophysics Data System (ADS)

    Maurette, Michel

    Any study of micrometeorites involves a variety of biases, which start right away during collection and which have not been sufficiently publicized. This section deals with the astonishing folklore of these biases. We shall question whether major differences observed between Antarctic micrometeorites and stratospheric micrometeorites could reflect complementary biases between the two collections of micrometeorites. Astonishingly, some of these biases would converge to enrich the SMM collection in the most fine-grained fluffy dust particles accreted by the Earth. These might possibly be the most primitive material accreted by the Earth, but they would not give a representative sampling of the bulk micrometeorite flux, which is best obtained with the new Concordia micrometeorites collected in central Antarctica. For a change, biases developing around a small metallic plate flying at ~200 m/s in the stratosphere turned out to be quite helpful!

  8. 77 FR 2682 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-19

    ... Wide Area WorkFlow to process vouchers. DATES: Comments on the proposed rule should be submitted in... updates DoD's internal voucher processing procedures and better accommodates the use of Wide Area...

  9. The approach to sample acquisition and its impact on the derived human fecal microbiome and VOC metabolome.

    PubMed

    Couch, Robin D; Navarro, Karl; Sikaroodi, Masoumeh; Gillevet, Pat; Forsyth, Christopher B; Mutlu, Ece; Engen, Phillip A; Keshavarzian, Ali

    2013-01-01

    Recent studies have illustrated the importance of the microbiota in maintaining a healthy state, as well as promoting disease states. The intestinal microbiota exerts its effects primarily through its metabolites, and metabolomics investigations have begun to evaluate the diagnostic and health implications of volatile organic compounds (VOCs) isolated from human feces, enabled by specialized sampling methods such as headspace solid-phase microextraction (hSPME). The approach to stool sample collection is an important consideration that could potentially introduce bias and affect the outcome of a fecal metagenomic and metabolomic investigation. To address this concern, a comparison of endoscopically collected (in vivo) and home collected (ex vivo) fecal samples was performed, revealing slight variability in the derived microbiomes. In contrast, the VOC metabolomes differ widely between the home collected and endoscopy collected samples. Additionally, as the VOC extraction profile is hyperbolic, with short extraction durations more vulnerable to variation than extractions continued to equilibrium, a second goal of our investigation was to ascertain if hSPME-based fecal metabolomics studies might be biased by the extraction duration employed. As anticipated, prolonged extraction (18 hours) results in the identification of considerably more metabolites than short (20 minute) extractions. A comparison of the metabolomes reveals several analytes deemed unique to a cohort with the 20 minute extraction, but found common to both cohorts when the VOC extraction was performed for 18 hours. Moreover, numerous analytes perceived to have significant fold change with a 20 minute extraction were found insignificant in fold change with the prolonged extraction, underscoring the potential for bias associated with a 20 minute hSPME. PMID:24260553
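    The claim that short extractions are more vulnerable to timing variation than extractions run near equilibrium follows directly from a hyperbolic uptake curve. The sketch below uses hypothetical equilibrium and half-saturation constants to compare the relative effect of one minute of timing jitter at 20 minutes versus 18 hours.

```python
# Hyperbolic hSPME uptake: extracted amount approaches equilibrium c_max.
# c_max and t_half are hypothetical illustration values, not measured data.
def uptake(t_min, c_max=100.0, t_half=60.0):
    return c_max * t_min / (t_min + t_half)

dt = 1.0  # one minute of timing jitter

# Relative change in extracted amount caused by the jitter...
short = (uptake(20.0 + dt) - uptake(20.0)) / uptake(20.0)      # ...at 20 min
long_ = (uptake(18 * 60.0 + dt) - uptake(18 * 60.0)) / uptake(18 * 60.0)  # ...at 18 h

# On the steep early part of the curve the same jitter matters far more.
print(short > 10 * long_)
```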

  10. The Approach to Sample Acquisition and Its Impact on the Derived Human Fecal Microbiome and VOC Metabolome

    PubMed Central

    Couch, Robin D.; Navarro, Karl; Sikaroodi, Masoumeh; Gillevet, Pat; Forsyth, Christopher B.; Mutlu, Ece; Engen, Phillip A.; Keshavarzian, Ali

    2013-01-01

    Recent studies have illustrated the importance of the microbiota in maintaining a healthy state, as well as promoting disease states. The intestinal microbiota exerts its effects primarily through its metabolites, and metabolomics investigations have begun to evaluate the diagnostic and health implications of volatile organic compounds (VOCs) isolated from human feces, enabled by specialized sampling methods such as headspace solid-phase microextraction (hSPME). The approach to stool sample collection is an important consideration that could potentially introduce bias and affect the outcome of a fecal metagenomic and metabolomic investigation. To address this concern, a comparison of endoscopically collected (in vivo) and home collected (ex vivo) fecal samples was performed, revealing slight variability in the derived microbiomes. In contrast, the VOC metabolomes differ widely between the home collected and endoscopy collected samples. Additionally, as the VOC extraction profile is hyperbolic, with short extraction durations more vulnerable to variation than extractions continued to equilibrium, a second goal of our investigation was to ascertain if hSPME-based fecal metabolomics studies might be biased by the extraction duration employed. As anticipated, prolonged extraction (18 hours) results in the identification of considerably more metabolites than short (20 minute) extractions. A comparison of the metabolomes reveals several analytes deemed unique to a cohort with the 20 minute extraction, but found common to both cohorts when the VOC extraction was performed for 18 hours. Moreover, numerous analytes perceived to have significant fold change with a 20 minute extraction were found insignificant in fold change with the prolonged extraction, underscoring the potential for bias associated with a 20 minute hSPME. PMID:24260553

  11. The Role of Unconscious Information Processing in the Acquisition and Learning of Instructional Messages

    ERIC Educational Resources Information Center

    Kuldas, Seffetullah; Bakar, Zainudin Abu; Ismail, Hairul Nizam

    2012-01-01

    This review investigates how the unconscious information processing can create satisfactory learning outcomes, and can be used to ameliorate the challenges of teaching students to regulate their learning processes. The search for the ideal model of human information processing as regards achievement of teaching and learning objectives is a…

  12. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... production process, e.g., at commencement or completion of significant phases or after storage for long... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Sampling and testing of in-process materials...

  13. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques.

    PubMed

    Pannek, Kerstin; Guzzetta, Andrea; Colditz, Paul B; Rose, Stephen E

    2012-10-01

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. PMID:22903761

  14. Lexical Processing and Organization in Bilingual First Language Acquisition: Guiding Future Research

    PubMed Central

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-01-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between two languages in the early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. PMID:26866430

  15. Learning a generative probabilistic grammar of experience: a process-level model of language acquisition.

    PubMed

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-03-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach.

  16. Lexical processing and organization in bilingual first language acquisition: Guiding future research.

    PubMed

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-06-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between 2 languages in early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory.

  17. Acquisition process of typing skill using hierarchical materials in the Japanese language.

    PubMed

    Ashitaka, Yuki; Shimada, Hiroyuki

    2014-08-01

    In the present study, using a new keyboard layout with only eight keys, we conducted typing training for unskilled typists. In this task, Japanese college students received training in typing words consisting of a pair of hiragana characters with four keystrokes, using the alphabetic input method, while keeping the association between the keys and typists' finger movements; the task was constructed so that chunking was readily available. We manipulated the association between the hiragana characters and alphabet letters (hierarchical materials: overlapped and nonoverlapped mappings). Our alphabet letter materials corresponded to the regular order within each hiragana word (within the four letters, the first and third referred to consonants, and the second and fourth referred to vowels). Only the interkeystroke intervals involved in the initiation of typing vowel letters showed an overlapping effect, which revealed that the effect was markedly large only during the early period of skill development (the effect for the overlapped mapping being larger than that for the nonoverlapped mapping), but that it had diminished by the time of late training. Conversely, the response time and the third interkeystroke interval, which are both involved in the latency of typing a consonant letter, did not reveal an overlapped effect, suggesting that chunking might be useful with hiragana characters rather than hiragana words. These results are discussed in terms of the fan effect and skill acquisition. Furthermore, we discuss whether there is a need for further research on unskilled and skilled Japanese typists.

  18. Individual and social learning processes involved in the acquisition and generalization of tool use in macaques

    PubMed Central

    Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.

    2012-01-01

    Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424

  19. Analysis of protein biomarkers in human clinical tumor samples: critical aspects to success from tissue acquisition to analysis.

    PubMed

    Warren, Madhuri V; Chan, W Y Iris; Ridley, John M

    2011-04-01

    There has been increased interest in the analysis of protein biomarkers in clinical tumor tissues in recent years. Tissue-based biomarker assays can add value and aid decision-making at all stages of drug development, as well as being developed for use as predictive biomarkers and for patient stratification and prognostication in the clinic. However, there must be an awareness of the legal and ethical issues related to the sourcing of human tissue samples. This article also discusses the limits of scope and critical aspects on the successful use of the following tissue-based methods: immunohistochemistry, tissue microarrays and automated image analysis. Future advances in standardization of tissue biobanking methods, immunohistochemistry and quantitative image analysis techniques are also discussed. PMID:21473728

  20. Developmental Trends in Auditory Processing Can Provide Early Predictions of Language Acquisition in Young Infants

    ERIC Educational Resources Information Center

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R.; Shao, Jie; Lozoff, Betsy

    2013-01-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with…

  1. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  2. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence-inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  3. Recovering the spectrum of a narrow-band process from syncopated samples

    NASA Technical Reports Server (NTRS)

    Milenkovic, P. H.

    1979-01-01

    A nonuniform sampling strategy, phase quadrature sampling, is described in which a process of bandwidth B is sampled at rate B in each of two channels, with the two channels pi/2 out of phase at frequency B. Phase quadrature sampling is a special case of syncopated sampling, in which the phase between channels is fixed but arbitrary. A simple method for recovering the spectrum of the input process from syncopated samples is derived. The derivation indicates which values of phase between channels result in lossless sampling.
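
    The two-channel idea can be sketched numerically. The following numpy sketch is an illustration only, not the paper's derivation: it uses the simplified case in which the second channel is delayed by a quarter cycle of the band-center frequency fc (rather than the general fixed-but-arbitrary phase treated in the paper), and the values of B, fc, and f1 are arbitrary. Two tones placed symmetrically about the band center produce identical rate-B samples in either single channel, but the combined complex samples resolve them:

```python
import numpy as np

B = 100.0                      # bandwidth = per-channel sampling rate (Hz)
fc = 4 * B                     # band center, chosen as a multiple of B
f1 = 18.75                     # tone offset within the band (|f1| < B/2)
n = np.arange(1024)
tA = n / B                     # channel-A sample times (rate B)
tB = tA + 1.0 / (4.0 * fc)     # channel B: a quarter carrier cycle later

def locate_tone(f_tone):
    """Sample a real tone in both channels and locate it via the FFT
    of the combined complex samples."""
    a = np.cos(2 * np.pi * f_tone * tA)   # ~ in-phase component I
    b = np.cos(2 * np.pi * f_tone * tB)   # ~ -Q (quarter cycle later)
    z = a - 1j * b                        # complex baseband I + jQ
    k = np.argmax(np.abs(np.fft.fft(z)))
    return fc + np.fft.fftfreq(n.size, d=1.0 / B)[k]

# Tones at fc + f1 and fc - f1 give identical channel-A samples alone,
# but the two-channel combination distinguishes them:
print(round(locate_tone(fc + f1), 2))   # -> 418.75
print(round(locate_tone(fc - f1), 2))   # -> 381.25
```

    Each channel alone runs at only rate B, yet together they deliver the B complex samples per second needed to capture the full band.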

  4. A multi-threshold sampling method for TOF PET signal processing

    SciTech Connect

    Kim, Heejong; Kao, Chien-Min; Xie, Q.; Chen, Chin-Tu; Zhou, L.; Tang, F.; Frisch, Henry; Moses, William W.; Choong, Woon-Seng

    2009-02-02

    As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multithreshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to 8 threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25 × 6.25 × 25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and an ~18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with an ~9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules we obtain ~300 ps coincidence timing resolution, ~14% energy resolution at 511 keV, and ~5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions by using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition.
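
    In essence, multi-threshold sampling replaces a fast ADC with a few comparators: each records the times at which the pulse crosses its threshold on the leading and falling edges, and waveform features are reconstructed from those crossing times. The sketch below is a minimal illustration of one such reconstruction, not the prototype's processing chain; the synthetic pulse shape and threshold values are invented for the example. For an exponential tail A·exp(-t/τ), a threshold v is crossed on the falling edge at t = τ·ln(A/v), so two falling-edge times give τ = (t₁ - t₂)/ln(v₂/v₁):

```python
import numpy as np

def crossing_times(t, y, threshold):
    """Linearly interpolated times where y crosses threshold (rising and falling)."""
    above = y >= threshold
    idx = np.flatnonzero(np.diff(above.astype(int)))   # sign-change indices
    times = []
    for i in idx:
        frac = (threshold - y[i]) / (y[i + 1] - y[i])
        times.append(t[i] + frac * (t[i + 1] - t[i]))
    return times   # alternating rising/falling for a unipolar pulse

# Synthetic PET-like pulse: fast rise, exponential decay with tau = 44 ns
tau = 44.0
t = np.linspace(0, 300, 3001)                     # ns
y = (1 - np.exp(-t / 2.0)) * np.exp(-t / tau)     # arbitrary amplitude units

v1, v2 = 0.1, 0.3                     # two of the user-defined thresholds
f1 = crossing_times(t, y, v1)[-1]     # falling-edge crossing times
f2 = crossing_times(t, y, v2)[-1]
tau_est = (f1 - f2) / np.log(v2 / v1)
print(round(tau_est, 1))              # -> 44.0
```

    The same crossing times also carry the event's arrival time (earliest rising edge) and energy (e.g., via time-over-threshold), which is why a handful of thresholds suffices.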

  5. Association of Campylobacter spp. levels between chicken grow-out environmental samples and processed carcasses.

    PubMed

    Schroeder, Matthew W; Eifert, Joseph D; Ponder, Monica A; Schmale, David G

    2014-03-01

    Campylobacter spp. have been isolated from live poultry, production environments, processing facilities, and raw poultry products. Environmental sampling in a poultry grow-out house, combined with carcass rinse sampling from the same flock, may provide a relative relationship between pre- and postharvest Campylobacter contamination. Air samples, fecal/litter samples, and feed/drink line samples were collected from 4 commercial chicken grow-out houses in western Virginia between September 2011 and January 2012. Birds from each sampled house were the first flock slaughtered the following day and were then sampled by postchill carcass rinses. Campylobacter, from postenrichment samples, was detected in 27% (32/120) of house environmental samples and 37.5% (45/120) of carcass rinse samples. All environmental sample types from each house included at least one positive sample except the house 2 air samples. The sponge sample method was found to have a significantly higher (P < 0.05) proportion of Campylobacter-positive samples (45%) than the fecal/litter samples (20%) and air samples (15%) when sample types of all the houses were compared. The proportion positive for the fecal/litter samples postenrichment, for each flock, had the highest correlation (0.85) with the proportion of positive carcass rinse samples for each flock. Environmental samples from house 1 and associated carcass rinses accounted for the largest number of Campylobacter positives (29/60). The smallest number of Campylobacter positives, based on both house environmental (4/30) and carcass rinse samples (8/30), was detected from flock B. The results of this study suggest that environmental sampling in a poultry grow-out house, combined with carcass rinse sampling from the same flock, has the potential to provide an indication of Campylobacter contamination and transmission. Campylobacter qualitative levels from house and processing plant samples may enable the scheduled processing of flocks with lower

  6. Second-first language acquisition: analysis of expressive language skills in a sample of girls adopted from China.

    PubMed

    Tan, Tony Xing; Loker, Troy; Dedrick, Robert F; Marfo, Kofi

    2012-03-01

    In this study we investigated adopted Chinese girls' expressive English language outcomes in relation to their age at adoption, chronological age, length of exposure to English and developmental risk status at the time of adoption. Vocabulary and phrase utterance data on 318 girls were collected from the adoptive mothers using the Language Development Survey (LDS) (Achenbach & Rescorla, 2000). The girls, aged 18-35 months (M=26·2 months, SD=4·9 months), were adopted at ages ranging from 6·8 to 24 months (M=12·6 months, SD=3·1 months), and had been exposed to English for periods ranging from 1·6 to 27·6 months (M=13·7, SD=5·7). Findings suggest that vocabulary and mean length of phrase scores were negatively correlated with age at adoption but positively correlated with chronological age and length of exposure to English. Developmental risk status at the time of adoption was not correlated with language outcomes. The gap between their expressive language and that of same-age girls from the US normative sample was wider for children aged 18-23 months but was closed for children aged 30-35 months. About 16% of the children met the LDS criteria for delays in vocabulary and 17% met the LDS criteria for delays in mean length of phrase. Speech/language interventions were received by 33·3% of the children with delays in vocabulary and 25% with delays in phrase.

  7. FTMP data acquisition environment

    NASA Technical Reports Server (NTRS)

    Padilla, Peter A.

    1988-01-01

    The Fault-Tolerant Multi-Processing (FTMP) test-bed data acquisition environment is described. The performance of two data acquisition devices available in the test environment is estimated and compared. These estimated data rates are used as measures of the devices' capabilities. A new data acquisition device was developed and added to the FTMP environment. This path increases the available data rate by approximately a factor of 8, to 379 KW/s, while simplifying the experiment development process.

  8. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  9. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  10. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  11. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  12. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  13. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  14. Abdominal 4D Flow MR Imaging in a Breath Hold: Combination of Spiral Sampling and Dynamic Compressed Sensing for Highly Accelerated Acquisition

    PubMed Central

    Knight-Greenfield, Ashley; Jajamovich, Guido; Besa, Cecilia; Cui, Yong; Stalder, Aurélien; Markl, Michael; Taouli, Bachir

    2015-01-01

    Purpose To develop a highly accelerated phase-contrast cardiac-gated volume flow measurement (four-dimensional [4D] flow) magnetic resonance (MR) imaging technique based on spiral sampling and dynamic compressed sensing and to compare this technique with established phase-contrast imaging techniques for the quantification of blood flow in abdominal vessels. Materials and Methods This single-center prospective study was compliant with HIPAA and approved by the institutional review board. Ten subjects (nine men, one woman; mean age, 51 years; age range, 30–70 years) were enrolled. Seven patients had liver disease. Written informed consent was obtained from all participants. Two 4D flow acquisitions were performed in each subject, one with use of Cartesian sampling with respiratory tracking and the other with use of spiral sampling and a breath hold. Cartesian two-dimensional (2D) cine phase-contrast images were also acquired in the portal vein. Two observers independently assessed vessel conspicuity on phase-contrast three-dimensional angiograms. Quantitative flow parameters were measured by two independent observers in major abdominal vessels. Intertechnique concordance was quantified by using Bland-Altman and logistic regression analyses. Results There was moderate to substantial agreement in vessel conspicuity between 4D flow acquisitions in arteries and veins (κ = 0.71 and 0.61, respectively, for observer 1; κ = 0.71 and 0.44 for observer 2), whereas more artifacts were observed with spiral 4D flow (κ = 0.30 and 0.20). Quantitative measurements in abdominal vessels showed good equivalence between spiral and Cartesian 4D flow techniques (lower bound of the 95% confidence interval: 63%, 77%, 60%, and 64% for flow, area, average velocity, and peak velocity, respectively). For portal venous flow, spiral 4D flow was in better agreement with 2D cine phase-contrast flow (95% limits of agreement: −8.8 and 9.3 mL/sec, respectively) than was Cartesian 4D flow (95

  15. Skill acquisition with text-entry interfaces: particularly older users benefit from minimized information-processing demands.

    PubMed

    Jahn, Georg; Krems, Josef F

    2013-08-01

    Operating information technology challenges older users if it requires executive control, which generally declines with age. Especially for novel and occasional tasks, cognitive demands can be high. We demonstrate how interface design can reduce cognitive demands by studying skill acquisition with the destination entry interfaces of two customary route guidance systems. Young, middle-aged, and older adults performed manual destination entry either with a system operated with multiple buttons in a dialogue encompassing spelling and list selection, or with a system operated by a single rotary encoder, in which an intelligent speller constrained destination entry to a single line of action. Each participant performed 100 training trials. A retention test after at least 10 weeks encompassed 20 trials. The same task was performed faster, more accurately, and produced much less age-related performance differences especially at the beginning of training if interface design reduced demand for executive control, perceptual processing, and motor control. PMID:25474764

  16. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  17. Hardware acceleration of lucky-region fusion (LRF) algorithm for image acquisition and processing

    NASA Astrophysics Data System (ADS)

    Maignan, William; Koeplinger, David; Carhart, Gary W.; Aubailly, Mathieu; Kiamilev, Fouad; Liu, J. Jiang

    2013-05-01

    "Lucky-region fusion" (LRF) is an image processing technique that has proven successful in enhancing the quality of images distorted by atmospheric turbulence. The LRF algorithm extracts sharp regions of an image obtained from a series of short exposure frames, and "fuses" them into a final image with improved quality. In previous research, the LRF algorithm had been implemented on a PC using a compiled programming language. However, the PC usually does not have sufficient processing power to handle real-time extraction, processing and reduction required when the LRF algorithm is applied not to single picture images but rather to real-time video from fast, high-resolution image sensors. This paper describes a hardware implementation of the LRF algorithm on a Virtex 6 field programmable gate array (FPGA) to achieve real-time video processing. The novelty in our approach is the creation of a "black box" LRF video processing system with a standard camera link input, a user controller interface, and a standard camera link output.
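
    The fusion step itself can be sketched compactly. The numpy sketch below is a hedged illustration of the lucky-region idea only, using per-pixel selection driven by a simple local sharpness map (box-averaged absolute Laplacian); the published LRF algorithm and its FPGA pipeline differ in the sharpness metric and blend regions smoothly rather than with a hard mask:

```python
import numpy as np

def local_sharpness(img, box=8):
    """Box-averaged absolute Laplacian: a simple local sharpness map."""
    img = np.asarray(img, dtype=float)
    lap = np.abs(-4 * img
                 + np.roll(img, 1, 0) + np.roll(img, -1, 0)
                 + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    k = np.zeros_like(img)
    k[:box, :box] = 1.0 / box**2
    # circular box filter via FFT (wrap-around is acceptable for a sketch)
    return np.real(np.fft.ifft2(np.fft.fft2(lap) * np.fft.fft2(k)))

def lrf_fuse(frames, box=8):
    """Per pixel, keep the value from whichever frame is locally sharpest."""
    fused = np.array(frames[0], dtype=float)
    best = local_sharpness(fused, box)
    for f in frames[1:]:
        s = local_sharpness(f, box)
        mask = s > best                       # regions where this frame wins
        fused[mask] = np.asarray(f, dtype=float)[mask]
        best[mask] = s[mask]
    return fused
```

    Because each frame is folded in as it arrives, the loop body maps naturally onto a streaming pipeline, which is what makes the algorithm a good fit for an FPGA implementation.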

  18. Thermal infrared pushbroom imagery acquisition and processing. [of NASA's Advanced Land Observing System

    NASA Technical Reports Server (NTRS)

    Brown, T. J.; Corbett, F. J.; Spera, T. J.; Andrada, T.

    1982-01-01

    A 9-element focal plane detector array and signal processing electronics was developed and delivered in December 1977. It was integrated into a thermal infrared imaging system using LSI microprocessor image processing and CRT display. After three years of laboratory operation, the focal plane has demonstrated high reliability and performance. On the basis of the 9-channel breadboard, the 90-element Aircraft Pushbroom IR/CCD Focal Plane Development Program was funded in October 1977. A follow-on program was awarded in July 1979, for the construction of a field test instrument and image processing facility. The objective of this project was to demonstrate thermal infrared pushbroom hard-copy imagery. It is pointed out that the successful development of the 9-element and 90-element thermal infrared hybrid imaging systems using photoconductive (Hg,Cd)Te has verified the operational concept of 8 to 14 micrometer pushbroom scanners.

  19. How human resource organization can enhance space information acquisition and processing: the experience of the VENESAT-1 ground segment

    NASA Astrophysics Data System (ADS)

    Acevedo, Romina; Orihuela, Nuris; Blanco, Rafael; Varela, Francisco; Camacho, Enrique; Urbina, Marianela; Aponte, Luis Gabriel; Vallenilla, Leopoldo; Acuña, Liana; Becerra, Roberto; Tabare, Terepaima; Recaredo, Erica

    2009-12-01

    Built in cooperation with the P.R of China, in October 29th of 2008, the Bolivarian Republic of Venezuela launched its first Telecommunication Satellite, the so called VENESAT-1 (Simón Bolívar Satellite), which operates in C (covering Center America, The Caribbean Region and most of South America), Ku (Bolivia, Cuba, Dominican Republic, Haiti, Paraguay, Uruguay, Venezuela) and Ka bands (Venezuela). The launch of VENESAT-1 represents the starting point for Venezuela as an active player in the field of space science and technology. In order to fulfill mission requirements and to guarantee the satellite's health, local professionals must provide continuous monitoring, orbit calculation, maneuvers preparation and execution, data preparation and processing, as well as data base management at the VENESAT-1 Ground Segment, which includes both a primary and backup site. In summary, data processing and real time data management are part of the daily activities performed by the personnel at the ground segment. Using published and unpublished information, this paper presents how human resource organization can enhance space information acquisition and processing, by analyzing the proposed organizational structure for the VENESAT-1 Ground Segment. We have found that the proposed units within the organizational structure reflect 3 key issues for mission management: Satellite Operations, Ground Operations, and Site Maintenance. The proposed organization is simple (3 hierarchical levels and 7 units), and communication channels seem efficient in terms of facilitating information acquisition, processing, storage, flow and exchange. Furthermore, the proposal includes a manual containing the full description of personnel responsibilities and profile, which efficiently allocates the management and operation of key software for satellite operation such as the Real-time Data Transaction Software (RDTS), Data Management Software (DMS), and Carrier Spectrum Monitoring Software (CSM

  20. Fast multi-dimensional NMR acquisition and processing using the sparse FFT.

    PubMed

    Hassanieh, Haitham; Mayzel, Maxim; Shi, Lixin; Katabi, Dina; Orekhov, Vladislav Yu

    2015-09-01

    Increasing the dimensionality of NMR experiments strongly enhances the spectral resolution and provides invaluable direct information about atomic interactions. However, the price tag is high: long measurement times and heavy requirements on the computation power and data storage. We introduce sparse fast Fourier transform as a new method of NMR signal collection and processing, which is capable of reconstructing high quality spectra of large size and dimensionality with short measurement times, faster computations than the fast Fourier transform, and minimal storage for processing and handling of sparse spectra. The new algorithm is described and demonstrated for a 4D BEST-HNCOCA spectrum. PMID:26123316
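
    The core trick of the sparse FFT can be sketched in a few lines: subsampling in time aliases the spectrum into a small number of buckets, and because the spectrum is sparse, most buckets contain at most one peak, whose exact frequency is read off from the phase shift between two slightly offset subsamplings. The toy sketch below assumes no bucket collisions and no noise (the signal, N, B, and the tone list are invented for the example); the full algorithm adds random spectral permutations and filtering to handle collisions:

```python
import numpy as np

N, B = 4096, 64                        # signal length; number of buckets (B divides N)
tones = {137: 1.0, 2590: 0.7}          # sparse spectrum: frequency -> amplitude
n = np.arange(N)
x = sum(a * np.exp(2j * np.pi * f * n / N) for f, a in tones.items())

def buckets(x, shift):
    """Length-B FFT of every (N/B)-th sample, starting at `shift`.
    Aliasing folds a tone at f into bucket f mod B; a one-sample shift
    multiplies that bucket by exp(2j*pi*f/N), which encodes f."""
    return np.fft.fft(x[shift::N // B][:B])

U0, U1 = buckets(x, 0), buckets(x, 1)
recovered = {}
for j in np.flatnonzero(np.abs(U0) > B / 10):          # occupied buckets only
    f = int(round(np.angle(U1[j] / U0[j]) / (2 * np.pi) * N)) % N
    recovered[f] = abs(U0[j]) / B
print(recovered)                                       # both tones recovered
```

    Only two length-64 FFTs were needed to locate tones in a length-4096 spectrum, which is the source of the method's speed and storage advantages for large, sparse NMR spectra.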

  1. The effect of age of acquisition, socioeducational status, and proficiency on the neural processing of second language speech sounds.

    PubMed

    Archila-Suerte, Pilar; Zevin, Jason; Hernandez, Arturo E

    2015-02-01

    This study investigates the role of age of acquisition (AoA), socioeducational status (SES), and second language (L2) proficiency on the neural processing of L2 speech sounds. In a task of pre-attentive listening and passive viewing, Spanish-English bilinguals and a control group of English monolinguals listened to English syllables while watching a film of natural scenery. Eight regions of interest were selected from brain areas involved in speech perception and executive processes. The regions of interest were examined in 2 separate two-way ANOVA (AoA×SES; AoA×L2 proficiency). The results showed that AoA was the main variable affecting the neural response in L2 speech processing. Direct comparisons between AoA groups of equivalent SES and proficiency level enhanced the intensity and magnitude of the results. These results suggest that AoA, more than SES and proficiency level, determines which brain regions are recruited for the processing of second language speech sounds. PMID:25528287

  2. The acceleration of spoken-word processing in children's native-language acquisition: an ERP cohort study.

    PubMed

    Ojima, Shiro; Matsuba-Kurita, Hiroko; Nakamura, Naoko; Hagiwara, Hiroko

    2011-04-01

    Healthy adults can identify spoken words at a remarkable speed, by incrementally analyzing word-onset information. It is currently unknown how this adult-level speed of spoken-word processing emerges during children's native-language acquisition. In a picture-word mismatch paradigm, we manipulated the semantic congruency between picture contexts and spoken words, and recorded event-related potential (ERP) responses to the words. Previous similar studies focused on the N400 response, but we focused instead on the onsets of semantic congruency effects (N200 or Phonological Mismatch Negativity), which contain critical information for incremental spoken-word processing. We analyzed ERPs obtained longitudinally from two age cohorts of 40 primary-school children (total n=80) in a 3-year period. Children first tested at 7 years of age showed earlier onsets of congruency effects (by approximately 70ms) when tested 2 years later (i.e., at age 9). Children first tested at 9 years of age did not show such shortening of onset latencies 2 years later (i.e., at age 11). Overall, children's onset latencies at age 9 appeared similar to those of adults. These data challenge the previous hypothesis that word processing is well established at age 7. Instead they support the view that the acceleration of spoken-word processing continues beyond age 7.

  4. Independent Sampling vs Interitem Dependencies in Whole Report Processing: Contributions of Processing Architecture and Variable Attention.

    PubMed

    Busey, Thomas A.; Townsend, James T.

    2001-04-01

    All current models of visual whole report processing assume perceptual independence among the displayed items in which the perceptual processing of individual items is not affected by other items in the display. However, models proposed by Townsend (1981, Acta Psychologica 47, 149-173), Shibuya and Bundesen (1988, Journal of Experimental Psychology: Human Perception and Performance 14, 591-600), and Bundesen (1990, Psychological Review 97, 523-547) contain postperceptual buffers that must predict negative dependencies. The perceptual-independence assumption forms what we term the modal model class. A recent example of a model that assumes perceptual independence is the Independent Sampling Model of Loftus, Busey, and Senders (1993, Perception and Psychophysics 54, 535-554). The fundamental independence assumption has only been directly tested once before, where tests revealed no dependencies except those produced by guessing. The present study tests the independence assumption using several different statistics and, contrary to most extant models of whole report, finds significant positive dependence. Poisson models do predict a positive dependence and we develop a succinctly parameterized version, the Weighted Path Poisson Model, which allows the finishing order to be a weighted probabilistic mechanism. However, it does not predict the data quite as well as a new model, the Variable Attention Model, which allows independence within trials (unlike the Poisson models). This model assumes that attention (or, potentially, other aspects such as signal quality) varies widely across trials, thus predicting an overall positive dependence. Intuitions for and against the competing models are discussed. In addition, we show, through mimicking formulae, that models which contain the proper qualitative type of dependence structure can be cast in either serial or parallel form. Copyright 2001 Academic Press.
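
    The Variable Attention Model's central prediction, that independence within trials combined with attention varying across trials yields an overall positive dependence, is easy to verify by simulation. In this minimal sketch the Beta attention distribution is an arbitrary illustrative choice, not the model's fitted form:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 200_000
attention = rng.beta(2, 2, size=trials)      # per-trial attention level

# Within a trial, the two items are reported independently, each with
# probability equal to that trial's attention level.
item1 = rng.random(trials) < attention
item2 = rng.random(trials) < attention

# Across trials, the shared attention level induces a positive correlation:
r = np.corrcoef(item1, item2)[0, 1]
print(r)                                     # ~0.2 for Beta(2, 2)
```

    Analytically, the covariance between the two report indicators equals Var(attention), so any trial-to-trial variability at all produces the positive dependence the study observed.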

  5. The Effectiveness of Processing Instruction in L2 Grammar Acquisition: A Narrative Review

    ERIC Educational Resources Information Center

    Dekeyser, Robert; Botana, Goretti Prieto

    2015-01-01

    The past two decades have seen ample debate about processing instruction (PI) and its various components. In this article, we first describe what PI consists of and then address three questions: about the role of explicit information (EI) in PI, the difference between PI and teaching that incorporates production-based (PB) practice, and various…

  6. Using Processing Instruction for the Acquisition of English Present Perfect of Filipinos

    ERIC Educational Resources Information Center

    Erfe, Jonathan P.; Lintao, Rachelle B.

    2012-01-01

    This is an experimental study on the relative effects of Van Patten's Processing Instruction (PI) (1996, 2002), a "psycholinguistically-motivated" intervention in teaching second-language (L2) grammar, on young-adult Filipino learners of English. A growing body of research on this methodological alternative, which establishes…

  7. The RFP Process: Effective Management of the Acquisition of Library Materials.

    ERIC Educational Resources Information Center

    Wilkinson, Frances C.; Thorson, Connie Capers

    Many librarians view procurement, with its myriad forms, procedures, and other organizational requirements, as a tedious or daunting challenge. This book simplifies the process, showing librarians how to successfully prepare a Request for Proposal (RFP) and make informed decisions when determining which vendors to use for purchasing library…

  8. Using Eye-Tracking to Investigate Topics in L2 Acquisition and L2 Processing

    ERIC Educational Resources Information Center

    Roberts, Leah; Siyanova-Chanturia, Anna

    2013-01-01

    Second language (L2) researchers are becoming more interested in both L2 learners' knowledge of the target language and how that knowledge is put to use during real-time language processing. Researchers are therefore beginning to see the importance of combining traditional L2 research methods with those that capture the moment-by-moment…

  9. Production and Processing Asymmetries in the Acquisition of Tense Morphology by Sequential Bilingual Children

    ERIC Educational Resources Information Center

    Chondrogianni, Vasiliki; Marinis, Theodoros

    2012-01-01

    This study investigates the production and online processing of English tense morphemes by sequential bilingual (L2) Turkish-speaking children with more than three years of exposure to English. Thirty-nine six- to nine-year-old L2 children and twenty-eight typically developing age-matched monolingual (L1) children were administered the production…

  10. Analyzing Preschoolers' Overgeneralizations of Object Labeling in the Process of Mother-Tongue Acquisition in Turkey

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2006-01-01

    Language, as is known, is acquired under certain conditions: rapid and sequential brain maturation and cognitive development, the need to exchange information and to control others' actions, and an exposure to appropriate speech input. This research aims at analyzing preschoolers' overgeneralizations of the object labeling process in different…

  11. The Effect of Processing Instruction and Dictogloss Tasks on Acquisition of the English Passive Voice

    ERIC Educational Resources Information Center

    Qin, Jingjing

    2008-01-01

    This study was intended to compare processing instruction (VanPatten, 1993, 1996, 2000), an input-based focus on form technique, to dictogloss tasks, an output-oriented focus-on-form type of instruction to assess their effects in helping beginning-EFL (English as a Foreign Language) learners acquire the simple English passive voice. Two intact…

  12. An architecture for real time data acquisition and online signal processing for high throughput tandem mass spectrometry

    SciTech Connect

    Shah, Anuj R.; Jaitly, Navdeep; Zuljevic, Nino; Monroe, Matthew E.; Liyu, Andrei V.; Polpitiya, Ashoka D.; Adkins, Joshua N.; Belov, Mikhail E.; Anderson, Gordon A.; Smith, Richard D.; Gorton, Ian

    2010-12-09

    Independent, greedy collection of data events using simple heuristics results in massive over-sampling of the prominent data features in large-scale studies over what should be achievable through “intelligent,” online acquisition of such data. As a result, data generated are more aptly described as a collection of a large number of small experiments rather than a true large-scale experiment. Nevertheless, achieving “intelligent,” online control requires tight interplay between state-of-the-art, data-intensive computing infrastructure developments and analytical algorithms. In this paper, we propose a Software Architecture for Mass spectrometry-based Proteomics coupled with Liquid chromatography Experiments (SAMPLE) to develop an “intelligent” online control and analysis system to significantly enhance the information content from each sensor (in this case, a mass spectrometer). Using online analysis of data events as they are collected and decision theory to optimize the collection of events during an experiment, we aim to maximize the information content generated during an experiment by the use of pre-existing knowledge to optimize the dynamic collection of events.
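    The core of the "intelligent" online control described above is a policy that decides which data events to collect next. A minimal sketch of one common policy of this kind, dynamic exclusion of recently fragmented precursors, is shown below. This is illustrative only, not the SAMPLE architecture; all names and threshold values are invented.

```python
# Illustrative sketch only (not the SAMPLE system): a dynamic-exclusion
# policy for online precursor selection in tandem MS. Precursors fragmented
# within the last EXCLUSION_TIME seconds are skipped, steering instrument
# time toward under-sampled features.

EXCLUSION_TOL = 0.01    # m/z match tolerance (assumed value)
EXCLUSION_TIME = 30.0   # seconds a precursor stays excluded (assumed value)

def select_precursors(peaks, history, now, top_n=3):
    """peaks: list of (mz, intensity); history: mutable list of (mz, time_selected)."""
    def excluded(mz):
        return any(abs(mz - h_mz) <= EXCLUSION_TOL and now - h_t < EXCLUSION_TIME
                   for h_mz, h_t in history)
    # Rank unexcluded peaks by intensity and pick the top N for fragmentation.
    candidates = sorted((p for p in peaks if not excluded(p[0])),
                        key=lambda p: p[1], reverse=True)[:top_n]
    history.extend((mz, now) for mz, _ in candidates)
    return [mz for mz, _ in candidates]
```

    A real controller would add decision-theoretic scoring on top of this, but the exclusion list alone already prevents repeated over-sampling of the most prominent features.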

  13. Human resource processes and the role of the human resources function during mergers and acquisitions in the electricity industry

    NASA Astrophysics Data System (ADS)

    Dass, Ted K.

    Mergers and acquisitions (M&A) have been a popular strategy for organizations to consolidate and grow for more than a century. However, research in this field indicates that M&A are more likely to fail than succeed, with failure rates estimated to be as high as 75%. People-related issues have been identified as important causes for the high failure rate, but these issues are largely neglected until after the deal is closed. One explanation for this neglect is the low involvement of human resource (HR) professionals and the HR function during the M&A process. The strategic HR management literature suggests that a larger role for HR professionals in the M&A process would enable organizations to identify potential problems early and devise appropriate solutions. However, empirical research from an HR perspective has been scarce in this area. This dissertation examines the role of the HR function and the HR processes followed in organizations during M&A. Employing a case-study research design, this study examines M&A undertaken by two large organizations in the electricity industry through the lens of a "process" perspective. Based on converging evidence, the case studies address three sets of related issues: (1) how do organizations undertake and manage M&A; (2) what is the extent of HR involvement in M&A and what role does it play in the M&A process; and (3) what factors explain HR involvement in the M&A process and, more generally, in the formulation of corporate goals and strategies. Results reveal the complexity of issues faced by organizations in undertaking M&A, the variety of roles played by HR professionals, and the importance of several key contextual factors---internal and external to the organization---that influence HR involvement in the M&A process. Further, several implications for practice and future research are explored.

  14. Intonational Phrase Structure Processing at Different Stages of Syntax Acquisition: ERP Studies in 2-, 3-, and 6-Year-Old Children

    ERIC Educational Resources Information Center

    Mannel, Claudia; Friederici, Angela D.

    2011-01-01

    This study explored the electrophysiology underlying intonational phrase processing at different stages of syntax acquisition. Developmental studies suggest that children's syntactic skills advance significantly between 2 and 3 years of age. Here, children of three age groups were tested on phrase-level prosodic processing before and after this…

  15. Streamlined acquisition handbook

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA has always placed great emphasis on the acquisition process, recognizing it as among its most important activities. This handbook is intended to facilitate the application of streamlined acquisition procedures. The development of these procedures reflects the efforts of an action group composed of NASA Headquarters and center acquisition professionals. The intent is to accomplish real change in the acquisition process as a result of this effort. An important part of streamlining the acquisition process is a commitment by the people involved in the process to accomplishing acquisition activities quickly and with high quality. Too often we continue to accomplish work in 'the same old way' without considering available alternatives which would require no changes to regulations, approvals from Headquarters, or waivers of required practice. Similarly, we must be sensitive to schedule opportunities throughout the acquisition cycle, not just once the purchase request arrives at the procurement office. Techniques that have been identified as ways of reducing acquisition lead time while maintaining high quality in our acquisition process are presented.

  16. Success or Failure of Automated Data Processing Systems in Physicians' Offices after System Acquisition

    PubMed Central

    Dahm, Lisa L.

    1983-01-01

    Although many sources exist for gleaning information relative to acquiring a data processing system, less material is available on the subject of what the purchaser may expect and must do following the sale. The ingredients for successfully automating a medical practice include: a good plan for the conversion and on-going use of the automated system; proper training initially and plans for future training should the need arise; proper physical facilities; and a positive and cooperative attitude.

  17. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

    A real time collection system was developed for the Urbana coherent scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented with a minicomputer. The software used to collect the data is described as well as the processing software used to analyze the data. In addition a magnetic tape format for coherent scatter data exchange is given.

  18. The effects of an action's "age-of-acquisition" on action-sentence processing.

    PubMed

    Gilead, Michael; Liberman, Nira; Maril, Anat

    2016-11-01

    How does our brain allow us to comprehend abstract/symbolic descriptions of human action? Whereas past research suggested that processing action language relies on sensorimotor brain regions, recent work suggests that sensorimotor activation depends on participants' task goals, such that focusing on abstract (vs. concrete) aspects of an action activates "default mode network" (rather than sensorimotor) regions. Following a Piagetian framework, we hypothesized that for actions acquired at an age wherein abstract/symbolic cognition is fully developed, even when participants focus on the concrete aspects of an action, they should retrieve abstract-symbolic mental representations. In two studies, participants processed the concrete (i.e., "how") and abstract (i.e., "why") aspects of late-acquired and early-acquired actions. Consistent with previous research, focusing on the abstract (vs. concrete) aspects of an action resulted in greater activation in the "default mode network". Importantly, the activation in these regions was higher when processing later-acquired (vs. earlier-acquired) actions, also when participants' goal was to focus on the concrete aspects of the action. We discuss the implications of the current findings for research on the involvement of concrete representations in abstract cognition. PMID:27431759

  19. Processing Temporal Constraints and Some Implications for the Investigation of Second Language Sentence Processing and Acquisition. Commentary on Baggio

    ERIC Educational Resources Information Center

    Roberts, Leah

    2008-01-01

    Baggio presents the results of an event-related potential (ERP) study in which he examines the processing consequences of reading tense violations such as *"Afgelopen zondag lakt Vincent de kozijnen van zijn landhuis" (*"Last Sunday Vincent paints the window-frames of his country house"). The violation is arguably caused by a mismatch between the…

  20. In-process weld sampling during hot end welds of type W overpacks

    SciTech Connect

    Barnes, G.A.

    1998-08-27

    Establish the criteria and process controls to be used in obtaining, testing, and evaluating in-process weld samples during the hot end welding of Type W Overpack capsules used to overpack CsCl capsules for storage at WESF.

  1. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon.

    PubMed

    Lieberman, Amy M; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I

    2015-07-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals is highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1), and in deaf adults who were late-learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sublexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received. PMID:25528091

  3. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    NASA Astrophysics Data System (ADS)

    Li, Y. T.; Wittenberg, L. J.

    1992-09-01

    In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the use of multiple miners is recommended rather than increasing their size. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  5. Age of second language acquisition affects nonverbal conflict processing in children: an fMRI study

    PubMed Central

    Mohades, Seyede Ghazal; Struys, Esli; Van Schuerbeek, Peter; Baeken, Chris; Van De Craen, Piet; Luypaert, Robert

    2014-01-01

    Background In their daily communication, bilinguals switch between two languages, a process that involves the selection of a target language and minimization of interference from a nontarget language. Previous studies have uncovered the neural structure in bilinguals and the activation patterns associated with performing verbal conflict tasks. One question that remains, however, is whether this extra verbal switching affects brain function during nonverbal conflict tasks. Methods In this study, we have used fMRI to investigate the impact of bilingualism in children performing two nonverbal tasks involving stimulus–stimulus and stimulus–response conflicts. Three groups of 8–11-year-old children – bilinguals from birth (2L1), second language learners (L2L), and a control group of monolinguals (1L1) – were scanned while performing a color Simon and a numerical Stroop task. Reaction times and accuracy were logged. Results Compared to monolingual controls, bilingual children showed a higher behavioral congruency effect in these tasks, which is matched by the recruitment of brain regions generally used in cognitive control, language processing, or the resolution of language conflict in bilinguals (caudate nucleus, posterior cingulate gyrus, STG, precuneus). Further, the activation of these areas was found to be higher in 2L1 compared to L2L. Conclusion The coupling of longer reaction times to the recruitment of extra language-related brain areas supports the hypothesis that when dealing with language conflicts the specialization of bilinguals hampers the way they process nonverbal conflicts, at least at early stages in life. PMID:25328840
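    The behavioral congruency effect reported in studies like this one is simply the difference in mean reaction time between incongruent and congruent trials. A minimal sketch of that computation, with invented RT values:

```python
# Sketch: computing a behavioral congruency effect from per-trial reaction
# times (RTs, in ms). A larger positive value means a larger slowdown on
# incongruent trials. Trial data here are invented for illustration.

def congruency_effect(trials):
    """trials: list of (condition, rt_ms), condition in {'congruent', 'incongruent'}."""
    cong = [rt for c, rt in trials if c == "congruent"]
    incong = [rt for c, rt in trials if c == "incongruent"]
    return sum(incong) / len(incong) - sum(cong) / len(cong)
```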

  6. An overview of AmeriFlux data products and methods for data acquisition, processing, and publication

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Poindexter, C.; Agarwal, D.; Papale, D.; van Ingen, C.; Torn, M. S.

    2014-12-01

    The AmeriFlux network encompasses independently managed field sites measuring ecosystem carbon, water, and energy fluxes across the Americas. In close coordination with ICOS in Europe, a new set of fluxes data and metadata products is being produced and released at the FLUXNET level, including all AmeriFlux sites. This will enable continued releases of global standardized set of flux data products. In this release, new formats, structures, and ancillary information are being proposed and adopted. This presentation discusses these aspects, detailing current and future solutions. One of the major revisions was to the BADM (Biological, Ancillary, and Disturbance Metadata) protocols. The updates include structure and variable changes to address new developments in data collection related to flux towers and facilitate two-way data sharing. In particular, a new organization of templates is now in place, including changes in templates for biomass, disturbances, instrumentation, soils, and others. New variables and an extensive addition to the vocabularies used to describe BADM templates allow for a more flexible and comprehensible coverage of field sites and the data collection methods and results. Another extensive revision is in the data formats, levels, and versions for fluxes and micrometeorological data. A new selection and revision of data variables and an integrated new definition for data processing levels allow for a more intuitive and flexible notation for the variety of data products. For instance, all variables now include positional information that is tied to BADM instrumentation descriptions. This allows for a better characterization of spatial representativeness of data points, e.g., individual sensors or the tower footprint. Additionally, a new definition for data levels better characterizes the types of processing and transformations applied to the data across different dimensions (e.g., spatial representativeness of a data point, data quality checks
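    The positional information attached to variables can be illustrated with a small parser for the FLUXNET-style `_H_V_R` suffix convention (horizontal position, vertical position, replicate). This sketch assumes that naming format and is not the official AmeriFlux tooling.

```python
# Sketch: splitting a FLUXNET-style variable name such as 'TA_1_2_1' into
# its base name and (H, V, R) positional indices. Names without a positional
# suffix are returned with position None. Assumes the _H_V_R convention.
import re

_VAR_RE = re.compile(r"([A-Za-z][A-Za-z0-9]*(?:_[A-Za-z]+)*)_(\d+)_(\d+)_(\d+)")

def parse_flux_variable(name):
    m = _VAR_RE.fullmatch(name)
    if not m:
        return name, None
    base, h, v, r = m.groups()
    return base, (int(h), int(v), int(r))
```

    For example, `SW_IN_1_1_1` parses to base `SW_IN` at position (1, 1, 1), tying the data column back to the instrumentation description in the BADM templates.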

  7. Technical drilling data acquisition and processing with an integrated computer system

    SciTech Connect

    Chevallier, J.J.; Quetier, F.P.; Marshall, D.W.

    1986-04-01

    Sedco Forex has developed an integrated computer system to enhance the technical performance of the company at various operational levels and to increase the understanding and knowledge of the drill crews. This paper describes the system and how it is used for recording and processing drilling data at the rig site, for associated technical analyses, and for well design, planning, and drilling performance studies at the operational centers. Some capabilities related to the statistical analysis of the company's operational records are also described, and future development of rig computing systems for drilling applications and management tasks is discussed.

  8. Identification of potential biases in the characterization sampling and analysis process

    SciTech Connect

    Winkelman, W.D.; Eberlein, S.J.

    1995-12-01

    The Tank Waste Remediation System (TWRS) Characterization Project is responsible for providing quality characterization data to TWRS. Documentation of sampling and analysis process errors and biases can be used to improve the process to provide that data. The sampling and analysis process consists of removing a sample from a specified waste tank, getting it to the laboratory and analyzing it to provide the data identified in the Tank Characterization Plan (TCP) and Sampling and Analysis Plan (SAP). To understand the data fully, an understanding of errors or biases that can be generated during the process is necessary. Most measurement systems have the ability statistically to detect errors and biases by using standards and alternate measurement techniques. Only the laboratory analysis part of the tank sampling and analysis process at TWRS has this ability. Therefore, it is necessary to use other methods to identify and prioritize the biases involved in the process.
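    The statistical bias detection available in the laboratory-analysis step amounts to comparing repeated measurements of a known standard against its reference value. A minimal sketch of that comparison, with illustrative numbers (not a TWRS procedure):

```python
# Sketch: estimating measurement bias from repeated analyses of a standard
# with a known reference value - the kind of check a laboratory measurement
# system can do but upstream sampling steps typically cannot.

def estimate_bias(measurements, reference):
    """Return (bias, sample standard deviation) of measurements vs. reference."""
    n = len(measurements)
    mean = sum(measurements) / n
    bias = mean - reference
    var = sum((x - mean) ** 2 for x in measurements) / (n - 1)
    return bias, var ** 0.5
```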

  9. NREL Develops Accelerated Sample Activation Process for Hydrogen Storage Materials (Fact Sheet)

    SciTech Connect

    Not Available

    2010-12-01

    This fact sheet describes NREL's accomplishments in developing a new sample activation process that reduces the time to prepare samples for measurement of hydrogen storage from several days to five minutes and provides more uniform samples. Work was performed by NREL's Chemical and Materials Science Center.

  10. Acquisition and processing pitfall with clipped traces in surface-wave analysis

    NASA Astrophysics Data System (ADS)

    Gao, Lingli; Pan, Yudi

    2016-02-01

    Multichannel analysis of surface waves (MASW) is widely used in estimating near-surface shear (S)-wave velocity. In the MASW method, generating a reliable dispersion image in the frequency-velocity (f-v) domain is an important processing step. A locus along peaks of dispersion energy at different frequencies allows the dispersion curves to be constructed for inversion. When the offsets are short, the output seismic data may exceed the dynamic range of the geophones/seismograph, so that peaks and/or troughs of traces are squared off in the recorded shot gathers. Dispersion images generated from raw shot gathers with clipped traces would be contaminated by artifacts, which might be misidentified as Rayleigh-wave phase velocities or body-wave velocities and potentially lead to incorrect results. We ran synthetic models containing clipped traces and analyzed the amplitude spectra of unclipped and clipped waves. The results indicate that artifacts in the dispersion image are dependent on the level of clipping. A real-world example also shows how clipped traces would affect the dispersion image. All the results suggest that clipped traces should be removed from the shot gathers before generating dispersion images, in order to pick accurate phase velocities and set reasonable initial inversion models.
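    The recommendation to remove clipped traces before generating dispersion images can be automated with a simple check for runs of samples sitting at the recorder's full-scale limit. A sketch of such a check, with assumed threshold values:

```python
# Sketch: flagging a clipped seismic trace by looking for several consecutive
# samples pinned at the recorder's full-scale amplitude (a squared-off peak
# or trough). run_length and tol are assumed values, not from the paper.

def is_clipped(trace, full_scale, run_length=3, tol=1e-6):
    """True if |sample| stays within tol of full_scale for run_length samples in a row."""
    run = 0
    for x in trace:
        if abs(abs(x) - full_scale) <= tol:
            run += 1
            if run >= run_length:
                return True
        else:
            run = 0
    return False
```

    Traces flagged this way would be dropped from the shot gather before the f-v transform, so clipping artifacts never enter the dispersion image.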

  11. Acquisition and Analysis of Dynamic Responses of a Historic Pedestrian Bridge using Video Image Processing

    NASA Astrophysics Data System (ADS)

    O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; O'Donnell, Deirdre; Wright, Robert; Pakrashi, Vikram

    2015-07-01

    Video based tracking is capable of analysing bridge vibrations that are characterised by large amplitudes and low frequencies. This paper presents the use of video images and associated image processing techniques to obtain the dynamic response of a pedestrian suspension bridge in Cork, Ireland. This historic structure is one of the four suspension bridges in Ireland and is notable for its dynamic nature. A video camera is mounted on the river-bank and the dynamic responses of the bridge have been measured from the video images. The dynamic response is assessed without the need of a reflector on the bridge and in the presence of various forms of luminous complexities in the video image scenes. Vertical deformations of the bridge were measured in this regard. The video image tracking for the measurement of dynamic responses of the bridge was based on correlating patches in time-lagged scenes in video images and utilising a zero mean normalised cross correlation (ZNCC) metric. The bridge was excited by designed pedestrian movement and by individual cyclists traversing the bridge. The time series data of dynamic displacement responses of the bridge were analysed to obtain the frequency domain response. Frequencies obtained from video analysis were checked against accelerometer data from the bridge obtained while carrying out the same set of experiments used for video image based recognition.
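    The ZNCC metric used for patch matching can be sketched as follows, with patches given as flat lists of pixel intensities (the sketch assumes non-constant patches so the denominator is nonzero):

```python
# Sketch: zero mean normalised cross correlation (ZNCC) between two
# equal-length patches of pixel intensities. Returns a value in [-1, 1];
# 1 means a perfect match up to brightness offset and contrast scaling.

def zncc(a, b):
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)
```

    Tracking then slides the template patch over a search window in the time-lagged frame and takes the location with the highest ZNCC score as the patch's new position.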

  13. Sample processing considerations for detecting copy number changes in formalin-fixed, paraffin-embedded tissues.

    PubMed

    Jacobs, Sharoni

    2012-11-01

    The Whole Genome Sampling Analysis (WGSA) assay in combination with Affymetrix GeneChip Mapping Arrays is used for copy number analysis of high-quality DNA samples (i.e., samples that have been collected from blood, fresh or frozen tissue, or cell lines). Formalin-fixed, paraffin-embedded (FFPE) samples, however, represent the most prevalent form of archived clinical samples, but they provide additional challenges for molecular assays. FFPE processing usually results in the degradation of FFPE DNA and in the contamination and chemical modification of these DNA samples. Because of these issues, FFPE DNA is not suitable for all molecular assays designed for high-quality DNA samples. Strategies recommended for processing FFPE DNA samples through WGSA and to the Mapping arrays are described here. PMID:23118355

  15. eL-Chem Viewer: A Freeware Package for the Analysis of Electroanalytical Data and Their Post-Acquisition Processing

    PubMed Central

    Hrbac, Jan; Halouzka, Vladimir; Trnkova, Libuse; Vacek, Jan

    2014-01-01

    In electrochemical sensing, a number of voltammetric or amperometric curves are obtained which are subsequently processed, typically by evaluating peak currents and peak potentials or wave heights and half-wave potentials, frequently after background correction. Transformations of voltammetric data can help to extract specific information, e.g., the number of transferred electrons, and can reveal aspects of the studied electrochemical system, e.g., the contribution of adsorption phenomena. In this communication, we introduce a LabView-based software package, ‘eL-Chem Viewer’, which is for the analysis of voltammetric and amperometric data, and enables their post-acquisition processing using semiderivative, semiintegral, derivative, integral and elimination procedures. The software supports the single-click transfer of peak/wave current and potential data to spreadsheet software, a feature that greatly improves productivity when constructing calibration curves, trumpet plots and performing similar tasks. eL-Chem Viewer is freeware and can be downloaded from www.lchem.cz/elchemviewer.htm. PMID:25090415
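    Two of the transformations listed, the derivative and the integral, can be sketched numerically for a voltammogram sampled as potential/current arrays. This is an illustration of the procedures, not the eL-Chem Viewer implementation:

```python
# Sketch: numerical post-acquisition transformations of a voltammogram
# given as potential (E) and current (I) lists: a central-difference
# derivative dI/dE and a cumulative trapezoidal integral (charge-like
# quantity vs. potential).

def derivative(E, I):
    """Central-difference dI/dE at interior points (len = len(E) - 2)."""
    return [(I[k + 1] - I[k - 1]) / (E[k + 1] - E[k - 1])
            for k in range(1, len(E) - 1)]

def cumulative_integral(E, I):
    """Running trapezoidal integral of I over E (len = len(E))."""
    out = [0.0]
    for k in range(1, len(E)):
        out.append(out[-1] + 0.5 * (I[k] + I[k - 1]) * (E[k] - E[k - 1]))
    return out
```

    On a real curve, the derivative sharpens overlapping peaks for peak-potential picking, while the integral converts a wave into a step whose height tracks the wave height.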

  17. Perfluoroalkyl Acid Concentrations in Blood Samples Subjected to Transportation and Processing Delay

    PubMed Central

    Bach, Cathrine Carlsen; Henriksen, Tine Brink; Bossi, Rossana; Bech, Bodil Hammer; Fuglsang, Jens; Olsen, Jørn; Nohr, Ellen Aagaard

    2015-01-01

    Background: In studies of perfluoroalkyl acids, the validity and comparability of measured concentrations may be affected by differences in the handling of biospecimens. We aimed to investigate whether measured plasma levels of perfluoroalkyl acids differed between blood samples subjected to delay and transportation prior to processing and samples with immediate processing and freezing. Methods: Pregnant women recruited at Aarhus University Hospital, Denmark, (n = 88) provided paired blood samples. For each pair of samples, one was immediately processed and plasma was frozen, and the other was delayed and transported as whole blood before processing and freezing of plasma (similar to the Danish National Birth Cohort). We measured 12 perfluoroalkyl acids and present results for compounds with more than 50% of samples above the lower limit of quantification. Results: For samples taken in the winter, relative differences between the paired samples ranged between -77% and +38% for individual perfluoroalkyl acids. In most cases concentrations were lower in the delayed and transported samples, e.g. the relative difference was -29% (95% confidence interval -30; -27) for perfluorooctane sulfonate. For perfluorooctanoate there was no difference between the two setups [corresponding estimate 1% (0, 3)]. Differences were negligible in the summer for all compounds. Conclusions: Transport of blood samples and processing delay, similar to conditions applied in some large, population-based studies, may affect measured perfluoroalkyl acid concentrations, mainly when outdoor temperatures are low. Attention to processing conditions is needed in studies of perfluoroalkyl acid exposure in humans. PMID:26356420
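The paired comparison reported here reduces to a simple relative difference between the delayed and the immediately processed measurement. A trivial sketch (function and variable names are illustrative, not from the study):

```python
def relative_difference(delayed, immediate):
    """Percent difference of a delayed/transported measurement relative
    to its immediately processed pair, as in (delayed - immediate)/immediate."""
    return (delayed - immediate) / immediate * 100.0
```

For example, a delayed value of 0.71 against an immediate value of 1.00 gives the -29% reported for perfluorooctane sulfonate.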

  18. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  19. IECON '87: Signal acquisition and processing; Proceedings of the 1987 International Conference on Industrial Electronics, Control, and Instrumentation, Cambridge, MA, Nov. 3, 4, 1987

    NASA Astrophysics Data System (ADS)

    Niederjohn, Russell J.

    1987-01-01

    Theoretical and applications aspects of signal processing are examined in reviews and reports. Topics discussed include speech processing methods, algorithms, and architectures; signal-processing applications in motor and power control; digital signal processing; signal acquisition and analysis; and processing algorithms and applications. Consideration is given to digital coding of speech algorithms, an algorithm for continuous-time processes in discrete-time measurement, quantization noise and filtering schemes for digital control systems, distributed data acquisition for biomechanics research, a microcomputer-based differential distance and velocity measurement system, velocity observations from discrete position encoders, a real-time hardware image preprocessor, and recognition of partially occluded objects by a knowledge-based system.

  20. Evaluation of process errors in bed load sampling using a dune model

    USGS Publications Warehouse

    Gomez, B.; Troutman, B.M.

    1997-01-01

    Reliable estimates of the streamwide bed load discharge obtained using sampling devices are dependent upon good at-a-point knowledge across the full width of the channel. Using field data and information derived from a model that describes the geometric features of a dune train in terms of a spatial process observed at a fixed point in time, we show that sampling errors decrease as the number of samples collected increases, and the number of traverses of the channel over which the samples are collected increases. It also is preferable that bed load sampling be conducted at a pace which allows a number of bed forms to pass through the sampling cross section. The situations we analyze and simulate pertain to moderate transport conditions in small rivers. In such circumstances, bed load sampling schemes typically should involve four or five traverses of a river, and the collection of 20-40 samples at a rate of five or six samples per hour. By ensuring that spatial and temporal variability in the transport process is accounted for, such a sampling design reduces both random and systematic errors and hence minimizes the total error involved in the sampling process.
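The finding that sampling error shrinks as more samples are spread over several passing bed forms can be illustrated with a Monte Carlo sketch. The dune amplitude, noise level, and period count below are invented for illustration and are not taken from the study:

```python
import math
import random

def rms_error(n_samples, n_trials=2000, seed=1):
    """RMS error of a mean bed load estimate when the true transport rate
    varies periodically (passing dunes) plus random noise (hypothetical values)."""
    rng = random.Random(seed)
    true_mean = 10.0
    sq = 0.0
    for _ in range(n_trials):
        phase = rng.uniform(0.0, 2.0 * math.pi)
        total = 0.0
        for i in range(n_samples):
            t = i / n_samples  # sampling window spans ~3.7 dune periods
            rate = true_mean + 5.0 * math.sin(2.0 * math.pi * 3.7 * t + phase)
            total += rate + rng.gauss(0.0, 2.0)
        sq += (total / n_samples - true_mean) ** 2
    return math.sqrt(sq / n_trials)
```

With these assumptions, estimates from 40 samples show a markedly smaller RMS error than estimates from 5 samples, consistent with the 20-40 sample recommendation.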

  1. NMDA Receptor-Dependent Processes in the Medial Prefrontal Cortex Are Important for Acquisition and the Early Stage of Consolidation during Trace, but Not Delay Eyeblink Conditioning

    ERIC Educational Resources Information Center

    Takehara-Nishiuchi, Kaori; Kawahara, Shigenori; Kirino, Yutaka

    2005-01-01

    Permanent lesions in the medial prefrontal cortex (mPFC) affect acquisition of conditioned responses (CRs) during trace eyeblink conditioning and retention of remotely acquired CRs. To clarify further roles of the mPFC in this type of learning, we investigated the participation of the mPFC in mnemonic processes both during and after daily…

  2. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  3. Since When or How Often? Dissociating the Roles of Age of Acquisition (AoA) and Lexical Frequency in Early Visual Word Processing

    ERIC Educational Resources Information Center

    Adorni, Roberta; Manfredi, Mirella; Proverbio, Alice Mado

    2013-01-01

    The aim of the study was to investigate the effect of both word age of acquisition (AoA) and frequency of occurrence on the timing and topographical distribution of ERP components. The processing of early- versus late-acquired words was compared with that of high-frequency versus low-frequency words. Participants were asked to perform an…

  4. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput remains a major issue and automation is essential. The throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and functional evolution of the classical DCA towards increasing critically needed throughput. PMID:26520383
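Although this record focuses on workflow automation, the underlying DCA dose estimate comes from inverting a linear-quadratic calibration curve, the standard model in cytogenetic biodosimetry. A sketch with hypothetical calibration coefficients (the values below are illustrative, not from the paper or the IAEA manual):

```python
import math

def dose_from_dicentrics(y, alpha, beta, c=0.0):
    """Invert the linear-quadratic calibration Y = c + alpha*D + beta*D**2
    to estimate absorbed dose D (Gy) from an observed dicentric yield Y."""
    # positive root of beta*D^2 + alpha*D + (c - y) = 0
    disc = alpha * alpha - 4.0 * beta * (c - y)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)
```

With the hypothetical coefficients alpha = 0.05, beta = 0.06, c = 0.001, a yield of 0.341 dicentrics per cell inverts to a dose of 2 Gy.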


  6. Variation in the application of natural processes: language-dependent constraints in the phonological acquisition of bilingual children.

    PubMed

    Faingold, E D

    1996-09-01

    This paper studies phonological processes and constraints on early phonological and lexical development, as well as the strategies employed by a young Spanish-, Portuguese-, and Hebrew-speaking child, Nurit (the author's niece), in the construction of her early lexicon. Nurit's linguistic development is compared to that of another Spanish-, Portuguese-, and Hebrew-speaking child, Noam (the author's son). Noam's and Nurit's linguistic development is contrasted with that of Berman's (1977) English- and Hebrew-speaking daughter (Shelli). The simultaneous acquisition of similar (closely related) languages such as Spanish and Portuguese versus that of nonrelated languages such as English and Hebrew yields different results: Children acquiring similar languages seem to prefer maintenance as a strategy for the construction of their early lexicon, while children exposed to nonrelated languages appear to prefer reduction to a large extent (Faingold, 1990). The Spanish- and Portuguese-speaking children's high accuracy stems from a wider choice of target words, where the diachronic development of two closely related languages provides a simplified model lexicon to the child. PMID:8865623

  7. Integrated Processing of High Resolution Topographic Data for Soil Erosion Assessment Considering Data Acquisition Schemes and Surface Properties

    NASA Astrophysics Data System (ADS)

    Eltner, A.; Schneider, D.; Maas, H.-G.

    2016-06-01

    Soil erosion is a decisive earth surface process strongly influencing the fertility of arable land. Several options exist to detect soil erosion at the scale of large field plots (here 600 m²), each with advantages and disadvantages depending on the applied method. In this study, the benefits of unmanned aerial vehicle (UAV) photogrammetry and terrestrial laser scanning (TLS) are exploited to quantify soil surface changes. Before data combination, TLS data is co-registered to the DEMs generated with UAV photogrammetry. TLS data is used to detect global as well as local errors in the DEMs calculated from UAV images. Additionally, TLS data is considered for vegetation filtering. Complementarily, DEMs from UAV photogrammetry are utilised to detect systematic TLS errors and to further filter TLS point clouds in regard to unfavourable scan geometry (i.e. incidence angle and footprint) on gentle hillslopes. In addition, surface roughness is integrated as an important parameter to evaluate TLS point reliability because of the increasing footprint, and thus area of signal reflection, with increasing distance to the scanning device. The developed fusion tool allows for the estimation of reliable data points from each data source, considering the data acquisition geometry and surface properties, to finally merge both data sets into a single soil surface model. Data fusion is performed for three different field campaigns at a Mediterranean field plot. Successive DEM evaluation reveals a continuous decrease of soil surface roughness, reappearance of former wheel tracks and local soil particle relocation patterns.
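The scan-geometry criterion described here (footprint growing with range and incidence angle) can be approximated with a standard first-order formula. This is a generic sketch, not the authors' implementation:

```python
import math

def tls_footprint(range_m, divergence_rad, incidence_rad):
    """Approximate TLS laser footprint size (m) on the target surface: the
    beam widens linearly with range and stretches by 1/cos(incidence)
    on obliquely scanned slopes."""
    return range_m * divergence_rad / math.cos(incidence_rad)
```

For example, a 0.3 mrad beam at 50 m range hitting a surface at 60 degrees incidence already spreads to about 3 cm, which motivates down-weighting long-range, oblique points.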

  8. Dumand-array data-acquisition system

    SciTech Connect

    Brenner, A.E.; Theriot, D.; Dau, W.D.; Geelhood, B.D.; Harris, F.; Learned, J.G.; Stenger, V.; March, R.; Roos, C.; Shumard, E.

    1982-04-01

    An overall data acquisition approach for DUMAND is described. The scheme assumes one array-to-shore optical fiber transmission line for each string of the array. The basic event sampling period is approximately 13 μs. All potentially interesting data are transmitted to shore, where the major processing is performed.

  9. Understanding scaling through history-dependent processes with collapsing sample space

    PubMed Central

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2015-01-01

    History-dependent processes are ubiquitous in natural and social systems. Many such stochastic processes, especially those that are associated with complex systems, become more constrained as they unfold, meaning that their sample space, or their set of possible outcomes, reduces as they age. We demonstrate that these sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes. We show that by adding noise to SSR processes the corresponding rank distributions remain exact power laws, p(x) ∼ x^(−λ), where the exponent directly corresponds to the mixing ratio of the SSR process and noise. This allows us to give a precise meaning to the scaling exponent in terms of the degree to which a given process reduces its sample space as it unfolds. Noisy SSR processes further allow us to explain a wide range of scaling exponents in frequency distributions ranging from α = 2 to ∞. We discuss several applications showing how SSR processes can be used to understand Zipf's law in word frequencies, and how they are related to diffusion processes in directed networks, or aging processes such as fragmentation processes. SSR processes provide a new alternative to understand the origin of scaling in complex systems without recourse to multiplicative, preferential, or self-organized critical processes. PMID:25870294
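The SSR mechanism is easy to simulate. A minimal sketch that reproduces the p(i) ∝ 1/i visit distribution behind Zipf's law:

```python
import random

def ssr_visits(n, runs, seed=0):
    """Simulate sample-space-reducing runs: starting from state n, jump
    uniformly to any strictly lower state until state 1 is reached.
    Returns visit counts per state; frequencies follow p(i) ~ 1/i."""
    rng = random.Random(seed)
    counts = [0] * (n + 1)
    for _ in range(runs):
        state = n
        while state > 1:
            state = rng.randint(1, state - 1)  # sample space shrinks each step
            counts[state] += 1
    return counts
```

State 1 is visited on every run, state 2 on about half the runs, state i on about 1/i of them, which is exactly the Zipf rank distribution the paper derives.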

  10. Collecting, archiving and processing DNA from wildlife samples using FTA® databasing paper

    PubMed Central

    Smith, LM; Burgoyne, LA

    2004-01-01

    Background: Methods involving the analysis of nucleic acids have become widespread in the fields of traditional biology and ecology; however, the storage and transport of samples collected in the field to the laboratory in such a manner as to allow purification of intact nucleic acids can prove problematical. Results: FTA® databasing paper is widely used in human forensic analysis for the storage of biological samples and for purification of nucleic acids. The possible uses of FTA® databasing paper in the purification of DNA from samples of wildlife origin were examined, with particular reference to problems expected due to the nature of samples of wildlife origin. The processing of blood and tissue samples, the possibility of excess DNA in blood samples due to nucleated erythrocytes, and the analysis of degraded samples were all examined, as was the question of long term storage of blood samples on FTA® paper. Examples of the end use of the purified DNA are given for all protocols and the rationale behind the processing procedures is also explained to allow the end user to adjust the protocols as required. Conclusions: FTA® paper is eminently suitable for collection of, and purification of nucleic acids from, biological samples from a wide range of wildlife species. This technology makes the collection and storage of such samples much simpler. PMID:15072582

  11. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally, acquisitions initiatives of the NASA STI program were presented.

  12. DigiFract: A software and data model implementation for flexible acquisition and processing of fracture data from outcrops

    NASA Astrophysics Data System (ADS)

    Hardebol, N. J.; Bertotti, G.

    2013-04-01

    This paper presents the development and use of our new DigiFract software designed for acquiring fracture data from outcrops more efficiently and more completely than done with other methods. Fracture surveys often aim at measuring spatial information (such as spacing) directly in the field. Instead, DigiFract focuses on collecting geometries and attributes and derives spatial information through subsequent analyses. Our primary development goal was to support field acquisition in a systematic digital format and optimized for a varied range of (spatial) analyses. DigiFract is developed using the programming interface of the Quantum Geographic Information System (GIS) with versatile functionality for spatial raster and vector data handling. Among other features, this includes spatial referencing of outcrop photos, and tools for digitizing geometries and assigning attribute information through a graphical user interface. While a GIS typically operates in map-view, DigiFract collects features on a surface of arbitrary orientation in 3D space. This surface is overlain with an outcrop photo and serves as reference frame for digitizing geologic features. Data is managed through a data model and stored in shapefiles or in a spatial database system. Fracture attributes, such as spacing or length, is intrinsic information of the digitized geometry and becomes explicit through follow-up data processing. Orientation statistics, scan-line or scan-window analyses can be performed from the graphical user interface or can be obtained through flexible Python scripts that directly access the fractdatamodel and analysisLib core modules of DigiFract. This workflow has been applied in various studies and enabled a faster collection of larger and more accurate fracture datasets. The studies delivered a better characterization of fractured reservoirs analogues in terms of fracture orientation and intensity distributions. Furthermore, the data organisation and analyses provided more

  13. Unexpected toxicity to aquatic organisms of some aqueous bisphenol A samples treated by advanced oxidation processes.

    PubMed

    Tišler, Tatjana; Erjavec, Boštjan; Kaplan, Renata; Şenilă, Marin; Pintar, Albin

    2015-01-01

    In this study, photocatalytic and catalytic wet-air oxidation (CWAO) processes were used to examine removal efficiency of bisphenol A from aqueous samples over several titanate nanotube-based catalysts. Unexpected toxicity of bisphenol A (BPA) samples treated by means of the CWAO process to some tested species was determined. In addition, the CWAO effluent was recycled five- or 10-fold in order to increase the number of interactions between the liquid phase and catalyst. Consequently, the inductively coupled plasma mass spectrometry (ICP-MS) analysis indicated higher concentrations of some toxic metals like chromium, nickel, molybdenum, silver, and zinc in the recycled samples in comparison to both the single-pass sample and the photocatalytically treated solution. The highest toxicity of five- and 10-fold recycled solutions in the CWAO process was observed in water fleas, which could be correlated to high concentrations of chromium, nickel, and silver detected in tested samples. The obtained results clearly demonstrated that aqueous samples treated by means of advanced oxidation processes should always be analyzed using (i) chemical analyses to assess removal of BPA and total organic carbon from treated aqueous samples, as well as (ii) a battery of aquatic organisms from different taxonomic groups to determine possible toxicity. PMID:26114268

  14. Capillary absorption spectrometer and process for isotopic analysis of small samples

    DOEpatents

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2016-03-29

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable absorption measurements of analytes in a sample gas that may include isotopologues of carbon and oxygen obtained from gas and biological samples. It further provides isotopic images of microbial communities that allow tracking of nutrients at the single cell level. It further targets naturally occurring variations in carbon and oxygen isotopes, which avoids the need for expensive isotopically labeled mixtures and allows study of samples taken from the field without modification. The method also permits sampling in vivo, permitting real-time ambient studies of microbial communities.
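Isotope-ratio results of the kind this instrument targets are conventionally reported in delta notation, the per-mil deviation of a sample ratio from a standard ratio. A one-line sketch (the VPDB value used in the check is the conventional 13C/12C reference ratio; the function name is ours):

```python
def delta_per_mil(r_sample, r_standard):
    """Delta notation for stable isotope ratios (e.g. delta-13C vs VPDB):
    per-mil deviation of the sample ratio from the standard ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0
```

A sample whose ratio is 0.1% above the standard reports as +1 per mil.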

  15. Three-Step Validation of Exercise Behavior Processes of Change in an Adolescent Sample

    ERIC Educational Resources Information Center

    Rhodes, Ryan E.; Berry, Tanya; Naylor, Patti-Jean; Higgins, S. Joan Wharf

    2004-01-01

    Though the processes of change are conceived as the core constructs of the transtheoretical model (TTM), few researchers have examined their construct validity in the physical activity domain. Further, only 1 study was designed to investigate the processes of change in an adolescent sample. The purpose of this study was to examine the exercise…

  16. Effect of histologic processing on dimensions of skin samples obtained from cat cadavers.

    PubMed

    Jeyakumar, Sakthila; Smith, Annette N; Schleis, Stephanie E; Cattley, Russell C; Tillson, D Michael; Henderson, Ralph A

    2015-11-01

    OBJECTIVE To determine changes in dimensions of feline skin samples as a result of histologic processing and to identify factors that contributed to changes in dimensions of skin samples after sample collection. SAMPLE Cadavers of 12 clinically normal cats. PROCEDURES Skin samples were obtained bilaterally from 3 locations (neck, thorax, and tibia) of each cadaver; half of the thoracic samples included underlying muscle. Length, width, and depth were measured at 5 time points (before excision, after excision, after application of ink to mark tissue margins, after fixation in neutral-buffered 10% formalin for 36 hours, and after completion of histologic processing and staining with H&E stain). Measurements obtained after sample collection were compared with measurements obtained before excision. RESULTS At the final time point, tissue samples had decreased in length (mean decrease, 32.40%) and width (mean decrease, 34.21%) and increased in depth (mean increase, 54.95%). Tissue from the tibia had the most shrinkage in length and width and that from the neck had the least shrinkage. Inclusion of underlying muscle on thoracic skin samples did not affect the degree of change in dimensions. CONCLUSIONS AND CLINICAL RELEVANCE In this study, each step during processing from excision to formalin fixation and histologic processing induced changes in tissue dimensions, which were manifested principally as shrinkage in length and width and increase in depth. Most of the changes occurred during histologic processing. Inclusion of muscle did not affect thoracic skin shrinkage. Shrinkage should be a consideration when interpreting surgical margins in clinical cases. PMID:26512538

  17. The Importance of Sample Processing in Analysis of Asbestos Content in Rocks and Soils

    NASA Astrophysics Data System (ADS)

    Neumann, R. D.; Wright, J.

    2012-12-01

    Analysis of asbestos content in rocks and soils using Air Resources Board (ARB) Test Method 435 (M435) involves the processing of samples for subsequent analysis by polarized light microscopy (PLM). The use of different equipment and procedures by commercial laboratories to pulverize rock and soil samples could result in different particle size distributions. It has long been theorized that asbestos-containing samples can be over-pulverized to the point where the particle dimensions of the asbestos no longer meet the required 3:1 length-to-width aspect ratio or the particles become so small that they no longer can be tested for optical characteristics using PLM, where maximum PLM magnification is typically 400X. Recent work has shed some light on this issue. ARB staff conducted an interlaboratory study to investigate variability in preparation and analytical procedures used by laboratories performing M435 analysis. With regard to sample processing, ARB staff found that different pulverization equipment and processing procedures produced powders that have varying particle size distributions. PLM analysis of the finest powders produced by one laboratory showed all but one of the 12 samples were non-detect or below the PLM reporting limit; in contrast to the other 36 coarser samples from the same field sample and processed by three other laboratories, where 21 samples were above the reporting limit. The set of 12 exceptionally fine powder samples produced by the same laboratory was re-analyzed by transmission electron microscopy (TEM) and results showed that these samples contained asbestos above the TEM reporting limit. However, the use of TEM as a stand-alone analytical procedure, usually performed at magnifications between 3,000 to 20,000X, also has its drawbacks because of the miniscule mass of sample that this method examines. The small amount of powder analyzed by TEM may not be representative of the field sample. The actual mass of the sample powder analyzed by

  18. First Language Acquisition and Teaching

    ERIC Educational Resources Information Center

    Cruz-Ferreira, Madalena

    2011-01-01

    "First language acquisition" commonly means the acquisition of a single language in childhood, regardless of the number of languages in a child's natural environment. Language acquisition is variously viewed as predetermined, wondrous, a source of concern, and as developing through formal processes. "First language teaching" concerns schooling in…

  19. Down sampled signal processing for a B Factory bunch-by-bunch feedback system

    SciTech Connect

    Hindi, H.; Hosseini, W.; Briggs, D.; Fox, J.; Hutton, A.

    1992-03-01

    A bunch-by-bunch feedback scheme is studied for damping coupled bunch synchrotron oscillations in the proposed PEP II B Factory. The quasi-linear feedback system design incorporates a phase detector to provide a quantized measure of bunch phase, digital signal processing to compute an error correction signal, and a kicker system to correct the energy of the bunches. A farm of digital processors, operating in parallel, is proposed to compute correction signals for the 1658 bunches of the B Factory. This paper studies the use of down sampled processing to reduce the computational complexity of the feedback system. We present simulation results showing the effect of down sampling on beam dynamics. Results show that down sampled processing can reduce the scale of the processing task by a factor of 10.
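The computational saving from down sampling can be illustrated with a simple block-averaging decimator. This is a generic sketch of the idea, not the proposed PEP II hardware:

```python
def downsample(signal, factor):
    """Block-average decimator: each group of `factor` consecutive samples
    becomes one output sample, cutting the per-sample processing load
    (and the correction-signal update rate) by `factor`."""
    n = len(signal) // factor
    return [sum(signal[i * factor:(i + 1) * factor]) / factor
            for i in range(n)]
```

A feedback processor running on the decimated stream does a tenth of the work when `factor` is 10, at the cost of reduced bandwidth, which is acceptable for slow synchrotron oscillations.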

  20. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY'S (DWPF) PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 7A QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Edwards, T.; Jones, M.; Wiedenman, B.

    2011-03-14

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO{sub 3} acid dissolution (i.e., DWPF Cold Chem Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestions of Sludge Batch 7a (SB7a) SRAT Receipt and SB7a SRAT Product samples. The SB7a SRAT Receipt and SB7a SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB7a batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 6 (SB6), to form the SB7a Blend composition.

  1. Technical Note: Sampling and processing of mesocosm sediment trap material for quantitative biogeochemical analysis

    NASA Astrophysics Data System (ADS)

    Boxhammer, T.; Bach, L. T.; Czerny, J.; Riebesell, U.

    2015-11-01

    Sediment traps are the most common tool to investigate vertical particle flux in the marine realm. However, the spatial decoupling between particle formation and collection often handicaps reconciliation of these two processes, even within the euphotic zone. Pelagic mesocosms have the advantage of being closed systems and are therefore ideally suited to study how processes in natural plankton communities influence particle formation and settling in the ocean's surface. We therefore developed a protocol for efficient sample recovery and processing of quantitatively collected pelagic mesocosm sediment trap samples. Sedimented material was recovered by pumping it under gentle vacuum through a silicon tube to the sea surface. The particulate matter of these samples was subsequently concentrated by passive settling, centrifugation or flocculation with ferric chloride, and we discuss the advantages of each approach. After concentration, samples were freeze-dried and ground with an easy-to-adapt procedure using standard lab equipment. Grain size of the finely ground samples ranges from fine to coarse silt (2-63 μm), which guarantees homogeneity for representative subsampling, a widespread problem in sediment trap research. Subsamples of the ground material were perfectly suitable for a variety of biogeochemical measurements, and even at very low particle fluxes we were able to gain detailed insight into various parameters characterizing the sinking particles. The methods and recommendations described here are a key improvement for sediment trap applications in mesocosms, as they facilitate processing of large numbers of samples and allow for high-quality biogeochemical flux data.

  2. Sampling plan optimization for detection of lithography and etch CD process excursions

    NASA Astrophysics Data System (ADS)

    Elliott, Richard C.; Nurani, Raman K.; Lee, Sung Jin; Ortiz, Luis G.; Preil, Moshe E.; Shanthikumar, J. G.; Riley, Trina; Goodwin, Greg A.

    2000-06-01

    Effective sample planning requires a careful combination of statistical analysis and lithography engineering. In this paper, we present a complete sample planning methodology including baseline process characterization, determination of the dominant excursion mechanisms, and selection of sampling plans and control procedures to effectively detect the yield-limiting excursions with a minimum of added cost. We discuss the results of our novel method in identifying critical dimension (CD) process excursions and present several examples of poly gate Photo and Etch CD excursion signatures. Using these results in a Sample Planning model, we determine the optimal sample plan and statistical process control (SPC) chart metrics and limits for detecting these excursions. The key observations are that there are many different yield-limiting excursion signatures in photo and etch, and that a given photo excursion signature turns into a different excursion signature at etch with different yield and performance impact. In particular, field-to-field variance excursions are shown to have a significant impact on yield. We show how current sampling plan and monitoring schemes miss these excursions and suggest an improved procedure for effective detection of CD process excursions.
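
    The observation that conventional monitoring can miss field-to-field variance excursions can be sketched with a pair of control charts: an X-bar chart on the lot mean and an s chart on the within-lot standard deviation. The CD target, sigmas, lot counts, and excursion values below are all hypothetical, chosen only to illustrate the mechanism.

```python
import math
import random

random.seed(7)

def mean(xs):
    return sum(xs) / len(xs)

def stdev(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

# Baseline: 50 lots, 9 CD measurements (fields) each, target 180 nm, sigma 2 nm.
baseline = [[random.gauss(180.0, 2.0) for _ in range(9)] for _ in range(50)]
lot_means = [mean(lot) for lot in baseline]
lot_sds = [stdev(lot) for lot in baseline]

# Illustrative 3-sigma control limits derived from the baseline.
mean_ucl = mean(lot_means) + 3 * stdev(lot_means)
mean_lcl = mean(lot_means) - 3 * stdev(lot_means)
sd_ucl = mean(lot_sds) + 3 * stdev(lot_sds)

# Excursion lot (fixed values): mean exactly on target at 180 nm,
# but field-to-field spread far above baseline.
excursion = [168.0, 192.0, 170.0, 190.0, 172.0, 188.0, 174.0, 186.0, 180.0]
mean_chart_flags = not (mean_lcl <= mean(excursion) <= mean_ucl)
sd_chart_flags = stdev(excursion) > sd_ucl
print(mean_chart_flags, sd_chart_flags)  # False True
```

    Only the s chart trips: a mean-only monitoring scheme would pass this lot, matching the paper's point that field-to-field variance excursions go undetected unless a variance metric is charted.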

  3. How can wireless, mobile data acquisition be used for taking part of the lab to the sample, and how can it join the internet of things?

    NASA Astrophysics Data System (ADS)

    Trzcinski, Peter; Karanassios, Vassili

    2016-05-01

    During the last several years, the world has moved from wired communications (e.g., wired Ethernet, wired telephone) to wireless communications (e.g., cell phones, smart phones, tablets). However, data acquisition has lagged behind and, for the most part, data in laboratory settings are still acquired using wired communications (or even plug-in boards). In this paper, approaches that can be used for wireless data acquisition are briefly discussed using a conceptual model of a future, mobile, portable micro-instrument as an example. In addition, past, present and near-future generations of communications are discussed; processors, operating systems and benchmarks are reviewed; networks that may be used for data acquisition in the field are examined; and the possibility of connecting sensor or micro-instrument networks to the internet of things is postulated.

  4. Energy from true in situ processing of Antrim shale: sampling and analytical systems

    SciTech Connect

    Pihlaja, R.K.

    1980-08-01

    Reliable on-line analysis of production gas composition is fundamental to the success of an in situ extraction experiment in Antrim shale. An automated sampling and analysis system designed to meet this need has provided high quality analytical data for three extraction trials without a single day when no data were taken. The production gas samples were routinely analyzed by both gas chromatography (GC) and a bank of continuous on-line process gas analyzers. The GCs analyzed for H/sub 2/, O/sub 2/ + Ar, N/sub 2/, CO, CO/sub 2/, SO/sub 2/, H/sub 2/S, individual C/sub 1/ - C/sub 5/ hydrocarbon species, and lumped C/sub 6/ + hydrocarbon species, each analysis requiring up to an hour to run. The process gas analyzers measured CO, CO/sub 2/, total hydrocarbons (% vol CH/sub 4/ equivalent), and O/sub 2/ continuously. The process gas analyzers were shown to be especially well suited for this application because of their fast response. The GC data provided itemized composition details as well as an independent check of process analyzer data. Sample selection, data collection and processing from both the GCs and process gas analyzers were handled by a Perkin Elmer Sigma-10 minicomputer. The combination of the two analytical techniques and automated data handling yielded a versatile and powerful system. The production gas sampling system demonstrated the feasibility of transmitting a properly treated gas sample through a long (1000 ft) 1/8'' diameter sample line. The small bore tubing allowed the analytical instruments to be located a safe distance away from the well heads and yet maintain a reasonably short sample transport lag time without handling large volumes of gas.

  5. Random Sampling Process Leads to Overestimation of β-Diversity of Microbial Communities

    PubMed Central

    Zhou, Jizhong; Jiang, Yi-Huei; Deng, Ye; Shi, Zhou; Zhou, Benjamin Yamin; Xue, Kai; Wu, Liyou; He, Zhili; Yang, Yunfeng

    2013-01-01

    ABSTRACT The site-to-site variability in species composition, known as β-diversity, is crucial to understanding spatiotemporal patterns of species diversity and the mechanisms controlling community composition and structure. However, quantifying β-diversity in microbial ecology using sequencing-based technologies is a great challenge because of high numbers of sequencing errors, bias, and poor reproducibility and quantification. Herein, based on general sampling theory, a mathematical framework is first developed for simulating the effects of random sampling processes on quantifying β-diversity when the community size is known or unknown. Also, using an analogous ball example under Poisson sampling with limited sampling efforts, the developed mathematical framework can exactly predict the low reproducibility among technically replicate samples from the same community of a certain species abundance distribution, which provides explicit evidence of random sampling processes as the main factor causing high percentages of technical variations. In addition, the predicted values under Poisson random sampling were highly consistent with the observed low percentages of operational taxonomic unit (OTU) overlap (<30% and <20% for two and three tags, respectively, based on both Jaccard and Bray-Curtis dissimilarity indexes), further supporting the hypothesis that the poor reproducibility among technical replicates is due to the artifacts associated with random sampling processes. Finally, a mathematical framework was developed for predicting sampling efforts to achieve a desired overlap among replicate samples. Our modeling simulations predict that several orders of magnitude more sequencing efforts are needed to achieve desired high technical reproducibility. These results suggest that great caution needs to be taken in quantifying and interpreting β-diversity for microbial community analysis using next-generation sequencing technologies. PMID:23760464
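
    The random-sampling effect described above is easy to reproduce with a small simulation: two technical "replicates" drawn from the identical community share far fewer OTUs than intuition suggests. The lognormal abundance distribution, community size, and sequencing depth below are assumed for illustration and are not the values or the framework used in the study.

```python
import bisect
import math
import random

random.seed(42)

# Hypothetical community: 5000 OTUs with lognormally distributed abundances
# (illustrative parameters, not fitted to any real data set).
abund = [math.exp(random.gauss(0.0, 2.0)) for _ in range(5000)]
total = sum(abund)
cum, running = [], 0.0
for a in abund:
    running += a / total
    cum.append(running)

def sequence_once(depth):
    """Model one sequencing run as `depth` multinomial draws from the community."""
    counts = [0] * len(cum)
    for _ in range(depth):
        i = min(bisect.bisect_left(cum, random.random()), len(cum) - 1)
        counts[i] += 1
    return counts

rep1, rep2 = sequence_once(20000), sequence_once(20000)
seen1 = {i for i, n in enumerate(rep1) if n}
seen2 = {i for i, n in enumerate(rep2) if n}
jaccard = len(seen1 & seen2) / len(seen1 | seen2)
# Identical community, yet the replicates overlap only partially: the
# "missing" OTUs are an artifact of random sampling, not real turnover.
print(jaccard < 1.0)  # True
```

    Rare OTUs are sampled into one replicate but not the other purely by chance, so presence/absence-based dissimilarity indices report β-diversity between samples of the very same community.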

  6. Studies on the Electro-Polishing process with Nb sample plates at KEK

    SciTech Connect

    Saeki, Takayuki; Funahashi, Y.; Hayano, Hitoshi; Kato, Seigo; Nishiwaki, Michiru; Sawabe, Motoaki; Ueno, Kenji; Watanabe, K.; Clemens, William A.; Geng, Rongli; Manus, Robert L.; Tyagi, Puneet

    2009-11-01

    In this article, two subjects are described. The first is the production of stains on the surface of Nb sample plates during the electro-polishing (EP) process, and the second is the development of defects/pits on the surface of an Nb sample plate during the EP process. Recently, some 9-cell cavities were treated with new EP acid at KEK, and the performance of these cavities was limited by heavy field emission. On the inside surface of these cavities, brown stains were observed. We made an effort to reproduce the brown stains on Nb sample plates with a laboratory EP setup, varying the concentration of niobium in the EP acid. We found that the brown stains appeared only when processed with new EP acid. For the second subject, we made artificial pits on the surface of an Nb sample plate and observed the development of the pits after each step of a 30 um EP process, in which 120 um was removed in total. This article describes this series of EP tests with Nb sample plates at KEK.

  7. Extended Characterization of Chemical Processes in Hot Cells Using Environmental Swipe Samples

    SciTech Connect

    Olsen, Khris B.; Mitroshkov, Alexandre V.; Thomas, M-L; Lepel, Elwood A.; Brunson, Ronald R.; Ladd-Lively, Jennifer

    2012-09-15

    Environmental sampling is used extensively by the International Atomic Energy Agency (IAEA) for verification of information from State declarations or a facility's design regarding nuclear activities occurring within the country or a specific facility. Environmental sampling of hot cells within a facility under safeguards is conducted using 10.2 cm x 10.2 cm cotton or cellulose swipes. Traditional target analytes used by the IAEA to verify operations within a facility include a select list of gamma-emitting radionuclides and total and isotopic U and Pu. Analysis of environmental swipe samples collected within a hot-cell facility where chemical processing occurs may also provide information regarding specific chemicals used in fuel processing. However, using swipe material to elucidate what specific chemical processes were/are being used within a hot cell has not been previously evaluated. Staff from Pacific Northwest National Laboratory (PNNL) and Oak Ridge National Laboratory (ORNL) teamed to evaluate the potential use of environmental swipe samples as collection media for volatile and semivolatile organic compounds. This evaluation was initiated with sample collection during a series of Coupled End-to-End (CETE) reprocessing runs at ORNL. The study included measurement of gamma-emitting radionuclides, total and isotopic U and Pu, and volatile and semivolatile organic compounds. These results allowed us to elucidate the chemical processes used in the hot cells during the CETE reprocessing runs and to identify other legacy chemicals used in hot cell operations that predate the CETE process.

  8. The Effect of Age of Second Language Acquisition on the Representation and Processing of Second Language Words

    ERIC Educational Resources Information Center

    Silverberg, Stu; Samuel, Arthur G.

    2004-01-01

    In this study, the effects of second language (i.e., L2) proficiency and age of second language acquisition are assessed. Three types of bilinguals are compared: Early L2 learners, Late highly proficient L2 learners, and Late less proficient L2 learners. A lexical decision priming paradigm is used in which the critical trials consist of first…

  9. The Symbolic World of the Bilingual Child: Digressions on Language Acquisition, Culture and the Process of Thinking

    ERIC Educational Resources Information Center

    Nowak-Fabrykowski, Krystyna; Shkandrij, Miroslav

    2004-01-01

    In this paper we explore the relationship between language acquisition, and the construction of a symbolic world. According to Bowers (1989) language is a collection of patterns regulating social life. This conception is close to that of Symbolic Interactionists (Charon, 1989) who see society as made up of interacting individuals who are symbol…

  10. The Influence of Type and Token Frequency on the Acquisition of Affixation Patterns: Implications for Language Processing

    ERIC Educational Resources Information Center

    Endress, Ansgar D.; Hauser, Marc D.

    2011-01-01

    Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…

  11. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health. PMID:24792566

  13. Microfluidic solutions enabling continuous processing and monitoring of biological samples: A review.

    PubMed

    Karle, Marc; Vashist, Sandeep Kumar; Zengerle, Roland; von Stetten, Felix

    2016-07-27

    The last decade has witnessed tremendous advances in employing microfluidic solutions enabling Continuous Processing and Monitoring of Biological Samples (CPMBS), which is an essential requirement for the control of bio-processes. The microfluidic systems are superior to the traditional inline sensors due to their ability to implement complex analytical procedures, such as multi-step sample preparation, and enabling the online measurement of parameters. This manuscript provides a background review of microfluidic approaches employing laminar flow, hydrodynamic separation, acoustophoresis, electrophoresis, dielectrophoresis, magnetophoresis and segmented flow for the continuous processing and monitoring of biological samples. The principles, advantages and limitations of each microfluidic approach are described along with its potential applications. The challenges in the field and the future directions are also provided. PMID:27251944

  14. Evaluation of the effects of anatomic location, histologic processing, and sample size on shrinkage of skin samples obtained from canine cadavers.

    PubMed

    Reagan, Jennifer K; Selmic, Laura E; Garrett, Laura D; Singh, Kuldeep

    2016-09-01

    OBJECTIVE To evaluate effects of anatomic location, histologic processing, and sample size on shrinkage of excised canine skin samples. SAMPLE Skin samples from 15 canine cadavers. PROCEDURES Elliptical samples of the skin, underlying subcutaneous fat, and muscle fascia were collected from the head, hind limb, and lumbar region of each cadaver. Two samples (10 mm and 30 mm) were collected at each anatomic location of each cadaver (one from the left side and the other from the right side). Measurements of length, width, depth, and surface area were collected prior to excision (P1) and after fixation in neutral-buffered 10% formalin for 24 to 48 hours (P2). Length and width were also measured after histologic processing (P3). RESULTS Length and width decreased significantly at all anatomic locations and for both sample sizes at each processing stage. Hind limb samples had the greatest decrease in length, compared with results for samples obtained from other locations, across all processing stages for both sample sizes. The 30-mm samples had a greater percentage change in length and width between P1 and P2 than did the 10-mm samples. Histologic processing (P2 to P3) had a greater effect on the percentage shrinkage of 10-mm samples. For all locations and both sample sizes, percentage change between P1 and P3 ranged from 24.0% to 37.7% for length and 18.0% to 22.8% for width. CONCLUSIONS AND CLINICAL RELEVANCE Histologic processing, anatomic location, and sample size affected the degree of shrinkage of a canine skin sample from excision to histologic assessment. PMID:27580116
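
    The reported shrinkage percentages follow from a simple before/after comparison. A sketch with hypothetical measurements (not the study's data) falling inside the reported 24.0-37.7% length-shrinkage range:

```python
def pct_shrinkage(before, after):
    """Percentage decrease from pre-excision (P1) to post-processing (P3)."""
    return 100.0 * (before - after) / before

# Hypothetical 30 mm skin sample measuring 20.1 mm after histologic processing:
print(round(pct_shrinkage(30.0, 20.1), 1))  # 33.0
```

    In practice this means a surgeon or pathologist comparing in-situ margins to slide measurements must account for roughly a quarter to a third of the length disappearing between excision and histologic assessment.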

  15. Impact of processing method on recovery of bacteria from wipes used in biological surface sampling.

    PubMed

    Downey, Autumn S; Da Silva, Sandra M; Olson, Nathan D; Filliben, James J; Morrow, Jayne B

    2012-08-01

    Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense.

  16. Leukopak PBMC Sample Processing for Preparing Quality Control Material to Support Proficiency Testing Programs

    PubMed Central

    Garcia, Ambrosia; Keinonen, Sarah; Sanchez, Ana M.; Ferrari, Guido; Denny, Thomas N.; Moody, M. Anthony

    2014-01-01

    External proficiency testing programs designed to evaluate the performance of end-point laboratories involved in vaccine and therapeutic clinical trials form an important part of clinical trial quality assurance. Good Clinical Laboratory Practice (GCLP) guidelines recommend both assay validation and proficiency testing for assays being used in clinical trials, and such testing is facilitated by the availability of large numbers of well-characterized test samples. These samples can be distributed to laboratories participating in these programs and allow monitoring of laboratory performance over time and among participating sites when results are obtained with samples derived from a large master set. The leukapheresis procedure provides an ideal way to collect samples from participants that can meet the required number of cells to support these activities. The collection and processing of leukapheresis samples requires tight coordination between the clinical and laboratory teams to collect, process, and cryopreserve large numbers of samples within the established ideal time of ≤8 hours. Here, we describe our experience with a leukapheresis cryopreservation program that has been able to preserve the functionality of cellular subsets and that provides the sample numbers necessary to run an external proficiency testing program. PMID:24928650

  17. Energy from true in situ processing of Antrim shale: Sampling and analytical systems

    NASA Astrophysics Data System (ADS)

    Pihlaja, R. K.

    1980-08-01

    Reliable on-line analysis of production gas composition is fundamental to the success of an in situ extraction experiment in Antrim shale. An automated sampling and analysis system designed to meet this need provided high quality analytical data for three extraction trials without a single day when no data were taken. The production gas samples were routinely analyzed by both gas chromatography (GC) and a bank of continuous on-line process gas analyzers. The process gas analyzers measured CO, CO2, total hydrocarbons and O2 continuously. The process gas analyzers were shown to be especially well suited for this application because of their fast response. The GC data provided itemized composition details as well as an independent check of process analyzer data. The combination of the two analytical techniques and automated data handling yielded a versatile and powerful system.

  18. A unified method to process biosolids samples for the recovery of bacterial, viral, and helminth pathogens.

    PubMed

    Alum, Absar; Rock, Channah; Abbaszadegan, Morteza

    2014-01-01

    For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in residual biosolids. The current EPA methods for the detection of these groups of pathogens in biosolids include discrete steps. Therefore, a separate sample is processed independently to quantify the number of each group of pathogens in biosolids. The aim of the study was to develop a unified method for simultaneous processing of a single biosolids sample to recover bacterial, viral, and helminth pathogens. In the first stage of developing a simultaneous method, nine eluents were compared for their efficiency to recover viruses from a 100 g spiked biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments were performed to optimize performance of the glycine-based eluent under various procedural factors such as solids-to-eluent ratio, stir time, and centrifugation conditions. Lastly, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked in duplicate samples of biosolids collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method compared to 34% and 68% by the EPA method, respectively. The unified sample processing method significantly reduces the time required for processing biosolids samples for different groups of pathogens; it is less affected by the intrinsic variability of samples, while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.

  19. Faulting processes in active faults - Evidences from TCDP and SAFOD drill core samples

    SciTech Connect

    Janssen, C.; Wirth, R.; Wenk, H. -R.; Morales, L.; Naumann, R.; Kienast, M.; Song, S. -R.; Dresen, G.

    2014-08-20

    The microstructures, mineralogy and chemistry of representative samples collected from the cores of the San Andreas Fault drill hole (SAFOD) and the Taiwan Chelungpu-Fault Drilling project (TCDP) have been studied using optical microscopy, TEM, SEM, XRD and XRF analyses. SAFOD samples provide a transect across undeformed host rock, the fault damage zone and currently active deforming zones of the San Andreas Fault. TCDP samples are retrieved from the principal slip zone (PSZ) and from the surrounding damage zone of the Chelungpu Fault. Substantial differences exist in the clay mineralogy of SAFOD and TCDP fault gouge samples. Amorphous material has been observed in SAFOD as well as TCDP samples. In line with previous publications, we propose that melt, observed in TCDP black gouge samples, was produced by seismic slip (melt origin) whereas amorphous material in SAFOD samples was formed by comminution of grains (crush origin) rather than by melting. Dauphiné twins in quartz grains of SAFOD and TCDP samples may indicate high seismic stress. The differences in the crystallographic preferred orientation of calcite between SAFOD and TCDP samples are significant. Microstructures resulting from dissolution–precipitation processes were observed in both faults but are more frequently found in SAFOD samples than in TCDP fault rocks. As already described for many other fault zones, clay-gouge fabrics are quite weak in SAFOD and TCDP samples. Clay-clast aggregates (CCAs), proposed to indicate frictional heating and thermal pressurization, occur in material taken from the PSZ of the Chelungpu Fault, as well as within and outside of the SAFOD deforming zones, indicating that these microstructures were formed over a wide range of slip rates.

  20. A novel approach to process carbonate samples for radiocarbon measurements with helium carrier gas

    NASA Astrophysics Data System (ADS)

    Wacker, L.; Fülöp, R.-H.; Hajdas, I.; Molnár, M.; Rethemeyer, J.

    2013-01-01

    Most laboratories prepare carbonate samples for radiocarbon analysis by acid decomposition in evacuated glass tubes and subsequent reduction of the evolved CO2 to graphite in self-made reduction manifolds. This process is time consuming and labor intensive. In this work, we have tested a new approach for the preparation of carbonate samples, in which any high-vacuum system is avoided and helium is used as a carrier gas. The liberation of CO2 from carbonates with phosphoric acid is performed in a similar way as is often done in stable isotope ratio mass spectrometry, where CO2 is released with acid in a septum-sealed tube under a helium atmosphere. The formed CO2 is later flushed in a helium flow by means of a double-walled needle from the tubes to the zeolite trap of the automated graphitization equipment (AGE). It essentially replaces the elemental analyzer normally used for the combustion of organic samples. The process can be fully automated, from sampling the released CO2 in the septum-sealed tubes with a commercially available auto-sampler to graphitization with the AGE. The new method yields low sample blanks of about 50,000 years. Results of processed reference materials (IAEA-C2, FIRI-C) are in agreement with their consensus values.
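
    The quoted blank of "about 50,000 years" can be translated to a contamination level using the conventional radiocarbon age relation t = -8033 · ln(F14C), where 8033 yr is the Libby mean life (Stuiver-Polach convention): such a blank corresponds to only ~0.2% modern carbon surviving the preparation. A quick check:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years, conventional radiocarbon age scale

def age_from_f14c(f14c):
    """Conventional radiocarbon age (yr BP) from fraction modern carbon."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

def f14c_from_age(age):
    """Fraction modern carbon corresponding to a given conventional age (yr BP)."""
    return math.exp(-age / LIBBY_MEAN_LIFE)

# A ~50,000-year blank means only ~0.2% of a modern carbon signal:
print(round(f14c_from_age(50000), 4))  # 0.002
print(round(age_from_f14c(0.002)))     # 49922
```

    This is why blank age is a meaningful figure of merit for a preparation line: it bounds the oldest sample ages that can be measured before the procedural carbon background dominates.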

  1. Expanding the application of the tablet processing workstation to support the sample preparation of oral suspensions.

    PubMed

    Opio, Alex Manuel; Nickerson, Beverly; Xue, Gang; Warzeka, John; Norris, Ken

    2011-06-01

    Sample preparation is the most time-consuming part of the analytical method for powder for oral suspension (POS) assay, purity, and preservative analysis, as this involves multiple dilution and filtration steps. The Tablet Processing Workstation (TPW) was used to automate the sample preparation of a POS formulation. Although the TPW is typically used to automate the preparation of solid oral dosage forms and powders, it contains all of the necessary components to perform POS sample preparation. The TPW exhibited acceptable repeatability in testing 3 lots using 10 replicate preparations per lot. Acceptable linearity of the drug and preservative in the presence of excipients was demonstrated over the range corresponding to 50-150% of intent. Accuracy showed suitable recoveries for all points evaluated. TPW results were shown to correlate to results obtained with the manual method. The TPW method was used to prepare samples in support of manufacturing scale-up efforts. With the efficiencies gained using the TPW, it was possible to analyze a large number of samples generated during process development activities for the POS formulation with minimal human intervention. The extensive data enabled trending of the manufacturing development runs and helped to identify optimization strategies for the process.

  2. Aqueous Processing of Atmospheric Organic Particles in Cloud Water Collected via Aircraft Sampling.

    PubMed

    Boone, Eric J; Laskin, Alexander; Laskin, Julia; Wirth, Christopher; Shepson, Paul B; Stirm, Brian H; Pratt, Kerri A

    2015-07-21

    Cloudwater and below-cloud atmospheric particle samples were collected onboard a research aircraft during the Southern Oxidant and Aerosol Study (SOAS) over a forested region of Alabama in June 2013. The organic molecular composition of the samples was studied to gain insights into the aqueous-phase processing of organic compounds within cloud droplets. High resolution mass spectrometry (HRMS) with nanospray desorption electrospray ionization (nano-DESI) and direct infusion electrospray ionization (ESI) were utilized to compare the organic composition of the particle and cloudwater samples, respectively. Isoprene and monoterpene-derived organosulfates and oligomers were identified in both the particles and cloudwater, showing the significant influence of biogenic volatile organic compound oxidation above the forested region. While the average O:C ratios of the organic compounds were similar between the atmospheric particle and cloudwater samples, the chemical composition of these samples was quite different. Specifically, hydrolysis of organosulfates and formation of nitrogen-containing compounds were observed for the cloudwater when compared to the atmospheric particle samples, demonstrating that cloud processing changes the composition of organic aerosol. PMID:26068538
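
    The O:C ratios compared above are computed from the elemental formulas assigned to each HRMS peak. A minimal formula parser is sketched below; the example formula C5H12O7S is a commonly reported isoprene-derived organosulfate composition, used here purely for illustration rather than taken from the study's peak list.

```python
import re
from collections import Counter

def element_counts(formula):
    """Count atoms in a simple molecular formula such as 'C5H12O7S'
    (no parentheses; sufficient for typical CHONS peak assignments)."""
    counts = Counter()
    for elem, num in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
        counts[elem] += int(num) if num else 1
    return counts

def o_to_c(formula):
    """Oxygen-to-carbon ratio, a standard oxidation proxy for organic aerosol."""
    atoms = element_counts(formula)
    return atoms["O"] / atoms["C"]

print(round(o_to_c("C5H12O7S"), 2))  # 1.4
```

    Averaging such per-formula ratios over all assigned peaks (often intensity-weighted) gives the bulk O:C values that the paper compares between the particle and cloudwater samples.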

  3. Aqueous Processing of Atmospheric Organic Particles in Cloud Water Collected via Aircraft Sampling

    SciTech Connect

    Boone, Eric J.; Laskin, Alexander; Laskin, Julia; Wirth, Christopher; Shepson, Paul B.; Stirm, Brian H.; Pratt, Kerri A.

    2015-07-21

    Cloud water and below-cloud atmospheric particle samples were collected onboard a research aircraft during the Southern Oxidant and Aerosol Study (SOAS) over a forested region of Alabama in June 2013. The organic molecular composition of the samples was studied to gain insights into the aqueous-phase processing of organic compounds within cloud droplets. High resolution mass spectrometry with nanospray desorption electrospray ionization and direct infusion electrospray ionization were utilized to compare the organic composition of the particle and cloud water samples, respectively. Isoprene and monoterpene-derived organosulfates and oligomers were identified in both the particles and cloud water, showing the significant influence of biogenic volatile organic compound oxidation above the forested region. While the average O:C ratios of the organic compounds were similar between the atmospheric particle and cloud water samples, the chemical composition of these samples was quite different. Specifically, hydrolysis of organosulfates and formation of nitrogen-containing compounds were observed for the cloud water when compared to the atmospheric particle samples, demonstrating that cloud processing changes the composition of organic aerosol.

  5. A real-time imaging system for rapid processing of radioactive DNA samples

    NASA Astrophysics Data System (ADS)

    McGann, W. J.; McConchie, L.; Entine, G.

    1990-12-01

    A new, high-resolution nuclear-imaging detector system is described which substantially improves the speed of detection of radioactively labeled DNA samples. Ultimately this system will be made compatible with a fully automated DNA processing system to aid in the isolation and harvesting of DNA clones in the human genome.

  6. Influence of Sampling and Comparison Processes on the Development of Communication Effectiveness.

    ERIC Educational Resources Information Center

    Asher, Steven R.

    1972-01-01

    Research was conducted to determine the degree to which sampling vs. comparison processes account for age changes in referential communication. Children at three different grade levels communicated referents within related and unrelated word pairs. The word pairs used were from Cohen and Klein (1968). From their list of 30 related pairs, half were…

  7. Sampling and Hydrogeology of the Vadose Zone Beneath the 300 Area Process Ponds

    SciTech Connect

    Bjornstad, Bruce N.

    2004-08-31

    Four open pits were dug with a backhoe into the vadose zone beneath the former 300 Area Process Ponds in April 2003. Samples were collected about every 2 feet for physical, chemical, and/or microbiological characterization. This report presents a stratigraphic and geohydrologic summary of the four excavations.

  8. Relation of Childhood Worry to Information-Processing Factors in an Ethnically Diverse Community Sample

    ERIC Educational Resources Information Center

    Suarez-Morales, Lourdes; Bell, Debora

    2006-01-01

    This study examined information-processing variables in relation to worry in a sample of 292 fifth-grade children from Caucasian, African American, and Hispanic backgrounds. Results revealed that worry was related to threat interpretations for hypothetical situations and, when stress level was not controlled, to higher estimates of future…

  9. Elongated styloid process in a temporomandibular disorder sample: prevalence and treatment outcome.

    PubMed

    Zaki, H S; Greco, C M; Rudy, T E; Kubinski, J A

    1996-04-01

    An elongated styloid process is an anatomic anomaly present in 2% to 30% of adults; it is occasionally associated with pain. Its prevalence among patients with classic temporomandibular disorder pain symptoms is unknown. The effect of conservative treatment on patients who have symptoms of temporomandibular disorders and an elongated styloid process is also unknown. The objectives of this study were to determine the prevalence of the elongated styloid process in a sample of patients with temporomandibular disorders and to compare patients with and without the elongated styloid process on initial presenting signs and symptoms and treatment outcome. A total of 100 panoramic radiographs of patients with symptomatic temporomandibular disorders were examined to ascertain the presence or absence of an elongated styloid process. All patients participated in a conservative treatment program of biofeedback and stress management and a flat-plane intraoral appliance. Initial symptoms and treatment outcome of patients with and without an elongated styloid process were compared by use of multivariate analysis of variance on several oral-paraoral and psychosocial-behavioral methods. The prevalence of an elongated styloid process in this clinic sample of temporomandibular disorders was 27%. The patients with or without an elongated styloid process were not significantly different in pretreatment symptoms, and both groups exhibited substantial treatment gains. However, patients with an elongated styloid process showed significantly less improvement on unassisted mandibular opening without pain than did patients who did not have an elongated styloid process. This suggests that an elongated styloid process may place structural limitations on pain-free maximum mandibular opening. The results support conservative management of patients with symptoms of temporomandibular disorders when an elongated styloid process is present.

  10. Proposal for field sampling of plants and processing in the lab for environmental metabolic fingerprinting

    PubMed Central

    2010-01-01

    Background Samples for plant metabolic fingerprinting are prepared generally by metabolism quenching, grinding of plant material and extraction of metabolites in solvents. Further concentration and derivatisation steps follow, depending on the sample nature and the available analytical platform. For plant material sampled in the field, several methods are not applicable, such as collection in liquid nitrogen. Therefore, a protocol was established for sample pre-treatment, grinding, extraction and storage that can be applied to field-collected plant material subsequently processed in the laboratory. Ribwort plantain (Plantago lanceolata L., Plantaginaceae) was used as model plant. The quality criteria for method suitability were high reproducibility, extraction efficiency and handling comfort of each subsequent processing step. Results Highest reproducibility of results was achieved by sampling fresh plant material in a solvent mixture of methanol:dichloromethane (2:1), crushing the tissue with a hand-held disperser and storing the material until further processing. In the laboratory the material was extracted threefold at different pH. The resulting extracts were separated with water (2:1:1 methanol:dichloromethane:water) and the aqueous phases used for analysis by LC-MS, because the polar metabolites were in focus. Chromatograms were compared by calculating a value Ξ for similarities. Advantages and disadvantages of different sample pre-treatment methods, use of solvents and solvent mixtures, influence of pH, extraction frequency and duration, and storing temperature are discussed with regard to the quality criteria. Conclusions The proposed extraction protocol leads to highly reproducible metabolic fingerprints and allows optimal handling of field-collected plant material and further processing in the laboratory, which is demonstrated for an exemplary field data-set. Calculation of Ξ values is a useful tool to judge similarities between

  11. Solar-thermal complex sample processing for nucleic acid based diagnostics in limited resource settings

    PubMed Central

    Gumus, Abdurrahman; Ahsan, Syed; Dogan, Belgin; Jiang, Li; Snodgrass, Ryan; Gardner, Andrea; Lu, Zhengda; Simpson, Kenneth; Erickson, David

    2016-01-01

    The use of point-of-care (POC) devices in limited resource settings, where access to commonly used infrastructure such as water and electricity can be restricted, represents simultaneously one of the best application fits for POC systems as well as one of the most challenging places to deploy them. Of the many challenges involved in these systems, the preparation and processing of complex samples like stool, vomit, and biopsies are particularly difficult due to the high number and varied nature of mechanical and chemical interferents present in the sample. Previously we have demonstrated the ability to use solar-thermal energy to perform PCR based nucleic acid amplifications. In this work we demonstrate how the technique, using similar infrastructure, can also be used to implement a solar-thermal sample processing system for extracting and isolating Vibrio cholerae nucleic acids from fecal samples. The use of opto-thermal energy enables the use of sunlight to drive thermal lysing reactions in large volumes without the need for external electrical power. Using the system we demonstrate the ability to reach a 95°C threshold in less than 5 minutes and maintain a stable sample temperature of ±2°C following the ramp up. The system is demonstrated to provide linear results between 10^4 and 10^8 CFU/mL when the released nucleic acids were quantified via traditional means. Additionally, we couple the sample processing unit with our previously demonstrated solar-thermal PCR and tablet based detection system to demonstrate very low power sample-in-answer-out detection. PMID:27231636
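The linear response over 10^4 to 10^8 CFU/mL reported here corresponds to a log-linear calibration; the sketch below fits and inverts such a calibration with invented signal values (not the authors' data).

```python
import math

# Log-linear calibration: fit signal = a*log10(conc) + b over a wide dynamic
# range, then invert it to quantify unknowns. Signal values are invented.
def log_linear_fit(concentrations, signals):
    """Return (a, b) for signal = a*log10(conc) + b by least squares."""
    x = [math.log10(c) for c in concentrations]
    n = len(x)
    mx, my = sum(x) / n, sum(signals) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, signals))
    a = sxy / sxx
    return a, my - a * mx

def predict_conc(signal, a, b):
    """Invert the calibration to estimate concentration (CFU/mL)."""
    return 10 ** ((signal - b) / a)

concs = [1e4, 1e5, 1e6, 1e7, 1e8]          # CFU/mL calibration points
signals = [10.0, 17.5, 25.0, 32.5, 40.0]   # invented quantification signals
a, b = log_linear_fit(concs, signals)
print(predict_conc(25.0, a, b))            # recovers ~1e6 CFU/mL
```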

  12. Sampling frequency affects the processing of Actigraph raw acceleration data to activity counts.

    PubMed

    Brønd, Jan Christian; Arvidsson, Daniel

    2016-02-01

    ActiGraph acceleration data are processed through several steps (including band-pass filtering to attenuate unwanted signal frequencies) to generate the activity counts commonly used in physical activity research. We performed three experiments to investigate the effect of sampling frequency on the generation of activity counts. Ideal acceleration signals were produced in the MATLAB software. Thereafter, ActiGraph GT3X+ monitors were spun in a mechanical setup. Finally, 20 subjects performed walking and running wearing GT3X+ monitors. Acceleration data from all experiments were collected with different sampling frequencies, and activity counts were generated with the ActiLife software. With the default 30-Hz (or 60-Hz, 90-Hz) sampling frequency, the generation of activity counts was performed as intended with 50% attenuation of acceleration signals with a frequency of 2.5 Hz by the signal frequency band-pass filter. Frequencies above 5 Hz were eliminated totally. However, with other sampling frequencies, acceleration signals above 5 Hz escaped the band-pass filter to a varied degree and contributed to additional activity counts. Similar results were found for the spinning of the GT3X+ monitors, although the amount of activity counts generated was less, indicating that raw data stored in the GT3X+ monitor is processed. Between 600 and 1,600 more counts per minute were generated with the sampling frequencies 40 and 100 Hz compared with 30 Hz during running. Sampling frequency affects the processing of ActiGraph acceleration data to activity counts. Researchers need to be aware of this error when selecting sampling frequencies other than the default 30 Hz.
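One mechanism by which high-frequency accelerations can end up contributing counts when non-default sampling rates are used is frequency folding (aliasing) during resampling; the toy function below illustrates folding only and is not ActiLife's actual processing pipeline.

```python
# Frequency folding (aliasing): one mechanism by which accelerations above
# the count band can reappear at low frequencies when data are resampled.
# This toy function is an illustration, not ActiLife's actual pipeline.
def alias_frequency(f_signal, fs):
    """Apparent frequency (Hz) of an f_signal-Hz sinusoid sampled at fs Hz."""
    f = f_signal % fs
    return fs - f if f > fs / 2 else f

# A 28 Hz vibration sampled at 30 Hz shows up as a 2 Hz signal, squarely
# inside the roughly 0.25-2.5 Hz band that generates activity counts:
print(alias_frequency(28.0, 30.0))  # 2.0
print(alias_frequency(7.0, 30.0))   # 7.0 (below Nyquist, no folding)
```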

  13. Solar-thermal complex sample processing for nucleic acid based diagnostics in limited resource settings.

    PubMed

    Gumus, Abdurrahman; Ahsan, Syed; Dogan, Belgin; Jiang, Li; Snodgrass, Ryan; Gardner, Andrea; Lu, Zhengda; Simpson, Kenneth; Erickson, David

    2016-05-01

    The use of point-of-care (POC) devices in limited resource settings, where access to commonly used infrastructure such as water and electricity can be restricted, represents simultaneously one of the best application fits for POC systems as well as one of the most challenging places to deploy them. Of the many challenges involved in these systems, the preparation and processing of complex samples like stool, vomit, and biopsies are particularly difficult due to the high number and varied nature of mechanical and chemical interferents present in the sample. Previously we have demonstrated the ability to use solar-thermal energy to perform PCR based nucleic acid amplifications. In this work we demonstrate how the technique, using similar infrastructure, can also be used to implement a solar-thermal sample processing system for extracting and isolating Vibrio cholerae nucleic acids from fecal samples. The use of opto-thermal energy enables the use of sunlight to drive thermal lysing reactions in large volumes without the need for external electrical power. Using the system we demonstrate the ability to reach a 95°C threshold in less than 5 minutes and maintain a stable sample temperature of ±2°C following the ramp up. The system is demonstrated to provide linear results between 10^4 and 10^8 CFU/mL when the released nucleic acids were quantified via traditional means. Additionally, we couple the sample processing unit with our previously demonstrated solar-thermal PCR and tablet based detection system to demonstrate very low power sample-in-answer-out detection.

  15. Pesticide-sampling equipment, sample-collection and processing procedures, and water-quality data at Chicod Creek, North Carolina, 1992

    USGS Publications Warehouse

    Manning, T.K.; Smith, K.E.; Wood, C.D.; Williams, J.B.

    1994-01-01

    Water-quality samples were collected from Chicod Creek in the Coastal Plain Province of North Carolina during the summer of 1992 as part of the U.S. Geological Survey's National Water-Quality Assessment Program. Chicod Creek is in the Albemarle-Pamlico drainage area, one of four study units designated to test equipment and procedures for collecting and processing samples for the solid-phase extraction of selected pesticides. The equipment and procedures were used to isolate 47 pesticides, including organonitrogen, carbamate, organochlorine, organophosphate, and other compounds, targeted to be analyzed by gas chromatography/mass spectrometry. Sample-collection and processing equipment; equipment-cleaning and set-up procedures; methods pertaining to collecting, splitting, and solid-phase extraction of samples; and water-quality data resulting from the field test are presented in this report. Most problems encountered during this intensive sampling exercise were operational difficulties relating to equipment used to process samples.

  16. Towards Routine Backside SIMS Sample Preparation for Efficient Support of Advanced IC Process Development

    NASA Astrophysics Data System (ADS)

    Hopstaken, M. J. P.; Cabral, C.; Pfeiffer, D.; Molella, C.; Ronsheim, P.

    2009-09-01

    Backside Secondary Ion Mass Spectrometry (SIMS) profiling is a seemingly simple option to circumvent the depth-resolution degradation commonly observed in conventional front-side SIMS. However, large practical barriers in backside sample preparation prohibit a wider and more routine use of backside SIMS. Here, we explore the use of XeF2 dry etching instead of wet etching for removal of the residual Si-substrate. The former process is essentially isotropic, with similar etch rates for the different crystallographic orientations, and highly selective towards the dense thermal oxide (BOX). This eliminates the need for high-precision polishing of individual samples, reducing the substrate removal to a few coarse and relatively rapid polishing steps only. Moreover, XeF2 etching can be performed in unattended fashion and simultaneously on multiple samples, greatly increasing throughput and shortening turn-around time for backside sample preparation. Here we have explained the different practical aspects and demonstrated the feasibility of this novel approach to backside preparation for different front-end-of-line (S/D contact silicide metal, high-k metal gate) and back-end-of-line (ECD copper) applications. In conclusion, availability of a robust and reliable procedure for backside SIMS sample preparation with rapid turn-around is highly beneficial for more efficient analytical support of advanced IC process development.

  17. Statistical Review of Data from DWPF's Process Samples for Batches 19 Through 30

    SciTech Connect

    Edwards, T.B.

    1999-04-06

    The measurements derived from samples taken during the processing of batches 19 through 30 at the Defense Waste Processing Facility (DWPF) afford an opportunity for review and comparisons. This report has looked at some of the statistics from these data. Only the data reported by the DWPF lab (that is, the data provided by the lab as representative of the samples taken) are available for this analysis. In some cases, the sample results reported may be a subset of the sample results generated by the analytical procedures. A thorough assessment of the DWPF lab's analytical procedures would require the complete set of data. Thus, the statistics reported here, specifically as they relate to analytical uncertainties, are limited to the reported data for these samples. A feel for the consistency of the incoming slurry is provided by the estimation of the components of variation for the Sludge Receipt and Adjustment Tank (SRAT) receipts. In general, for all of the vessels, the data from batches after 21 show smaller batch-to-batch variation than the data from all the batches. The relative contributions of batch-to-batch versus residual variation (which includes analytical uncertainty) are presented in these analyses.
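The batch-to-batch versus residual split described here is a classic one-way variance-components estimate; below is a minimal method-of-moments sketch with made-up measurements, not DWPF data.

```python
# Method-of-moments (one-way ANOVA) split of total variability into
# batch-to-batch and residual components. The measurements are invented;
# the layout is balanced (equal replicates per batch).
def variance_components(batches):
    """batches: list of equal-length lists of replicate measurements.
    Returns (batch_to_batch_variance, residual_variance)."""
    k = len(batches)            # number of batches
    n = len(batches[0])         # replicates per batch
    grand = sum(sum(b) for b in batches) / (k * n)
    means = [sum(b) / n for b in batches]
    ss_between = n * sum((m - grand) ** 2 for m in means)
    ss_within = sum(sum((x - m) ** 2 for x in b)
                    for b, m in zip(batches, means))
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (k * (n - 1))
    # Negative estimates are truncated to zero, the usual convention.
    return max(0.0, (ms_between - ms_within) / n), ms_within

vb, vr = variance_components([[10.1, 9.9], [12.0, 12.2], [11.0, 10.8]])
print(f"batch-to-batch variance = {vb:.3f}, residual variance = {vr:.3f}")
```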

  18. Technical note: Sampling and processing of mesocosm sediment trap material for quantitative biogeochemical analysis

    NASA Astrophysics Data System (ADS)

    Boxhammer, Tim; Bach, Lennart T.; Czerny, Jan; Riebesell, Ulf

    2016-05-01

    Sediment traps are the most common tool to investigate vertical particle flux in the marine realm. However, the spatial and temporal decoupling between particle formation in the surface ocean and particle collection in sediment traps at depth often handicaps reconciliation of production and sedimentation even within the euphotic zone. Pelagic mesocosms are restricted to the surface ocean, but have the advantage of being closed systems and are therefore ideally suited to studying how processes in natural plankton communities influence particle formation and settling in the ocean's surface. We therefore developed a protocol for efficient sample recovery and processing of quantitatively collected pelagic mesocosm sediment trap samples for biogeochemical analysis. Sedimented material was recovered by pumping it under gentle vacuum through a silicon tube to the sea surface. The particulate matter of these samples was subsequently separated from bulk seawater by passive settling, centrifugation or flocculation with ferric chloride, and we discuss the advantages and efficiencies of each approach. After concentration, samples were freeze-dried and ground with an easy to adapt procedure using standard lab equipment. Grain size of the finely ground samples ranged from fine to coarse silt (2-63 µm), which guarantees homogeneity for representative subsampling, a widespread problem in sediment trap research. Subsamples of the ground material were perfectly suitable for a variety of biogeochemical measurements, and even at very low particle fluxes we were able to get a detailed insight into various parameters characterizing the sinking particles. The methods and recommendations described here are a key improvement for sediment trap applications in mesocosms, as they facilitate the processing of large amounts of samples and allow for high-quality biogeochemical flux data.

  19. Sequential sampling model for multiattribute choice alternatives with random attention time and processing order.

    PubMed

    Diederich, Adele; Oswald, Peter

    2014-01-01

    A sequential sampling model for multiattribute binary choice options, called the multiattribute attention switching (MAAS) model, assumes a separate sampling process for each attribute. During the deliberation process, attention switches from one attribute consideration to the next. The order in which attributes are considered, as well as how long each attribute is considered (the attention time), influences the predicted choice probabilities and choice response times. Several probability distributions for the attention time with different variances are investigated. Depending on the time and order schedule, the model predicts a rich choice probability/choice response time pattern, including preference reversals and fast errors. Furthermore, the difference between finite and infinite decision horizons for the attribute considered last is investigated. For the former case the model predicts a probability p0 > 0 of not deciding within the available time. The underlying stochastic process for each attribute is an Ornstein-Uhlenbeck process approximated by a discrete birth-death process. All predictions also hold for the widely applied Wiener process. PMID:25249963
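The birth-death approximation of the Ornstein-Uhlenbeck accumulator can be sketched as a state-dependent random walk run to a threshold or deadline; the parameter values below are illustrative, not fitted values from the paper, and the deadline return value corresponds to the no-decision probability p0.

```python
import random

# Discrete birth-death approximation of an Ornstein-Uhlenbeck accumulator
# dX = (mu - gamma*X) dt + sigma dW for one attribute, run until a choice
# threshold or the deadline is hit. All parameter values are illustrative.
def simulate_choice(mu=0.3, gamma=0.1, sigma=1.0, theta=2.0,
                    dt=0.01, max_steps=2000, rng=None):
    """Return 'A', 'B', or None (no decision before the deadline)."""
    rng = rng or random.Random()
    x = 0.0
    step = sigma * dt ** 0.5                          # grid spacing of the walk
    for _ in range(max_steps):
        drift = mu - gamma * x                        # OU mean reversion
        p_up = 0.5 * (1 + drift * dt ** 0.5 / sigma)  # birth probability
        p_up = min(1.0, max(0.0, p_up))
        x += step if rng.random() < p_up else -step
        if x >= theta:
            return 'A'
        if x <= -theta:
            return 'B'
    return None     # deadline reached: contributes to p0 > 0

rng = random.Random(42)
outcomes = [simulate_choice(rng=rng) for _ in range(2000)]
print("P(A) ~", outcomes.count('A') / 2000,
      "  P(no decision) ~", outcomes.count(None) / 2000)
```

The walk's mean increment is drift*dt and its variance is sigma^2*dt, matching the OU process in the small-dt limit.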

  20. Effects of pre-analytical processes on blood samples used in metabolomics studies.

    PubMed

    Yin, Peiyuan; Lehmann, Rainer; Xu, Guowang

    2015-07-01

    Every day, analytical and bio-analytical chemists make sustained efforts to improve the sensitivity, specificity, robustness, and reproducibility of their methods. Especially in targeted and non-targeted profiling approaches, including metabolomics analysis, these objectives are not easy to achieve; however, robust and reproducible measurements and low coefficients of variation (CV) are crucial for successful metabolomics approaches. Nevertheless, all efforts from the analysts are in vain if the sample quality is poor, i.e. if preanalytical errors are made by the partner during sample collection. Preanalytical risks and errors are more common than expected, even when standard operating procedures (SOP) are used. This risk is particularly high in clinical studies, and poor sample quality may heavily bias the CV of the final analytical results, leading to disappointing outcomes of the study and consequently, although unjustified, to critical questions about the analytical performance of the approach from the partner who provided the samples. This review focuses on the preanalytical phase of liquid chromatography-mass spectrometry-driven metabolomics analysis of body fluids. Several important preanalytical factors that may seriously affect the profile of the investigated metabolome in body fluids, including factors before sample collection, blood drawing, subsequent handling of the whole blood (transportation), processing of plasma and serum, and inadequate conditions for sample storage, will be discussed. In addition, a detailed description of latent effects on the stability of the blood metabolome and a suggestion for a practical procedure to circumvent risks in the preanalytical phase will be given.
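The coefficient of variation (CV) invoked in this review is simply the sample standard deviation expressed as a percentage of the mean; a quick sketch with invented replicate intensities:

```python
import statistics

# The coefficient of variation (CV): sample standard deviation as a
# percentage of the mean. The replicate intensities are invented.
def cv_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

print(f"CV = {cv_percent([98.0, 102.0, 101.0, 99.0]):.2f}%")
```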

  1. SAUNA—a system for automatic sampling, processing, and analysis of radioactive xenon

    NASA Astrophysics Data System (ADS)

    Ringbom, A.; Larson, T.; Axelsson, A.; Elmgren, K.; Johansson, C.

    2003-08-01

    A system for automatic sampling, processing, and analysis of atmospheric radioxenon has been developed. From an air sample of about 7 m3 collected during 12 h, 0.5 cm3 of xenon is extracted, and the atmospheric activities of the four xenon isotopes 133Xe, 135Xe, 131mXe, and 133mXe are determined with a beta-gamma coincidence technique. The collection is performed using activated charcoal and molecular sieves at ambient temperature. The sample preparation and quantification are performed using preparative gas chromatography. The system was tested under routine conditions for a 5-month period, with average minimum detectable concentrations below 1 mBq/m3 for all four isotopes.
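Minimum detectable concentrations of the kind quoted here are commonly derived from a Currie-style detection limit; the sketch below uses that standard formula with purely illustrative numbers, not SAUNA's actual calibration.

```python
import math

# Currie-style minimum detectable concentration (MDC). The 2.71 + 4.65*sqrt(B)
# detection limit is the standard Currie (1968) form; every numeric input
# below is illustrative, not SAUNA's actual calibration.
def currie_mdc(background_counts, efficiency, count_time_s, air_volume_m3,
               branching=1.0):
    """MDC in Bq/m^3 at roughly 95% confidence."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detectable net counts
    return l_d / (efficiency * branching * count_time_s * air_volume_m3)

# Illustrative: 100 background counts, 50% efficiency, 12 h count, 7 m^3 air
mdc = currie_mdc(100, 0.5, 12 * 3600, 7.0)
print(f"MDC = {mdc * 1000:.2f} mBq/m^3")
```

With these invented inputs the estimate lands in the sub-mBq/m^3 range, consistent in scale with the sensitivities the abstract reports.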

  2. Minimal, encapsulated proteomic-sample processing applied to copy-number estimation in eukaryotic cells.

    PubMed

    Kulak, Nils A; Pichler, Garwin; Paron, Igor; Nagaraj, Nagarjuna; Mann, Matthias

    2014-03-01

    Mass spectrometry (MS)-based proteomics typically employs multistep sample-preparation workflows that are subject to sample contamination and loss. We report an in-StageTip method for performing sample processing, from cell lysis through elution of purified peptides, in a single, enclosed volume. This robust and scalable method largely eliminates contamination or loss. Peptides can be eluted in several fractions or in one step for single-run proteome analysis. In one day, we obtained the largest proteome coverage to date for budding and fission yeast, and found that protein copy numbers in these cells were highly correlated (R^2 = 0.78). Applying the in-StageTip method to quadruplicate measurements of a human cell line, we obtained copy-number estimates for 9,667 human proteins and observed excellent quantitative reproducibility between replicates (R^2 = 0.97). The in-StageTip method is straightforward and generally applicable in biological or clinical applications.
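The replicate-to-replicate R^2 values quoted are squared Pearson correlations, typically computed on log-transformed copy numbers; a self-contained sketch with invented values:

```python
import math

# Squared Pearson correlation between two replicate sets of copy-number
# estimates, computed on log10 values. The copy numbers are invented.
def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)

rep1 = [math.log10(c) for c in [1.2e3, 5.0e4, 8.1e5, 2.2e6, 9.4e7]]
rep2 = [math.log10(c) for c in [1.1e3, 5.6e4, 7.5e5, 2.6e6, 8.8e7]]
print(f"R^2 = {r_squared(rep1, rep2):.3f}")
```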

  3. An evaluation of ventilator-associated pneumonia process measure sampling strategies in a surgical ICU.

    PubMed

    Rawat, Nishi; Yang, Ting; Speck, Kathleen; Helzer, Jennifer; Barenski, Cathleen; Berenholtz, Sean

    2014-01-01

    Ventilator-associated pneumonia (VAP) is common, lethal, and expensive. Little is known about optimal strategies to evaluate process measures for VAP prevention. The authors conducted a prospective study of different sampling strategies for evaluating head of bed (HOB) elevation and oral care. There was no significant difference between morning and evening shift HOB elevation compliance rates (P = .47). If oral care was performed at least once during a 12-hour shift, there was an 87% probability that it also was performed at least twice. If oral care was performed at least twice during a 12-hour shift, then there was a 93% probability that chlorhexidine oral care was performed at least once. The results of this study suggest that sampling HOB elevation twice as compared with once daily is unlikely to change the estimate of performance, oral care need not be frequently sampled, and high oral care compliance may predict chlorhexidine oral care compliance.
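The 87% and 93% figures in this abstract are conditional probabilities estimated from per-shift event counts; a minimal sketch on hypothetical shift data:

```python
# Conditional compliance probability estimated from per-shift event counts,
# the quantity behind statements like "if oral care was performed at least
# once, there was an 87% probability it was performed at least twice".
# The shift data below are hypothetical.
def conditional_prob(events_per_shift, given_at_least, target_at_least):
    given = [e for e in events_per_shift if e >= given_at_least]
    if not given:
        return float('nan')
    return sum(1 for e in given if e >= target_at_least) / len(given)

shifts = [0, 1, 2, 3, 2, 0, 2, 1, 3, 2]   # oral-care events per 12-hour shift
print(conditional_prob(shifts, 1, 2))     # P(>=2 events | >=1 event) = 0.75
```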

  4. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
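A purge-stabilization rule of the kind an automated sampler must encode (successive field-parameter readings stable within a tolerance before sampling) can be sketched as follows; the tolerance and window below are illustrative, not the values in the USGS protocol.

```python
# A purge-stabilization rule: sampling proceeds once the last few readings of
# a field parameter agree within a tolerance. The tolerance and window here
# are illustrative, not the values in the USGS protocol.
def stabilized(readings, tolerance, window=3):
    """True when the last `window` readings span no more than `tolerance`."""
    if len(readings) < window:
        return False
    recent = readings[-window:]
    return max(recent) - min(recent) <= tolerance

ph = [7.4, 7.1, 6.9, 6.85, 6.84, 6.86]   # successive pH readings while purging
print(stabilized(ph, 0.1))               # True: last three span 0.02 units
```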

  5. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 6 QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Jones, M.; Edwards, T.

    2010-06-09

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples.1 DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem (CC) Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). In addition to the CC method confirmation, the DWPF lab's mercury (Hg) digestion method was also evaluated for applicability to SB6 (see DWPF procedure 'Mercury System Operating Manual', Manual: SW4-15.204. Section 6.1, Revision 5, Effective date: 12-04-03). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 6 (SB6) SRAT Receipt and SB6 SRAT Product samples. For validation of the DWPF lab's Hg method, only SRAT receipt material was used and compared to AR digestion results. The SB6 SRAT Receipt and SB6 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB6 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 5 (SB5), to form the SB6 Blend composition. In addition to the 16 elements currently measured by the DWPF, this report includes Hg and thorium (Th) data (Th comprising ~2.5-3 wt% of the total solids in SRAT Receipt and SRAT Product, respectively) and provides specific details of ICP-AES analysis of Th.
Thorium was found to interfere with the U 367.007 nm emission line, and an inter-element correction (IEC) had to be applied to U data, which is also

  6. Evaluation of the DWPF chemical process cell sample condenser in the integrated DWPF melter system

    SciTech Connect

    Zamecnik, J.R.

    1992-05-15

    An on-line analysis system for hydrogen is being added to the Chemical Processing Cell (CPC) in DWPF to ensure that the process does not operate above the lower flammable limit (LFL). The method chosen to measure hydrogen during cold runs is gas chromatography (GC). In order for the GCs to analyze the offgas exiting the SRAT and SME condensers, an additional condenser is required to reduce the dew point of the sample to below the lowest ambient temperature expected, so that no liquid water will enter the GCs. This temperature was chosen to be 10°C.
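    The design reasoning can be checked with the Magnus approximation for the saturation vapor pressure of water: gas leaving a condenser at 10°C is saturated at 10°C, so its dew point equals the condenser temperature and no condensation occurs downstream as long as ambient stays at or above 10°C. The formula is a standard approximation, not taken from the report.

```python
import math

# Back-of-envelope check of the condenser design idea. The Magnus
# approximation below is a standard formula, not from the DWPF report.

def saturation_vapor_pressure_hpa(t_celsius):
    """Magnus approximation for water over a flat surface (hPa)."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def dew_point_celsius(vapor_pressure_hpa):
    """Invert the Magnus formula to recover the dew point."""
    ln_ratio = math.log(vapor_pressure_hpa / 6.112)
    return 243.12 * ln_ratio / (17.62 - ln_ratio)

e_out = saturation_vapor_pressure_hpa(10.0)   # gas saturated at 10 C
print(round(dew_point_celsius(e_out), 1))     # recovers 10.0
```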

  8. Data Acquisition for Modular Biometric Monitoring System

    NASA Technical Reports Server (NTRS)

    Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor); Grodsinsky, Carlos M. (Inventor)

    2014-01-01

    A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to collect data asynchronously, via the bus, from the memory of the plurality of data acquisition modules according to a relative fullness of the memory of the plurality of data acquisition modules.
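    A minimal sketch of the collection policy described in the abstract, servicing whichever module's buffer is relatively fullest. The class and function names are invented for illustration and are not from the patent.

```python
# Illustrative sketch (not the patented design) of a controller that
# services acquisition modules in order of relative buffer fullness.

from collections import deque

class Module:
    def __init__(self, name, capacity):
        self.name = name
        self.buffer = deque()
        self.capacity = capacity

    def sample(self, value):
        self.buffer.append(value)

    def fullness(self):
        return len(self.buffer) / self.capacity

def collect_next(modules):
    """Drain the buffer of whichever module is relatively fullest."""
    fullest = max(modules, key=lambda m: m.fullness())
    data = list(fullest.buffer)
    fullest.buffer.clear()
    return fullest.name, data

ecg = Module("ecg", capacity=8)
temp = Module("temp", capacity=4)
for v in (1, 2, 3):
    ecg.sample(v)       # 3/8 full
temp.sample(36.6)
temp.sample(36.7)       # 2/4 full -> relatively fuller
print(collect_next([ecg, temp]))  # ('temp', [36.6, 36.7])
```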

  9. Towards a Theory of Sampled-Data Piecewise-Deterministic Markov Processes

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Gonzalez, Oscar R.; Gray, W. Steven

    2006-01-01

    The analysis and design of practical control systems requires that stochastic models be employed. Analysis and design tools have been developed, for example, for Markovian jump linear continuous and discrete-time systems, piecewise-deterministic processes (PDPs), and general stochastic hybrid systems (GSHSs). These model classes have been used in many applications, including fault tolerant control and networked control systems. This paper presents initial results on the analysis of a sampled-data PDP representation of a nonlinear sampled-data system with a jump linear controller. In particular, it is shown that the state of the sampled-data PDP satisfies the strong Markov property. In addition, a relation between the invariant measures of a sampled-data system driven by a stochastic process and its associated discrete-time representation is presented. As an application, when the plant is linear with no external input, a sufficient testable condition for convergence in distribution to the invariant delta Dirac measure is given.

  10. Processes in scientific workflows for information seeking related to physical sample materials

    NASA Astrophysics Data System (ADS)

    Ramdeen, S.

    2014-12-01

    The majority of State Geological Surveys have repositories containing cores, cuttings, fossils, or other physical sample material. State surveys maintain these collections to support their own research as well as the research conducted by external users from other organizations, including government agencies (state and federal), academia, industry, and the public. The preliminary results presented in this paper look at the research processes of these external users, in particular how they discover, access, and use the digital surrogates through which they evaluate and access physical items in these collections. Data such as physical samples are materials that cannot be completely replaced with digital surrogates. Digital surrogates may be represented as metadata, which enable discovery and ultimately access to these samples. These surrogates may be found in records, databases, publications, etc., but surrogates do not eliminate the need for access to the physical item, as surrogates cannot be subjected to chemical testing or other similar analysis. The goal of this research is to document the various processes external users perform in order to access physical materials. Data for this study will be collected by conducting interviews with these external users. During the interviews, participants will be asked to describe the workflow that led them to interact with state survey repositories, and what steps they took afterward. High-level processes/categories of behavior will be identified and used in the development of an information seeking behavior model. This model may be used to facilitate the development of management tools and other aspects of cyberinfrastructure related to physical samples.

  11. A Variable Sampling Interval Synthetic Xbar Chart for the Process Mean.

    PubMed

    Lee, Lei Yong; Khoo, Michael Boon Chong; Teh, Sin Yin; Lee, Ming Ha

    2015-01-01

    The usual practice of using a control chart to monitor a process is to take samples from the process with a fixed sampling interval (FSI). In this paper, a synthetic X¯ control chart with the variable sampling interval (VSI) feature is proposed for monitoring changes in the process mean. The VSI synthetic X¯ chart integrates the VSI X¯ chart and the VSI conforming run length (CRL) chart. The proposed VSI synthetic X¯ chart is evaluated using the average time to signal (ATS) criterion. The optimal charting parameters of the proposed chart are obtained by minimizing the out-of-control ATS for a desired shift. Comparisons between the VSI synthetic X¯ chart and the existing X¯, synthetic X¯, VSI X¯ and EWMA X¯ charts, in terms of ATS, are made. The ATS results show that the VSI synthetic X¯ chart outperforms the other X¯-type charts for detecting moderate and large shifts. An illustrative example is also presented to explain the application of the VSI synthetic X¯ chart.

  12. A Variable Sampling Interval Synthetic Xbar Chart for the Process Mean

    PubMed Central

    Lee, Lei Yong; Khoo, Michael Boon Chong; Teh, Sin Yin; Lee, Ming Ha

    2015-01-01

    The usual practice of using a control chart to monitor a process is to take samples from the process with fixed sampling interval (FSI). In this paper, a synthetic X¯ control chart with the variable sampling interval (VSI) feature is proposed for monitoring changes in the process mean. The VSI synthetic X¯ chart integrates the VSI X¯ chart and the VSI conforming run length (CRL) chart. The proposed VSI synthetic X¯ chart is evaluated using the average time to signal (ATS) criterion. The optimal charting parameters of the proposed chart are obtained by minimizing the out-of-control ATS for a desired shift. Comparisons between the VSI synthetic X¯ chart and the existing X¯, synthetic X¯, VSI X¯ and EWMA X¯ charts, in terms of ATS, are made. The ATS results show that the VSI synthetic X¯ chart outperforms the other X¯ type charts for detecting moderate and large shifts. An illustrative example is also presented to explain the application of the VSI synthetic X¯ chart. PMID:25951141
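    The VSI idea common to the two records above can be sketched as a decision rule: a sample mean near the target earns a long interval until the next sample, while a mean in the warning region earns a short one. The synthetic chart's CRL sub-chart is omitted here, and the limits and interval lengths below are illustrative, not the paper's optimized parameters.

```python
# Minimal sketch of the variable sampling interval (VSI) idea for an
# X-bar chart. Warning/control limits and interval lengths are
# illustrative, not the optimized values from the paper.

def next_interval(xbar, mu0, sigma, n, k_warn=1.0, k_limit=3.0,
                  long_h=2.0, short_h=0.25):
    """Return (state, hours until next sample) for a sample mean."""
    z = abs(xbar - mu0) / (sigma / n ** 0.5)
    if z > k_limit:
        return "signal", 0.0          # investigate immediately
    if z > k_warn:
        return "warning", short_h     # sample again soon
    return "in-control", long_h       # relax the sampling rate

print(next_interval(10.1, mu0=10.0, sigma=0.4, n=4))   # in-control
print(next_interval(10.5, mu0=10.0, sigma=0.4, n=4))   # warning
print(next_interval(11.0, mu0=10.0, sigma=0.4, n=4))   # signal
```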

  13. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    NASA Technical Reports Server (NTRS)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar, larger ground minicomputer system with favorable results.
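    The local digital filtering the RIU performs before data reach the host might resemble a short per-channel FIR filter. The sketch below assumes a simple moving average; the actual filter design is not given in the abstract.

```python
# Illustrative per-channel FIR filtering, as a data acquisition unit
# might apply before forwarding samples to a host. The moving-average
# taps are an assumption, not the RIU's actual filter.

def fir_filter(samples, taps):
    """Convolve samples with filter taps (causal, zero-padded)."""
    out = []
    for i in range(len(samples)):
        acc = 0.0
        for j, h in enumerate(taps):
            if i - j >= 0:
                acc += h * samples[i - j]
        out.append(acc)
    return out

smoothing = [0.25, 0.25, 0.25, 0.25]     # 4-tap moving average
raw = [0, 0, 4, 4, 4, 4]                 # a step input on one channel
print(fir_filter(raw, smoothing))  # -> [0.0, 0.0, 1.0, 2.0, 3.0, 4.0]
```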

  14. Evaluation of SRAT Sampling Data in Support of a Six Sigma Yellow Belt Process Improvement Project

    SciTech Connect

    Edwards, Thomas B.

    2005-06-01

    As part of the Six Sigma continuous improvement initiatives at the Defense Waste Processing Facility (DWPF), a Yellow Belt team was formed to evaluate the frequency and types of samples required for the Sludge Receipt and Adjustment Tank (SRAT) receipt in the DWPF. The team asked, via a technical task request, that the Statistical Consulting Section (SCS), in concert with the Immobilization Technology Section (ITS) (both groups within the Savannah River National Laboratory (SRNL)), conduct a statistical review of recent SRAT receipt results to determine if there is enough consistency in these measurements to allow for less frequent sampling. As part of this review process, key decisions made by DWPF Process Engineering that are based upon the SRAT sample measurements are outlined in this report. For a reduction in SRAT sampling to be viable, these decisions must not be overly sensitive to the additional variation that will be introduced as a result of such a reduction. Measurements from samples of SRAT receipt batches 314 through 323 were reviewed as part of this investigation into the frequency of SRAT sampling. The associated acid calculations for these batches were also studied as part of this effort. The results from this investigation showed no indication of a statistically significant relationship between the tank solids and the acid additions for these batches. One would expect that as the tank solids increase there would be a corresponding increase in acid requirements. There was, however, an indication that the predicted reduction/oxidation (REDOX) ratio (the ratio of Fe{sup 2+} to the total Fe in the glass product) that was targeted by the acid calculations based on the SRAT receipt samples for these batches was on average 0.0253 larger than the predicted REDOX based upon Slurry Mix Evaporator (SME) measurements. This is a statistically significant difference (at the 5% significance level), and the study also suggested that the difference was due to

  15. Assessment of toxic metals in raw and processed milk samples using electrothermal atomic absorption spectrophotometer.

    PubMed

    Kazi, Tasneem Gul; Jalbani, Nusrat; Baig, Jameel Ahmed; Kandhro, Ghulam Abbas; Afridi, Hassan Imran; Arain, Mohammad Balal; Jamali, Mohammad Khan; Shah, Abdul Qadir

    2009-09-01

    Milk and dairy products have been recognized all over the world for their beneficial influence on human health. The levels of toxic metals (TMs) are an important component of the safety and quality of milk. A simple and efficient microwave-assisted extraction (MAE) method has been developed for the determination of TMs (Al, Cd, Ni and Pb) in raw and processed milk samples. A Plackett-Burman experimental design and a 2³ + star central composite design were applied in order to determine the optimum conditions for MAE. Concentrations of TMs were measured by electrothermal atomic absorption spectrometry. The accuracy of the optimized procedure was evaluated by the standard addition method and by a conventional wet acid digestion method (CDM) for comparison. No significant differences were observed (P>0.05) when comparing the values obtained by the proposed MAE method and CDM (paired t-test). The average relative standard deviation of the MAE method varied between 4.3% and 7.6%, depending on the analyte (n=6). The proposed method was successfully applied for the determination of the TMs under study in milk samples. The results for raw and processed milk indicated that environmental conditions and manufacturing processes play a key role in the distribution of toxic metals in raw and processed milk.
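    A sketch of the two statistics the abstract relies on: the percent relative standard deviation (RSD) of replicate measurements, and a paired t statistic comparing the two digestion methods on the same samples. The concentration values below are invented for illustration.

```python
import statistics as st

# Illustrative RSD and paired-t computations; the Pb concentrations
# below are made up and are not data from the study.

def rsd_percent(values):
    """Percent relative standard deviation of replicate values."""
    return 100 * st.stdev(values) / st.mean(values)

def paired_t(a, b):
    """t statistic for paired differences a[i] - b[i]."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    return st.mean(diffs) / (st.stdev(diffs) / n ** 0.5)

mae = [0.52, 0.48, 0.50, 0.47, 0.51, 0.49]   # hypothetical Pb, mg/kg
cdm = [0.50, 0.49, 0.52, 0.46, 0.50, 0.49]
print(round(rsd_percent(mae), 1))   # replicate scatter, in percent
print(round(paired_t(mae, cdm), 2)) # small |t| -> methods agree
```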

  16. Delayed matching-to-sample: A tool to assess memory and other cognitive processes in pigeons.

    PubMed

    Zentall, Thomas R; Smith, Aaron P

    2016-02-01

    Delayed matching-to-sample is a versatile task that has been used to assess the nature of animal memory. Although once thought to be a relatively passive process, matching research has demonstrated considerable flexibility in how animals actively represent events in memory. But delayed matching can also demonstrate how animals fail to maintain representations in memory when they are cued that they will not be tested (directed forgetting) and how the outcome expected can serve as a choice cue. When pigeons have shown divergent retention functions following training without a delay, it has been taken as evidence of the use of a single-code/default coding strategy but in many cases an alternative account may be involved. Delayed matching has also been used to investigate equivalence learning (how animals represent stimuli when they learn that the same comparison response is correct following the presentation of two different samples) and to test for metamemory (the ability of pigeons to indicate that they understand what they know) by allowing animals to decline to be tested when they are uncertain that they remember a stimulus. How animals assess the passage of time has also been studied using the matching task. And there is evidence that when memory for the sample is impaired by a delay, rather than use the probability of being correct for choice of each of the comparison stimuli, pigeons tend to choose based on the overall sample frequency (base-rate neglect). Finally, matching has been used to identify natural color categories as well as dimensional categories in pigeons. Overall, matching to sample has provided an excellent methodology for assessing an assortment of cognitive processes in animals.

  17. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    SciTech Connect

    Tripp, J.; Smith, T.; Law, J.

    2013-07-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next-generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially, sampling technologies were evaluated, and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes have been tested; the 10 μl volume produced data with much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse, thus increasing sampling reproducibility/accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of micro-fluidic sampling chips. (authors)

  18. Acquisition strategies

    SciTech Connect

    Zimmer, M.J.; Lynch, P.W.

    1993-11-01

    Acquiring projects takes careful planning, research and consideration. Picking the right opportunities and avoiding the pitfalls will lead to a more valuable portfolio. This article describes the steps to take in evaluating an acquisition and what items need to be considered in an evaluation.

  19. Planetary protection, legal ambiguity and the decision making process for Mars sample return.

    PubMed

    Race, M S

    1996-01-01

    As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions.

  20. Planetary protection, legal ambiguity and the decision making process for Mars sample return

    NASA Technical Reports Server (NTRS)

    Race, M. S.

    1996-01-01

    As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions.

  2. Seabed observation & sampling system

    USGS Publications Warehouse

    Blackwood, D.; Parolski, K.

    2001-01-01

    SEABOSS has proved to be a valuable addition to the USGS data-acquisition and processing field program. It has allowed researchers to collect high-quality images and seabed samples in a timely manner. It is a simple, dependable and trouble-free system with a track record of over 3,000 deployments. When used as part of the USGS seafloor mapping acquisition, processing, and ground-truth program, SEABOSS has been invaluable in providing information quickly and efficiently, with a minimum of downtime. SEABOSS enables scientists to collect high-quality images and samples of the seabed, essential to the study of sedimentary environments and biological habitats and to the interpretation of side-scan sonar and multibeam imagery, the most common tools for mapping the seabed.

  3. Laser Ablation Solid Sampling processes investigated using inductively coupled plasma - atomic emission spectroscopy (ICP-AES)

    SciTech Connect

    Mao, X.L.; Ciocan, A.C.; Borisov, O.V.; Russo, R.E.

    1997-07-01

    The symbiotic relationship between laser ablation mechanisms and analytical performance using inductively coupled plasma-atomic emission spectroscopy is addressed in this work. For both cases, it is important to ensure that the ICP conditions (temperature and electron number density) are not affected by the ablated mass. By ensuring that the ICP conditions are constant, changes in spectral emission intensity will be directly related to changes in laser ablation behavior. Mg ionic-line to atomic-line ratios and excitation temperature were measured to monitor the ICP conditions during laser-ablation sample introduction. The quantity of ablated mass depends on the laser pulse duration and wavelength. The quantity of mass removed per unit energy is larger when ablating with shorter laser wavelengths and pulses. Preferential ablation of constituents from a multicomponent sample was found to depend on the laser beam properties (wavelength and pulse duration). For nanosecond-pulsed lasers, thermal vaporization dominates the ablation process. For picosecond-pulsed lasers, a non-thermal mechanism appears to dominate the ablation process. This work describes the mass ablation behavior during nanosecond and picosecond laser sampling into the ICP. The behavior of the ICP under mass loading conditions is first established, followed by studies of the ablation behavior at various power densities. A thermal vaporization model is used to explain nanosecond ablation, and a possible non-thermal mechanism is proposed to explain preferential ablation of Zn and Cu from brass samples during picosecond ablation.

  4. Microwave irradiation for shortening the processing time of samples of flagellated bacteria for scanning electron microscopy.

    PubMed

    Hernández-Chavarría, Francisco

    2004-01-01

    Microwave irradiation (MWI) has been applied to the development of rapid methods to process biological samples for scanning electron microscopy (SEM). In this paper we propose two simple and quick techniques for processing bacteria (Proteus mirabilis and Vibrio mimicus) for SEM using MWI. In the simplest methodology, the bacteria were placed on a cover-glass, air-dried, and submitted to a conductivity stain. The reagent used for the conductivity stain was the mordant of a light microscopy staining method (10 ml of 5% carbolic acid solution, 2 g of tannic acid, and 10 ml of saturated aluminum sulfate 12-H2O). In the second method, the samples were double fixed (glutaraldehyde and then osmium), submitted to the conductivity stain, dehydrated through a series of ethanol solutions of increasing concentration, treated with hexamethyldisilazane (HMDS), and dried at 35 degrees C for 5 minutes. In both methods the steps from fixation to treatment with HMDS were done under MWI for 2 minutes in an ice-water bath, in order to dissipate the heat generated by the MWI. Although both techniques preserve bacterial morphology adequately, the latter technique showed the best preservation, including the appearance of flagella, and that process was completed in less than 2 hours at MWI temperatures between 4 and 5 degrees C. PMID:17061527

  5. Process and apparatus for obtaining samples of liquid and gas from soil

    DOEpatents

    Rossabi, Joseph; May, Christopher P.; Pemberton, Bradley E.; Shinn, Jim; Sprague, Keith

    1999-01-01

    An apparatus and process for obtaining samples of liquid and gas from subsurface soil is provided, having a filter zone adjacent to an external expander ring. The expander ring creates a void within the soil substrate which encourages the accumulation of soil-borne fluids. The fluids migrate along a pressure gradient through a plurality of filters before entering a first chamber. A one-way valve regulates the flow of fluid into a second chamber in further communication with a collection tube through which samples are collected at the surface. A second one-way valve having a reverse flow provides additional communication between the chambers for the pressurized cleaning and back-flushing of the apparatus.

  6. Process and apparatus for obtaining samples of liquid and gas from soil

    DOEpatents

    Rossabi, J.; May, C.P.; Pemberton, B.E.; Shinn, J.; Sprague, K.

    1999-03-30

    An apparatus and process for obtaining samples of liquid and gas from subsurface soil is provided, having a filter zone adjacent to an external expander ring. The expander ring creates a void within the soil substrate which encourages the accumulation of soil-borne fluids. The fluids migrate along a pressure gradient through a plurality of filters before entering a first chamber. A one-way valve regulates the flow of fluid into a second chamber in further communication with a collection tube through which samples are collected at the surface. A second one-way valve having a reverse flow provides additional communication between the chambers for the pressurized cleaning and back-flushing of the apparatus. 8 figs.

  7. Microstructural and magnetic analysis of a superconducting foam and comparison with IG-processed bulk samples

    NASA Astrophysics Data System (ADS)

    Koblischka-Veneva, A.; Koblischka, M. R.; Ide, N.; Inoue, K.; Muralidhar, M.; Hauet, T.; Murakami, M.

    2016-03-01

    YBa2Cu3Oy (YBCO) foam samples show an open, porous foam structure, which may have benefits for many applications of high-Tc superconductors. As the basic material of these foams is a pseudo-single crystalline material with the directional growth initiated by a seed crystal similar to standard melt-textured samples, the achieved texture of the YBCO is a very important parameter. We analyzed the local texture and grain orientation of the individual struts forming the foam by means of atomic force microscopy and electron backscatter diffraction (EBSD). Furthermore, the magnetic properties of a foam strut are evaluated by means of SQUID measurements, from which the flux pinning forces were determined. A scaling of the pinning forces in the temperature range between 60 K and 85 K was performed. These data and the details of the microstructure are compared to IG-processed bulk material.

  8. Chemical process to separate iron oxides particles in pottery sample for EPR dating.

    PubMed

    Watanabe, S; Farias, T M B; Gennari, R F; Ferraz, G M; Kunzli, R; Chubaci, J F D

    2008-12-15

    Ancient potteries usually are made of the local clay material, which contains relatively high concentrations of iron. The powdered samples are usually quite black due to magnetite and, although they can be used for thermoluminescence (TL) dating, it is easier to obtain good TL readings when a clearer natural or pre-treated sample is used. For electron paramagnetic resonance (EPR) measurements, the huge signal due to iron spin-spin interaction promotes an intense interference overlapping any other signal in this range. The sample age is obtained by dividing the radiation dose, determined from the concentration of paramagnetic species generated by irradiation, by the natural dose rate; as a consequence, the iron signal cannot be used for EPR dating, since it does not depend on the radiation dose. In some cases, the density separation method using a hydrated solution of sodium polytungstate [Na6(H2W12O40).H2O] becomes useful. However, sodium polytungstate is very expensive in Brazil; hence an alternative method for eliminating this interference is proposed. A chemical process to eliminate about 90% of the magnetite was developed. A sample of powdered ancient pottery was treated in a mixture (3:1:1) of HCl, HNO3 and H2O2 for 4 h. After that, it was washed several times in distilled water to remove all acid matrices. The original black sample becomes somewhat clearer. The resulting material was analyzed by plasma mass spectrometry (ICP-MS), with the result that the iron content is reduced by a factor of about 9. In EPR measurements, a non-treated natural ceramic sample shows a broad spin-spin interaction signal, whereas the chemically treated sample presents a narrow signal in the g=2.00 region, possibly due to a radical of (SiO3)3-, mixed with the signal of remaining iron [M. Ikeya, New Applications of Electron Spin Resonance, World Scientific, Singapore, 1993, p. 285]. This signal increases in intensity under gamma irradiation.
However, still due to iron influence, the additive method yielded too
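
    The age arithmetic underlying trapped-charge (TL/EPR) dating mentioned above can be sketched as follows; the paleodose and annual dose rate are illustrative values, not results from this study.

```python
# Trapped-charge (TL/EPR) dating: age = accumulated (archaeological) dose
# divided by the annual dose rate. Values here are illustrative only.

def trapped_charge_age(paleodose_gy: float, annual_dose_gy_per_yr: float) -> float:
    """Age in years from accumulated dose (Gy) and dose rate (Gy/year)."""
    return paleodose_gy / annual_dose_gy_per_yr

print(trapped_charge_age(6.0, 3.0e-3))  # 2000.0 years
```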

  9. Sequencing the hypervariable regions of human mitochondrial DNA using massively parallel sequencing: Enhanced data acquisition for DNA samples encountered in forensic testing.

    PubMed

    Davis, Carey; Peters, Dixie; Warshauer, David; King, Jonathan; Budowle, Bruce

    2015-03-01

    Mitochondrial DNA testing is a useful tool in the analysis of forensic biological evidence. In cases where nuclear DNA is damaged or limited in quantity, the higher copy number of mitochondrial genomes available in a sample can provide information about the source of a sample. Currently, Sanger-type sequencing (STS) is the primary method used to develop mitochondrial DNA profiles. This method is laborious and time consuming. Massively parallel sequencing (MPS) can increase the amount of information obtained from mitochondrial DNA samples while improving turnaround time, by decreasing the number of manipulations and, more so, by exploiting high-throughput analyses to obtain interpretable results. In this study 18 buccal swabs, three different tissue samples from five individuals, and four bone samples from casework were sequenced at hypervariable regions I and II using STS and MPS. Sample enrichment for STS and MPS was PCR-based. Library preparation for MPS was performed using the Nextera® XT DNA Sample Preparation Kit, and sequencing was performed on the MiSeq™ (Illumina, Inc.). MPS yielded full concordance of base calls with STS results, and the newer methodology was able to resolve length heteroplasmy in homopolymeric regions. This study demonstrates that short-amplicon MPS of mitochondrial DNA is feasible, can provide information not possible with STS, and lays the groundwork for development of a whole-genome sequencing strategy for degraded samples.
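
    As a minimal illustration (not the study's actual pipeline) of why MPS read depth can quantify point heteroplasmy that Sanger electropherograms only show as mixed peaks, one can tally the base calls covering a position and report the minor-allele fraction; the read counts below are hypothetical.

```python
from collections import Counter

def heteroplasmy_fraction(bases: str) -> float:
    """Minor-allele fraction among base calls covering one mtDNA position."""
    counts = [n for _, n in Counter(bases.upper()).most_common()]
    major, rest = counts[0], counts[1:]
    return sum(rest) / (major + sum(rest))

# hypothetical: 1000 reads over one hypervariable-region position, 850 T and 150 C
print(heteroplasmy_fraction("T" * 850 + "C" * 150))  # 0.15
```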

  10. Emotional processing in a non-clinical psychosis-prone sample.

    PubMed

    van 't Wout, Mascha; Aleman, André; Kessels, Roy P C; Larøi, Frank; Kahn, René S

    2004-06-01

    Symptoms of psychosis have been proposed to form part of a continuous distribution of experiences in the general population rather than being an all-or-nothing phenomenon. Indeed, schizotypal signs have been reported in subjects from non-clinical samples. Emotional processing has been documented to be deficient in schizophrenia. In the present study, we tested the hypothesis that putatively psychosis-prone subjects would show abnormalities in emotion processing. Based on the extremes of Launay-Slade Hallucination Scale (LSHS) ratings of 200 undergraduate students, two groups of subjects (total N=40) were selected. All 40 participants filled in the Schizotypal Personality Questionnaire (SPQ). We compared both groups on an alexithymia questionnaire and on four behavioral emotional information processing tasks. Hallucination-proneness was associated with increased subjective emotional arousal and fantasy-proneness. Although no differences between the high and low groups were observed on three behavioral emotion processing tasks, on the affective word-priming task, presentation of emotional stimuli was associated with longer reaction times to neutral words in high-schizotypal subjects. Also, SPQ scores correlated with several emotion processing tasks. We conclude that these findings lend partial support to the hypothesis of continuity between symptoms characteristic of schizophrenia and psychosis-related phenomena in the normal population. PMID:15099609

  11. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1436.602-5 Short selection processes... shall be obtained prior to the utilization of either of the short selection processes used for...

  12. L. monocytogenes in a cheese processing facility: Learning from contamination scenarios over three years of sampling.

    PubMed

    Rückerl, I; Muhterem-Uyar, M; Muri-Klinger, S; Wagner, K-H; Wagner, M; Stessl, B

    2014-10-17

    The aim of this study was to analyze the changing patterns of Listeria monocytogenes contamination in a cheese processing facility manufacturing a wide range of ready-to-eat products. Characterization of L. monocytogenes isolates included genotyping by pulsed-field gel electrophoresis (PFGE) and multi-locus sequence typing (MLST). Disinfectant-susceptibility tests and the assessment of L. monocytogenes survival in fresh cheese were also conducted. During the sampling period between 2010 and 2013, a total of 1284 environmental samples were investigated. Overall occurrence rates of Listeria spp. and L. monocytogenes were 21.9% and 19.5%, respectively. Identical L. monocytogenes genotypes were found in the food processing environment (FPE), raw materials and products. Interventions after the sampling events changed contamination scenarios substantially. The high diversity of globally widespread L. monocytogenes genotypes was reduced by identifying the major sources of contamination. Although susceptible to a broad range of disinfectants and cleaners, one dominant L. monocytogenes sequence type (ST) 5 could not be eradicated from drains and floors. Notably, intense humidity and steam were observed in all rooms, and water residues were visible on floors due to intensified cleaning strategies. This could explain the high L. monocytogenes contamination of the FPE (drains, shoes and floors) throughout the study (15.8%). The outcome of a challenge experiment in fresh cheese showed that L. monocytogenes could survive after 14 days of storage at insufficient cooling temperatures (8 and 16°C). All efforts to reduce L. monocytogenes environmental contamination eventually led to a transition from dynamic to stable contamination scenarios. Consequently, implementation of systematic environmental monitoring via in-house systems should either aim for total avoidance of FPE colonization, or emphasize a first reduction of L. monocytogenes to sites where

  13. Sample Processing Impacts the Viability and Cultivability of the Sponge Microbiome

    PubMed Central

    Esteves, Ana I. S.; Amer, Nimra; Nguyen, Mary; Thomas, Torsten

    2016-01-01

    Sponges host complex microbial communities of recognized ecological and biotechnological importance. Extensive cultivation efforts have been made to isolate sponge bacteria, but most still elude cultivation. To identify the bottlenecks of sponge bacterial cultivation, we combined high-throughput 16S rRNA gene sequencing with a variety of cultivation media and incubation conditions. We aimed to determine the extent to which sample processing and cultivation conditions can impact bacterial viability and recovery in culture. We isolated 325 sponge bacteria from six specimens of Cymbastela concentrica and three specimens of Scopalina sp. These isolates were distributed over 37 different genera and 47 operational taxonomic units (defined at 97% 16S rRNA gene sequence identity). The cultivable bacterial community was highly specific to its sponge host and different media compositions yielded distinct microbial isolates. Around 97% of the isolates could be detected in the original sponge and represented a large but highly variable proportion (0.5–92% total abundance, depending on sponge species) of viable bacteria obtained after sample processing, as determined by propidium monoazide selective DNA modification of compromised cells. Our results show that the most abundant viable bacteria are also the most predominant groups found in cultivation, reflecting, to some extent, the relative abundances of the viable bacterial community, rather than the overall community estimated by direct molecular approaches. Cultivation is therefore shaped not only by the growth conditions provided, but also by the different cell viabilities of the bacteria that constitute the cultivation inoculum. These observations highlight the need to perform experiments to assess each method of sample processing for its accurate representation of the actual in situ bacterial community and its yield of viable cells. PMID:27242673

  15. An approach for sampling solid heterogeneous waste at the Hanford Site waste receiving and processing and solid waste projects

    SciTech Connect

    Sexton, R.A.

    1993-03-01

    This paper addresses the problem of obtaining meaningful data from samples of solid heterogeneous waste while maintaining sample rates as low as practical. The Waste Receiving and Processing Facility, Module 1, at the Hanford Site in south-central Washington State will process mostly heterogeneous solid wastes. The presence of hazardous materials is documented for some packages and unknown for others. Waste characterization is needed to segregate the waste, meet waste acceptance and shipping requirements, and meet facility permitting requirements. Sampling and analysis are expensive, and no amount of sampling will produce absolute certainty of waste contents. A sampling strategy is proposed that provides acceptable confidence with achievable sampling rates.
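
    One simple way to frame "acceptable confidence at achievable sampling rates" is the smallest random sample that detects at least one contaminated package with a chosen confidence. The sketch below assumes a binomial model and an illustrative contamination fraction; it is not the facility's actual strategy.

```python
import math

def min_samples(p_contaminated: float, confidence: float) -> int:
    """Smallest n with P(at least one contaminated item in n draws) >= confidence,
    assuming each draw independently hits a contaminated package with
    probability p_contaminated (binomial approximation)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_contaminated))

print(min_samples(0.10, 0.95))  # 29 packages
```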

  16. Directionally Solidified Aluminum - 7 wt% Silicon Alloys: Comparison of Earth and International Space Station Processed Samples

    NASA Technical Reports Server (NTRS)

    Grugel, Richard N.; Tewari, Surendra; Rajamure, R. S.; Erdman, Robert; Poirier, David

    2012-01-01

    Primary dendrite arm spacings of Al-7 wt% Si alloy directionally solidified in the low-gravity environment of space (MICAST-6 and MICAST-7: thermal gradient approx. 19 to 26 K/cm, growth speeds varying from 5 to 50 microns/s) show good agreement with the Hunt-Lu model. Primary dendrite trunk diameters of the ISS-processed samples show a good fit with a simple analytical model, proposed here, based on Kirkwood's approach. Natural convection (a) decreases primary dendrite arm spacing and (b) appears to increase primary dendrite trunk diameter.
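
    As a hedged first-order check on the trends above, primary spacing is often approximated by the classic scaling lambda_1 proportional to G^(-1/2) V^(-1/4); the prefactor A below is a hypothetical fitting constant, not a value fitted to the MICAST samples.

```python
def primary_spacing_um(G_K_per_cm: float, V_um_per_s: float, A: float = 800.0) -> float:
    """Primary dendrite arm spacing (microns) from the G^-1/2 V^-1/4 scaling.
    A is a hypothetical alloy-dependent constant, not a MICAST-fitted value."""
    return A * G_K_per_cm ** -0.5 * V_um_per_s ** -0.25

# spacing decreases with both thermal gradient and growth speed
print(primary_spacing_um(20.0, 5.0) > primary_spacing_um(20.0, 50.0))  # True
```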

  17. DEFENSE WASTE PROCESSING FACILITY ANALYTICAL METHOD VERIFICATION FOR THE SLUDGE BATCH 5 QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Edwards, T.; Ajo, H.

    2008-07-25

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem Method, see Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 5 (SB5) SRAT Receipt and SB5 SRAT Product samples. The SB5 SRAT Receipt and SB5 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB5 Batch composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 4 (SB4), to form the SB5 Blend composition. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element in the sludge or used to estimate ratios of compounds in the sludge. A statistical comparison of the data validates the use of the DWPF CC method for SB5 Batch composition. However, the difficulty that was encountered in using the CC method for SB4 brings into question the adequacy of CC for the SB5 Blend. Also, it should be noted that visible solids remained in the final diluted solutions of all samples digested by this method at SRNL (8 samples total), which is typical for the DWPF CC method but not seen in the other methods. Recommendations to the DWPF for application to SB5 based on studies to date: (1) A dissolution study should be performed on the WAPS

  18. Split-screen display system and standardized methods for ultrasound image acquisition and multi-frame data processing

    NASA Technical Reports Server (NTRS)

    Selzer, Robert H. (Inventor); Hodis, Howard N. (Inventor)

    2011-01-01

    A standardized acquisition methodology assists operators to accurately replicate high resolution B-mode ultrasound images obtained over several spaced-apart examinations utilizing a split-screen display in which the arterial ultrasound image from an earlier examination is displayed on one side of the screen while a real-time "live" ultrasound image from a current examination is displayed next to the earlier image on the opposite side of the screen. By viewing both images, whether simultaneously or alternately, while manually adjusting the ultrasound transducer, an operator is able to bring into view the real-time image that best matches a selected image from the earlier ultrasound examination. Utilizing this methodology, dynamic material properties of arterial structures, such as IMT and diameter, are measured in a standard region over successive image frames. Each frame of the sequence has its echo edge boundaries automatically determined by using the immediately prior frame's true echo edge coordinates as initial boundary conditions. Computerized echo edge recognition and tracking over multiple successive image frames enhances measurement of arterial diameter and IMT and allows for improved vascular dimension measurements, including vascular stiffness and IMT determinations.
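
    The frame-to-frame tracking idea (each frame's edge search initialized from the previous frame's detected edge coordinates) can be sketched as below; the function names and the simple "brightest row near the prior edge" detector are illustrative assumptions, not the patented method.

```python
import numpy as np

def track_edges(frames: np.ndarray, first_edge, window: int = 3) -> np.ndarray:
    """frames: (n_frames, rows, cols); first_edge: edge row index per column.
    Each frame's edge is searched only in a band around the previous frame's edge."""
    edges = [np.asarray(first_edge, dtype=int)]
    rows = frames.shape[1]
    for frame in frames[1:]:
        prev, new = edges[-1], []
        for col, r0 in enumerate(prev):
            lo, hi = max(0, r0 - window), min(rows, r0 + window + 1)
            new.append(lo + int(np.argmax(frame[lo:hi, col])))  # brightest row nearby
        edges.append(np.asarray(new))
    return np.stack(edges)
```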

  19. Acquisition of the linearization process in text composition in third to ninth graders: effects of textual superstructure and macrostructural organization.

    PubMed

    Favart, Monik; Coirier, Pierre

    2006-07-01

    Two complementary experiments analyzed the acquisition of text content linearization in writing, in French-speaking participants from third to ninth grades. In both experiments, a scrambled text paradigm was used: eleven ideas presented in random order had to be rearranged coherently so as to compose a text. Linearization was analyzed on the basis of the conceptual ordering of ideas and writing fluency. The first experiment focused on the effect of superstructural facilitation (in decreasing order: 1--instructional, 2--narrative, 3--argumentative), while the second experiment studied the effect of prewriting conditions: 1-scrambled presentation, 2--macrostructural facilitation, 3--ideas given in optimal order (control condition). As expected, scores in conceptual ordering and writing fluency improved through the grade levels. Students were most successful with respect to conceptual ordering in the instructional superstructure, followed by the narrative and finally the argumentative superstructures. The prewriting assignment also had the expected effect (control better than macrostructural presentation which, in turn, was better than the random order) but only with the argumentative superstructure. Contrary to conceptual ordering, writing fluency was not affected by the type of superstructure, although we did record an effect of the prewriting condition. The results are discussed in light of Bereiter and Scardamalia's knowledge transforming strategy (1987) taking into account cognitive development and French language curriculum.

  20. Solar Ion Processing of Itokawa Grains: Reconciling Model Predictions with Sample Observations

    NASA Technical Reports Server (NTRS)

    Christoffersen, Roy; Keller, L. P.

    2014-01-01

    Analytical TEM observations of Itokawa grains reported to date show complex solar wind ion processing effects in the outer 30-100 nm of pyroxene and olivine grains. The effects include loss of long-range structural order, formation of isolated internal cavities or "bubbles", and other nanoscale compositional/microstructural variations. None of the effects so far described have, however, included complete ion-induced amorphization. To link the array of observed relationships to grain surface exposure times, we have adapted our previous numerical model for progressive solar ion processing effects in lunar regolith grains to the Itokawa samples. The model uses SRIM ion collision damage and implantation calculations within a framework of a constant-deposited-energy model for amorphization. Inputs include experimentally measured amorphization fluences, a π-steradian variable ion-incidence geometry required for a rotating asteroid, and a numerical flux-versus-velocity solar wind spectrum.
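
    To first order, a constant-deposited-energy criterion reduces to comparing accumulated ion fluence against an experimentally measured critical (amorphization) fluence, which converts directly to an exposure time. The numbers below are illustrative assumptions, not the model's actual SRIM-derived inputs.

```python
def exposure_time_years(critical_fluence: float, flux: float) -> float:
    """Years of exposure needed to reach the amorphization fluence.
    critical_fluence in ions/cm^2, flux in ions/cm^2/s (both illustrative)."""
    seconds_per_year = 3.156e7
    return critical_fluence / flux / seconds_per_year

# e.g. 1e17 ions/cm^2 critical fluence at an effective flux of 3e8 ions/cm^2/s
print(round(exposure_time_years(1e17, 3e8), 1))  # ~10.6 years
```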

  1. Contribution of working memory processes to relational matching-to-sample performance in baboons (Papio papio).

    PubMed

    Maugard, Anaïs; Marzouki, Yousri; Fagot, Joël

    2013-11-01

    Recent studies of monkeys and apes have shown that these animals can solve relational-matching-to-sample (RMTS) problems, suggesting basic abilities for analogical reasoning. However, doubts remain as to the actual cognitive strategies adopted by nonhuman primates in this task. Here, we used dual-task paradigms to test 10 baboons in the RMTS problem under three conditions of memory load. Our three test conditions allowed different predictions, depending on the strategy (i.e., flat memorization of the percept, reencoding of the percept, or relational processing) that they might use to solve RMTS problems. Results support the idea that the baboons process both the items and the abstract (same and different) relations in this task.

  2. Merger and acquisition medicine.

    PubMed

    Powell, G S

    1997-01-01

    This discussion of the ramifications of corporate mergers and acquisitions for employees recognizes that employee adaptation to the change can be a long and complex process. The author describes a role the occupational physician can take in helping to minimize the potential adverse health impact of major organizational change.

  3. Second Language Acquisition.

    ERIC Educational Resources Information Center

    McLaughlin, Barry; Harrington, Michael

    1989-01-01

    A distinction is drawn between representational and processing models of second-language acquisition. The first approach is derived primarily from linguistics, the second from psychology. Both fields, it is argued, need to collaborate more fully, overcoming disciplinary narrowness in order to achieve more fruitful research. (GLR)

  4. Development of a simple device for processing whole blood samples into measured aliquots of plasma

    SciTech Connect

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1986-01-01

    A capillary processor and aliquoter (CPA) has been designed and fabricated that is capable of accepting aliquots of whole blood and automatically processing them into discrete aliquots of plasma. The device consists of two disks, each of which contains 16 individual capillaries, and a processing rotor. One of the disks accepts larger capillaries, each of which will hold approximately 100 µL of whole blood. The second disk, which accepts 2.54-cm-long precision capillaries of varying internal diameter, provides for exact sample volumes ranging from 1 to 10 µL. The processing rotor consists of 16 individual compartments and chambers to accept both disks. Gravimetric and photometric evaluation of the CPA indicates that it is capable of entraining and delivering microliter volumes of liquids with a degree of precision and accuracy (1 to 2%) approaching that of a state-of-the-art mechanical pipette. In addition, we have demonstrated that aliquots of whole blood can be transferred into the chambers of the processing unit and separated into their cellular and plasma fractions, which can then be analyzed with an acceptable degree of precision (i.e., C.V.s of approximately ±3% for serum enzyme measurements). 15 refs., 6 figs., 4 tbls.
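
    The precision-capillary geometry above is easy to check: a 2.54-cm capillary's volume follows directly from its internal diameter via V = πr²L. The diameters below are illustrative choices that bracket the stated 1-10 µL range, not dimensions from the device.

```python
import math

def capillary_volume_ul(diameter_mm: float, length_cm: float = 2.54) -> float:
    """Volume (µL) of a capillary of given internal diameter (mm) and length (cm)."""
    radius_cm = diameter_mm / 10.0 / 2.0
    return math.pi * radius_cm ** 2 * length_cm * 1000.0  # cm^3 -> µL

print(round(capillary_volume_ul(0.224), 2))  # ~1 µL bore
print(round(capillary_volume_ul(0.708), 2))  # ~10 µL bore
```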

  5. Sampling strategies and post-processing methods for increasing the time resolution of organic aerosol measurements requiring long sample-collection times

    NASA Astrophysics Data System (ADS)

    Modini, Rob L.; Takahama, Satoshi

    2016-07-01

    The composition and properties of atmospheric organic aerosols (OAs) change on timescales of minutes to hours. However, some important OA characterization techniques typically require greater than a few hours of sample-collection time (e.g., Fourier transform infrared (FTIR) spectroscopy). In this study we have performed numerical modeling to investigate and compare sample-collection strategies and post-processing methods for increasing the time resolution of OA measurements requiring long sample-collection times. Specifically, we modeled the measurement of hydrocarbon-like OA (HOA) and oxygenated OA (OOA) concentrations at a polluted urban site in Mexico City, and investigated how to construct hourly resolved time series from samples collected for 4, 6, and 8 h. We modeled two sampling strategies - sequential and staggered sampling - and a range of post-processing methods including interpolation and deconvolution. The results indicated that relative to the more sophisticated and costly staggered sampling methods, linear interpolation between sequential measurements is a surprisingly effective method for increasing time resolution. Additional error can be added to a time series constructed in this manner if a suboptimal sequential sampling schedule is chosen. Staggering measurements is one way to avoid this effect. There is little to be gained from deconvolving staggered measurements, except at very low values of random measurement error (< 5 %). Assuming 20 % random measurement error, one can expect average recovery errors of 1.33-2.81 µg m-3 when using 4-8 h-long sequential and staggered samples to measure time series of concentration values ranging from 0.13-29.16 µg m-3. For 4 h samples, 19-47 % of this total error can be attributed to the process of increasing time resolution alone, depending on the method used, meaning that measurement precision would only be improved by 0.30-0.75 µg m-3 if samples could be collected over 1 h instead of 4 h. Devising a
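
    The simplest post-processing option evaluated above, linear interpolation between sequential samples, can be sketched as follows: each multi-hour sample is assigned to its interval midpoint and interpolated onto an hourly grid. The concentrations below are made-up values, not the Mexico City data.

```python
import numpy as np

sample_start_h = np.array([0, 4, 8, 12])   # start hour of each 4 h collection
conc = np.array([2.0, 6.0, 4.0, 3.0])      # mean OA concentration per sample (µg/m3)
midpoints = sample_start_h + 2.0           # assign each sample to its midpoint

hourly_t = np.arange(2, 15)                # hourly grid within the data range
hourly_conc = np.interp(hourly_t, midpoints, conc)
print(hourly_conc[:3])  # [2. 3. 4.]
```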

  6. Some results of processing NURE geochemical sampling in the northern Rocky Mountain area

    SciTech Connect

    Thayer, P.A.; Cook, J.R.; Price, V. Jr.

    1980-01-01

    The National Uranium Resource Evaluation (NURE) program was begun in the spring of 1973 to evaluate domestic uranium resources in the continental United States and to identify areas favorable for uranium exploration. The significance of the distribution of uranium in natural waters and sediments will be assessed as an indicator of favorable areas for the discovery of uranium deposits. This paper is oriented primarily to the discussion of stream sediments. Data for the Challis 1° × 2° NTMS quadrangle will be used as specific examples of NURE data processing. A high-capacity neutron activation analysis facility at SRL is used to determine uranium and about 19 other elements in hydrogeochemical samples. Evaluation of the areal distributions of uranium ratios demonstrates that most of the high U/Hf, U/Th and U/(Th + Hf) ratios occur scattered throughout the western two-thirds of the quadrangle. Most of the higher ratio values are found in samples taken at sites underlain by granitic rocks of the Idaho batholith or Tertiary-age plutons.
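
    The ratio screening described above (flagging samples with high U/Th or U/Hf) can be sketched as below; the concentrations and the percentile cutoff are illustrative values, not NURE data.

```python
import numpy as np

u  = np.array([2.1, 3.5, 14.0, 2.8, 9.2])   # U, ppm (made-up values)
th = np.array([8.0, 9.5, 10.1, 7.7, 8.8])   # Th, ppm
hf = np.array([4.2, 5.0, 4.8, 4.4, 4.6])    # Hf, ppm

u_th, u_hf = u / th, u / hf
cutoff = np.percentile(u_th, 80)            # flag the top of the distribution
favorable = np.flatnonzero(u_th > cutoff)   # candidate sites for follow-up
print(favorable)  # [2]
```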

  7. Application of a Dual-Arm Robot in Complex Sample Preparation and Measurement Processes.

    PubMed

    Fleischer, Heidi; Drews, Robert Ralf; Janson, Jessica; Chinna Patlolla, Bharath Reddy; Chu, Xianghua; Klos, Michael; Thurow, Kerstin

    2016-10-01

    Automation systems with applied robotics have already been established in industrial applications for many years. In the field of life sciences, a comparably high level of automation can be found in the areas of bioscreening and high-throughput screening. Strong deficits still exist in the development of flexible and universal fully automated systems in the field of analytical measurement. The reasons are the heterogeneous processes with complex structures, which include sample preparation and transport, analytical measurements using complex sensor systems, and suitable data analysis and evaluation. Furthermore, the use of nonstandard sample vessels with various shapes and volumes results in increased complexity. The direct use of existing automation solutions from bioscreening applications is not possible. A flexible automation system for sample preparation, analysis, and data evaluation is presented in this article. It is applied to the determination of cholesterol in biliary endoprostheses using gas chromatography-mass spectrometry (GC-MS). A dual-arm robot performs both transport and active manipulation tasks to ensure human-like operation. This general robotic concept also enables the use of manual laboratory devices and equipment and is thus suitable in areas with a high standardization grade.

  8. Data processing pipeline for a time-sampled imaging Fourier transform spectrometer

    NASA Astrophysics Data System (ADS)

    Naylor, David A.; Fulton, Trevor R.; Davis, Peter W.; Chapman, Ian M.; Gom, Brad G.; Spencer, Locke D.; Lindner, John V.; Nelson-Fitzpatrick, Nathan E.; Tahic, Margaret K.; Davis, Gary R.

    2004-10-01

    Imaging Fourier transform spectrometers (IFTS) are becoming the preferred systems for remote sensing spectral imaging applications because of their ability to provide, simultaneously, both high spatial and spectral resolution images of a scene. IFTS can be operated in either step-and-integrate or rapid-scan modes, where it is common practice to sample interferograms at equal optical path difference intervals. The step-and-integrate mode requires a translation stage with fast and precise point-to-point motion and additional external trigger circuitry for the detector focal plane array (FPA), and produces uniformly position-sampled interferograms which can be analyzed using standard FFT routines. In the rapid-scan mode, the translation stage is continuously moving and interferograms are often acquired at the frame-rate of the FPA. Since all translation stages have associated velocity errors, the resulting interferograms are sampled at non-uniform intervals of optical path difference, which requires more sophisticated analysis. This paper discusses the processing pipeline which is being developed for the analysis of the non-uniform rapid-scan data produced by the Herschel/SPIRE IFTS.
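
    The core correction for rapid-scan data, resampling interferogram points taken at non-uniform optical path difference (OPD) onto a uniform grid before a standard FFT, can be sketched as below; the stage jitter and single spectral line are simulated, not Herschel/SPIRE data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
opd_uniform = np.linspace(-0.5, 0.5, n)                     # ideal OPD grid (cm)
opd_actual = np.sort(opd_uniform + rng.normal(0, 2e-4, n))  # stage-velocity jitter
ifgm = np.cos(2 * np.pi * 50 * opd_actual)                  # one line at 50 cm^-1

resampled = np.interp(opd_uniform, opd_actual, ifgm)        # back onto uniform grid
spectrum = np.abs(np.fft.rfft(resampled))
wavenumber = np.fft.rfftfreq(n, d=opd_uniform[1] - opd_uniform[0])
peak = wavenumber[np.argmax(spectrum)]                      # recovered near 50 cm^-1
```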

  9. Microstructural Evaluation and Comparison of Solder Samples Processed Aboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Grugel, R. N.; Hua, F.; Anilkumar, A. V.

    2008-01-01

    Samples from the In-Space Soldering Investigation (ISSI), conducted aboard the International Space Station (ISS), are being examined for post-solidification microstructural development and porosity distribution. In this preliminary study, the internal structures of two ISSI-processed samples are compared. In one case 10 cm of rosin-core solder was wrapped around a coupon wire and melted by conduction, whereas in the other a comparable length of solder was melted directly onto the hot wire; in both cases the molten solder formed ellipsoidal blobs, a shape that was maintained during subsequent solidification. In the former case, there is clear evidence of porosity throughout the sample, and an accumulation of larger pores near the hot end that implies thermocapillary-induced migration and eventual coalescence of the flux vapor bubbles. In the latter case, when solder was fed onto the hot wire, part of the flux constituting the solder core is introduced into, and remains within, the liquid solder ball, becoming entombed upon solidification. In both cases the consequent porosity, particularly at a solder/contact interface, is very undesirable. In addition to compromising the desired electrical and thermal conductivity, it promotes mechanical failure.

  10. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  11. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  12. Effects of Subsurface Sampling & Processing on Martian Simulant Containing Varying Quantities of Water

    NASA Technical Reports Server (NTRS)

    Menard, J.; Sangillo, J.; Savain, A.; McNamara, K. M.

    2004-01-01

    The presence of water-ice in the Martian subsurface is a subject of much debate and excited speculation. Recent results from the gamma-ray spectrometer (GRS) on board NASA's Mars Odyssey spacecraft indicate the presence of large amounts of hydrogen in regions of predicted ice stability. The combination of chemistry, a low gravitational field (3.71 m/s(exp 2)), and a surface pressure of about 6.36 mbar at the mean radius places limits on the stability of H2O on the surface; however, results from the GRS indicate that the hydrogen-rich phase may be present at a depth as shallow as one meter in some locations on Mars. The potential for water on Mars leads directly to the speculation that life may once have existed there, since liquid water is the unifying factor for environments known to support life on Earth. Lubricant-free drilling has been considered as a means of obtaining water-rich subsurface samples on Mars, and two recent white papers sponsored by the Mars Program have attempted to identify the problems associated with this goal. The two major issues identified were: the engineering challenges of drilling into a water-soil mixture where phase changes may occur; and the potential to compromise the integrity of in-situ scientific analysis due to contamination, volatilization, and mineralogical or chemical changes as a result of processing. This study is a first attempt to simulate lubricant-free drilling into JSC Mars-1 simulant containing up to 50% water by weight. The goal is to address the following: 1) Does sample processing cause reactions or changes in mineralogy which will compromise the interpretation of scientific measurements conducted on the surface? 2) Does the presence of water-ice in the sample complicate (1)? 3) Do lubricant-free drilling and processing leave trace contaminants which may compromise our understanding of sample composition? 4) How does the torque/power required for drilling change as a function of water content, and does this lead to

  13. Wavelet data processing of micro-Raman spectra of biological samples

    NASA Astrophysics Data System (ADS)

    Camerlingo, C.; Zenone, F.; Gaeta, G. M.; Riccio, R.; Lepore, M.

    2006-02-01

    A wavelet multi-component decomposition algorithm is proposed for processing data from micro-Raman spectroscopy (μ-RS) of biological tissue. μ-RS has recently been recognized as a promising tool for biopsy testing and in vivo diagnosis of degenerative human tissue pathologies, due to the high chemical and structural information content of this spectroscopic technique. However, measurements of biological tissues are usually hampered by typically low-level signals and by the presence of noise and background components caused by light diffusion or fluorescence processes. In order to overcome these problems, a numerical method based on the discrete wavelet transform is used for the analysis of data from μ-RS measurements performed in vitro on animal (pig and chicken) tissue samples and, in a preliminary form, on human skin and oral tissue biopsies from normal subjects. Visible-light μ-RS was performed using a He-Ne laser and a monochromator with a liquid-nitrogen-cooled charge coupled device equipped with a grating of 1800 grooves/mm. The validity of the proposed data-processing procedure has been tested on the well-characterized Raman spectra of reference acetylsalicylic acid samples.
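    The wavelet denoising idea described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' algorithm: the synthetic "spectrum", the sym8 wavelet, the universal threshold, and the removal of the baseline by zeroing the coarsest approximation are all assumptions for illustration, using NumPy and PyWavelets.

```python
# Illustrative wavelet-based noise and background suppression for a 1-D
# spectrum, in the spirit of the multi-component decomposition described.
import numpy as np
import pywt

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 1024)
# Synthetic Raman-like signal: two narrow peaks on a slow "fluorescence" ramp.
signal = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((x - 0.7) / 0.015) ** 2)
background = 0.5 * x                      # slowly varying baseline
noisy = signal + background + 0.05 * rng.standard_normal(x.size)

# Multi-level discrete wavelet decomposition.
coeffs = pywt.wavedec(noisy, "sym8", level=6)

# Robust noise estimate from the finest detail level, then the
# "universal" threshold applied softly to all detail coefficients.
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thresh = sigma * np.sqrt(2 * np.log(noisy.size))
denoised_coeffs = [np.zeros_like(coeffs[0])]   # drop approximation -> baseline
denoised_coeffs += [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]

recovered = pywt.waverec(denoised_coeffs, "sym8")[: noisy.size]
```

    Zeroing the coarsest approximation discards the slowly varying fluorescence-like baseline, while soft-thresholding the detail coefficients suppresses noise without erasing the narrow peaks.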

  14. Language Acquisition, Pidgins and Creoles.

    ERIC Educational Resources Information Center

    Wode, Henning

    1981-01-01

    Suggests that structural universals among pidgins with different base languages result from universal linguo-cognitive processing strategies employed in language learning. Some of the strategies occur in all types of acquisition, while others are more applicable to L2-type acquisition. Past research is discussed, and some exemplary data are given.…

  15. How Students Learn: The Validation of a Model of Knowledge Acquisition Using Stimulated Recall of the Learning Process.

    ERIC Educational Resources Information Center

    Nuthall, Graham

    A study of students' thinking processes during their engagement in classroom tasks in science and social studies units in upper elementary school classrooms was conducted as part of a series of studies on learning. As a result of previous studies, a theory of the learning process has been developed. A central component of the theory is the…

  16. 77 FR 9617 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ... Supplement (DFARS) to update DoD's voucher processing procedures and better accommodate the use of Wide Area WorkFlow to process vouchers. DATES: Comments on the proposed rule published January 19, 2012, at 77 FR... clarifying the proposed rule published on January 19, 2012 (77 FR 2682), which proposes to...

  17. 77 FR 52258 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... a proposed rule in the Federal Register at 77 FR 2682 on January 19, 2012. The comment period closed... the Wide Area WorkFlow (WAWF) used to process vouchers. DATES: August 29, 2012. FOR FURTHER... rule merely updates DoD's voucher processing procedures and better accommodates the Wide Area...

  18. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AND THE EGG PRODUCTS INSPECTION ACT PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 Regulations Governing Inspection and Certification Sampling § 52... by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations...

  19. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AND THE EGG PRODUCTS INSPECTION ACT PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 Regulations Governing Inspection and Certification Sampling § 52... by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations...

  20. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... PRODUCTS INSPECTION ACT PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 Regulations Governing Inspection and Certification Sampling § 52.38c Statistical... processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of...

  1. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... PRODUCTS INSPECTION ACT PROCESSED FRUITS AND VEGETABLES, PROCESSED PRODUCTS THEREOF, AND CERTAIN OTHER PROCESSED FOOD PRODUCTS 1 Regulations Governing Inspection and Certification Sampling § 52.38c Statistical... processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of...

  2. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for...

  3. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for...

  4. CHARACTERIZATION OF A PRECIPITATE REACTOR FEED TANK (PRFT) SAMPLE FROM THE DEFENSE WASTE PROCESSING FACILITY (DWPF)

    SciTech Connect

    Crawford, C.; Bannochie, C.

    2014-05-12

    A sample from the Defense Waste Processing Facility (DWPF) Precipitate Reactor Feed Tank (PRFT) was pulled and sent to the Savannah River National Laboratory (SRNL) in June of 2013. The PRFT in DWPF receives Actinide Removal Process (ARP)/Monosodium Titanate (MST) material from the 512-S Facility via the 511-S Facility. This 2.2 L sample was to be used in small-scale DWPF chemical process cell testing in the Shielded Cells Facility of SRNL. A 1 L sub-sample portion was characterized to determine physical properties such as weight percent solids, density, particle size distribution, and crystalline phase identification. Further chemical analysis of the PRFT filtrate and dissolved slurry included metals and anions as well as carbon and base analysis. This technical report describes the characterization and analysis of the PRFT sample from DWPF. At SRNL, the 2.2 L PRFT sample was composited from eleven separate samples received from DWPF. The visible solids were observed to be relatively quick-settling, which allowed for the rinsing of the original shipping vials with PRFT supernate on the same day as compositing. Most analyses were performed in triplicate, except for particle size distribution (PSD), X-ray diffraction (XRD), scanning electron microscopy (SEM), and thermogravimetric analysis (TGA). PRFT slurry samples were dissolved using a mixed HNO3/HF acid for subsequent Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES) and Inductively Coupled Plasma Mass Spectroscopy (ICP-MS) analyses performed by SRNL Analytical Development (AD). Per the task request for this work, analysis of the PRFT slurry and filtrate for metals, anions, carbon, and base was primarily performed to support the planned chemical process cell testing and to provide component concentrations in addition to the limited data available from DWPF. Analysis of the insoluble solids portion of the PRFT slurry was aimed at detailed characterization of these solids (TGA, PSD

  5. The Osiris-Rex Mission - Sample Acquisitions Strategy and Evidence for the Nature of Regolith on Asteroid (101955) 1999 RQ36

    NASA Technical Reports Server (NTRS)

    Lauretta, D. S.; Barucci, M. A.; Bierhaus, E. B.; Brucato, J. R.; Campins, H.; Christensen, P. R.; Clark, B. C.; Connolly, H. C.; Dotto, E.; Dworkin, J. P.; Emery, J.; Garvin, J. B.; Hildebrand, A. R.; Libourel, G.; Marshall, J. R.; Michel, P.; Nolan, M. C.; Nuth, J. A.; Rizk, B.; Sandford, S. A.; Scheeres, D. J.

    2012-01-01

    NASA selected the OSIRIS-REx Asteroid Sample Return Mission as the third New Frontiers mission in May 2011 [1]. The mission name is an acronym that captures the scientific objectives: Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer. OSIRIS-REx will characterize near-Earth asteroid (101955) 1999 RQ36, which is both the most accessible carbonaceous asteroid [2,3] and one of the most potentially hazardous asteroids known [4]. The primary objective of the mission is to return a pristine sample from this body to advance our understanding of the generation, evolution, and maturation of regolith on small bodies.

  6. Contribution of Sample Processing to Variability and Accuracy of the Results of Pesticide Residue Analysis in Plant Commodities.

    PubMed

    Ambrus, Árpád; Buczkó, Judit; Hamow, Kamirán Á; Juhász, Viktor; Solymosné Majzik, Etelka; Szemánné Dobrik, Henriett; Szitás, Róbert

    2016-08-10

    Significant reduction in the concentration of some pesticide residues, and a substantial increase in the uncertainty of the results, arising from the homogenization of sample materials have long been reported in the scientific literature. Nevertheless, the performance of methods is frequently evaluated on the basis of recovery tests alone, which exclude sample processing. We studied the effect of sample processing on the accuracy and uncertainty of the measured residue values with lettuce, tomato, and maize grain samples, applying mixtures of selected pesticides. The results indicate that the method is simple and robust and applicable in any pesticide residue laboratory. The analytes remaining in the final extract are influenced by their physical-chemical properties, the nature of the sample material, the temperature of comminution of the sample, and the mass of the test portion extracted. Consequently, validation protocols should include testing the effect of sample processing, and the performance of the complete method should be regularly checked within internal quality control. PMID:26755282

  7. Collection and processing of plant, animal and soil samples from Bikini, Enewetak and Rongelap Atolls

    SciTech Connect

    Stuart, M.L.

    1995-09-01

    The United States used the Marshall Islands as its nuclear weapons testing site from 1946 to 1958. The BRAVO test was detonated at Bikini Atoll on March 1, 1954. Due to shifting wind conditions at the time of the nuclear detonation, many of the surrounding atolls became contaminated with fallout (radionuclides carried by the wind currents). Lawrence Livermore National Laboratory's (LLNL) Marshall Islands Project has been responsible for collecting, processing, and analyzing food crops, vegetation, soil, water, animals, and marine species to characterize the radionuclides in the environment and to estimate dose at atolls that may have been contaminated. Tropical agriculture experiments on reducing the uptake of {sup 137}Cs have been conducted on Bikini Atoll. The Marshall Islands field team and laboratory processing team play an important role in the overall scheme of the Marshall Islands Dose Assessment and Radioecology Project. This report gives a general description of the Marshall Islands field sampling and laboratory processing procedures currently used by our staff.

  8. Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil

    USGS Publications Warehouse

    Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W

    2016-01-01

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included identifying an ideal extraction diluent and varying the number of wash steps, the initial centrifugation speed, and the sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol, with an approximate matrix limit of detection of 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.

  9. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process for... architect-engineer firms in accordance with 36.602-3, except that the selection report shall serve as...

  10. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-5 Short selection process for....602-5 may be used to select firms for architect-engineer contracts that are not expected to exceed...

  11. Hospitals changing their buying habits. Overhauled technology-acquisition processes help equip facilities to make prudent purchases.

    PubMed

    Wagner, M

    1990-11-26

    As hospitals face increasing pressure to rein in costs, equipment spending faces stiff competition for limited funds. When facilities replace aging or outdated equipment, they're often replacing the entire technology assessment process as well. One hospital facing a $4 million bill to equip a new building is revamping its purchasing process based on department "wish lists." And an Ohio system has formed a special division to speed assessment and implementation of new technologies and procedures.

  12. Field guide for collecting and processing stream-water samples for the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Shelton, Larry R.

    1994-01-01

    The U.S. Geological Survey's National Water-Quality Assessment program includes extensive data-collection efforts to assess the quality of the Nation's streams. These studies require analyses of stream samples for major ions, nutrients, sediments, and organic contaminants. For the information to be comparable among studies in different parts of the Nation, consistent procedures specifically designed to produce uncontaminated samples for trace analysis in the laboratory are critical. This field guide describes the standard procedures for collecting and processing samples for major ions, nutrients, organic contaminants, and sediment, and for field analyses of conductivity, pH, alkalinity, and dissolved oxygen. Samples are collected and processed using modified and newly designed equipment made of Teflon to avoid contamination, including nonmetallic samplers (D-77 and DH-81) and a Teflon sample splitter. Field solid-phase extraction procedures developed to process samples for organic constituent analyses produce an extracted sample with stabilized compounds for more accurate results. Improvements to standard operational procedures include the use of processing chambers and capsule filtering systems. A modified collecting and processing procedure for organic carbon is designed to avoid contamination from equipment cleaned with methanol. Quality assurance is maintained by strict collecting and processing procedures, replicate sampling, equipment blank samples, and a rigid cleaning procedure using detergent, hydrochloric acid, and methanol.

  13. Materials processing issues for non-destructive laser gas sampling (NDLGS)

    SciTech Connect

    Lienert, Thomas J

    2010-12-09

    The Non-Destructive Laser Gas Sampling (NDLGS) process essentially involves three steps: (1) laser drilling through the top of a crimped tube made of 304L stainless steel (Hammar and Svensson Cr{sub eq}/Ni{sub eq} = 1.55, produced in 1985); (2) gas sampling; and (3) laser re-welding of the crimp. All three steps are performed in a sealed chamber with a fused silica window under controlled vacuum conditions. Quality requirements for successful processing call for a hermetic re-weld with no cracks or other defects in the fusion zone or HAZ. It has been well established that austenitic stainless steels ({gamma}-SS), such as 304L, can suffer from solidification cracking if their Cr{sub eq}/Ni{sub eq} is below a critical value that causes solidification to occur as austenite (fcc structure) and their combined impurity level (%P+%S) is above {approx}0.02%. Conversely, for Cr{sub eq}/Ni{sub eq} values above the critical level, solidification occurs as ferrite (bcc structure), and cracking propensity is greatly reduced at all combined impurity levels. The consensus of results from studies by several researchers starting in the late 1970s indicates that the critical Cr{sub eq}/Ni{sub eq} value is {approx}1.5 for arc welds. However, more recent studies by the author and others show that the critical Cr{sub eq}/Ni{sub eq} value increases to {approx}1.6 for weld processes with very rapid thermal cycles, such as the pulsed Nd:YAG laser beam welding (LBW) process used here. Initial attempts at NDLGS using pulsed LBW resulted in considerable solidification cracking, consistent with the results of the work discussed above. After a brief introduction to the welding metallurgy of {gamma}-SS, this presentation will review the results of a study aimed at developing a production-ready process that eliminates cracking. 
The solution to the cracking issue, developed at LANL, involved locally augmenting the Cr content by applying either Cr or a Cr-rich stainless steel (ER 312) to the top of
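    The Cr{sub eq}/Ni{sub eq} criterion discussed above lends itself to a small worked calculation. This is an illustrative sketch: the coefficients below are the commonly quoted Hammar-Svensson (1979) equivalents, the example composition (wt%) is hypothetical, and the ~1.5 (arc) and ~1.6 (pulsed laser) thresholds are taken from the abstract.

```python
# Chromium/nickel equivalent ratio for judging solidification-cracking
# susceptibility of austenitic stainless steel welds (illustrative only;
# coefficients per the commonly quoted Hammar-Svensson equivalents).
def creq_nieq_ratio(Cr, Mo, Si, Nb, Ti, Ni, Mn, C, N, Cu):
    cr_eq = Cr + 1.37 * Mo + 1.5 * Si + 2.0 * Nb + 3.0 * Ti
    ni_eq = Ni + 0.31 * Mn + 22.0 * C + 14.2 * N + Cu
    return cr_eq / ni_eq

# Hypothetical 304L-like heat (wt%):
ratio = creq_nieq_ratio(Cr=18.0, Mo=0.2, Si=0.4, Nb=0.0, Ti=0.0,
                        Ni=9.5, Mn=1.7, C=0.025, N=0.08, Cu=0.25)
# ratio ~ 1.58: above the ~1.5 threshold cited for arc welds, but below
# the ~1.6 cited for the rapid thermal cycles of pulsed LBW, so this heat
# would be predicted crack-susceptible under pulsed laser welding.
```

    This makes concrete why a heat that welds acceptably with an arc process can still crack under pulsed laser welding, and why locally augmenting Cr (raising Cr{sub eq}) shifts the weld back into the ferrite-first, crack-resistant regime.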

  14. XSTREAM: A Highly Efficient High Speed Real-time Satellite Data Acquisition and Processing System using Heterogeneous Computing

    NASA Astrophysics Data System (ADS)

    Pramod Kumar, K.; Mahendra, P.; Ramakrishna Reddy, V.; Tirupathi, T.; Akilan, A.; Usha Devi, R.; Anuradha, R.; Ravi, N.; Solanki, S. S.; Achary, K. K.; Satish, A. L.; Anshu, C.

    2014-11-01

    In the last decade, the remote sensing community has observed a significant growth in the number of satellites, sensors, and their resolutions, thereby increasing the volume of data to be processed each day. Satellite data processing is a complex and time-consuming activity. It consists of various tasks, such as decode, decrypt, decompress, radiometric normalization, stagger corrections, ephemeris data processing for geometric corrections, etc., and finally writing of the product in the form of an image file. Each task in the processing chain is sequential in nature and has different computing needs. Conventionally, the processes are cascaded in a well-organized workflow to produce the data products, which are executed on general-purpose high-end servers/workstations in an offline mode. Hence, these systems are considered to be ineffective for real-time applications that require quick response and just-in-time decision making, such as disaster management, homeland security, and so on. This paper discusses a novel approach to process the data online (as the data is being acquired) using a heterogeneous computing platform, namely XSTREAM, which has COTS hardware of CPUs, GPUs, and FPGA. The paper focuses on the process architecture, re-engineering aspects, and mapping of tasks to the right computing device within the XSTREAM system, which makes it an ideal cost-effective platform for acquiring and processing satellite payload data in real time and displaying the products in original resolution for quick response. The system has been tested for the IRS CARTOSAT and RESOURCESAT series of satellites, which have a maximum data downlink speed of 210 Mbps.
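    The contrast between offline batch processing and the online chain described above can be illustrated with a minimal staged pipeline, where each stage consumes frames as they arrive rather than waiting for the whole pass. The stage names and trivial stand-in transforms are hypothetical; this is not the XSTREAM implementation.

```python
# Minimal streaming pipeline: frames flow through decode -> correct
# stages concurrently via FIFO queues, processed as they are "acquired".
import queue
import threading

def stage(fn, q_in, q_out):
    """Pull items from q_in, apply fn, push results; None is the sentinel."""
    while True:
        item = q_in.get()
        if item is None:
            q_out.put(None)
            break
        q_out.put(fn(item))

decode = lambda frame: frame * 2        # stand-in for decode/decompress
correct = lambda frame: frame + 1       # stand-in for radiometric correction

q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=stage, args=(decode, q0, q1)),
    threading.Thread(target=stage, args=(correct, q1, q2)),
]
for t in threads:
    t.start()

for frame in range(5):                  # frames arrive as they are acquired
    q0.put(frame)
q0.put(None)                            # end-of-stream sentinel

products = []
while (out := q2.get()) is not None:
    products.append(out)
for t in threads:
    t.join()
# products == [1, 3, 5, 7, 9]
```

    Because every stage runs concurrently on its own worker, the first product is available as soon as the first frame has passed through the chain, which is the essential property a real-time acquisition system needs.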

  15. Sample processing and cDNA preparation for microbial metatranscriptomics in complex soil communities.

    PubMed

    Carvalhais, Lilia C; Schenk, Peer M

    2013-01-01

    Soil presents one of the most complex environments for microbial communities, as it provides many microhabitats that allow coexistence of thousands of species with important ecosystem functions. These include biomass and nutrient cycling, mineralization, and detoxification. Culture-independent DNA-based methods, such as metagenomics, have revealed operational taxonomic units that suggest a high diversity of microbial species and associated functions in soil. An emerging but technically challenging approach to profiling the functions and activities of microorganisms is mRNA-based metatranscriptomics. Here, we describe issues and important considerations in soil sample processing and cDNA preparation for bacterial and archaeal metatranscriptomics and provide a set of methods that can be used in the required experimental steps.

  16. A Research on Second Language Acquisition and College English Teaching

    ERIC Educational Resources Information Center

    Li, Changyu

    2009-01-01

    It was in the 1970s that American linguist S.D. Krashen created the theory of "language acquisition". The theories on second language acquisition were proposed based on the study on the second language acquisition process and its rules. Here, the second language acquisition process refers to the process in which a learner with the…

  17. Single-Image Super-Resolution Using Active-Sampling Gaussian Process Regression.

    PubMed

    Wang, Haijun; Gao, Xinbo; Zhang, Kaibing; Li, Jie

    2016-02-01

    As is well known, Gaussian process regression (GPR) has been successfully applied to example learning-based image super-resolution (SR). Despite its effectiveness, the applicability of a GPR model is limited by its considerable computational cost when a large number of examples are available to a learning task. To alleviate this problem of GPR-based SR, we propose a novel example learning-based SR method, called active-sampling GPR (AGPR). The newly proposed approach employs an active learning strategy to heuristically select more informative samples for training the regression parameters of the GPR model, which significantly improves computational efficiency while maintaining a higher quality of the reconstructed image. Finally, we suggest an accelerating scheme to further reduce the time complexity of the proposed AGPR-based SR by using a pre-learned projection matrix. We objectively and subjectively demonstrate that the proposed method is superior to other competitors for producing much sharper edges and finer details. PMID:26841394
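    The core active-sampling loop — greedily adding the pool point with the largest predictive variance to the training set — can be sketched with scikit-learn. This is a toy illustration of the general idea, not the paper's AGPR method: the 1-D target function, kernel, seed points, and labeling budget are all assumptions.

```python
# Active sampling for GPR: iteratively label the most uncertain point.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X_pool = np.linspace(0, 10, 200).reshape(-1, 1)
y_pool = np.sin(X_pool).ravel()

# Start from a few random samples, then greedily add uncertain points.
chosen = list(rng.choice(len(X_pool), size=3, replace=False))
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)

for _ in range(12):                     # small labeling budget
    gpr.fit(X_pool[chosen], y_pool[chosen])
    _, std = gpr.predict(X_pool, return_std=True)
    std[chosen] = -np.inf               # don't re-pick already-used samples
    chosen.append(int(np.argmax(std)))  # most informative candidate

gpr.fit(X_pool[chosen], y_pool[chosen])
pred = gpr.predict(X_pool)
rmse = float(np.sqrt(np.mean((pred - y_pool) ** 2)))
```

    Training on 15 actively chosen points instead of all 200 keeps the cubic cost of GPR fitting small, which is the efficiency argument behind active sampling.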

  18. A modified protocol for the comet assay allowing the processing of multiple samples.

    PubMed

    Zhang, Lai-Jun; Jia, Jing-Fen; Hao, Jian-Guo; Cen, Ju-Ren; Li, Tian-Ke

    2011-04-01

    In the present study, we developed a modified protocol for the basic comet assay that increased efficiency without sacrificing assay reliability. A spreader was used to spread agarose-embedded cells on a slide, making the manipulation and processing of multiple samples easier. Using this technique, we are able to rapidly prepare five or more comet assay samples on one slide. To demonstrate the effect of the protocol modifications on assay reliability, we present an example of how the comet assay was used in our laboratory to analyze the effect of melatonin (N-acetyl-5-methoxytryptamine; MEL) on the DNA repair ability of Gentiana macrophylla Pall. protoplasts after irradiation with different doses of ultraviolet-B radiation. A slight, but statistically significant (P<0.01), dose-related protective effect of MEL was observed in our experiments. This represents the first use of the comet assay to confirm the antioxidant and DNA-repair functions of MEL in plants. The modified protocol is cost-effective and provides substantial advantages over the conventional comet assay.

  19. Application of enhanced sampling methods to mineral nucleation and growth processes

    NASA Astrophysics Data System (ADS)

    Wallace, A. F.

    2013-12-01

    Mineral nucleation and growth are amongst the most critical processes occurring in natural environments. However, even with high-resolution in situ techniques such as Atomic Force Microscopy (AFM), mechanistic details must typically be inferred from kinetic measurements. Computational methods are potentially powerful tools which may assist in understanding aspects of mineral reactivity; however, in practice, standard approaches effectively probe only those processes whose associated activation barriers are comparable to the ambient thermal energy (kBT). Therefore, due to inherent limitations on the simulation-accessible timescale, many reactions of geochemical interest continue to challenge existing computational strategies. Enhanced sampling methods increase the rate at which rare events occur in atomistic simulations by accelerating the exploration of the free energy landscape. Here, two such methods are applied to aspects of calcium carbonate nucleation and growth. Replica exchange molecular dynamics (REMD) is used to explore the initial formation of hydrated mineral clusters from solution (Wallace et al., in press, Science). Characterization of the thermodynamic and dynamic properties of the clusters suggests that a dense liquid phase of calcium carbonate forms under certain conditions. Additionally, it is demonstrated that coalescence of the dense liquid products of the liquid-liquid separation results in the formation of a solid phase whose structure is consistent with amorphous calcium carbonate. Results from forward flux sampling (FFS) simulations are also presented. The rate of solvent exchange about calcium ions in solution is rapid enough to be determined directly from a standard molecular dynamics simulation and is used in this instance to calibrate the FFS method. The calibrated procedure is then applied to obtain preliminary rates of ion attachment and detachment from calcite surfaces.
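    At the heart of replica exchange is a Metropolis swap criterion for configurations held at adjacent temperatures; a minimal sketch of that rule (not the study's simulation setup, whose energies and temperatures are unknown here) is:

```python
# Metropolis acceptance rule for exchanging configurations between two
# replicas at inverse temperatures beta_i and beta_j with current
# potential energies E_i and E_j: accept with probability
# min(1, exp(delta)), where delta = (beta_i - beta_j) * (E_i - E_j).
import math
import random

def swap_accepted(beta_i, beta_j, E_i, E_j, rng=random.random):
    """Return True if the replica swap is accepted."""
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0 or rng() < math.exp(delta)

# Deterministic examples: a favorable swap (delta >= 0) is always
# accepted; an unfavorable one is rejected when the random draw exceeds
# exp(delta).
assert swap_accepted(beta_i=2.0, beta_j=1.0, E_i=5.0, E_j=1.0)
assert not swap_accepted(beta_i=2.0, beta_j=1.0, E_i=1.0, E_j=5.0,
                         rng=lambda: 0.999)
```

    The accelerated exploration comes from hot replicas crossing barriers easily and handing low-energy configurations down to the cold replica through these swaps, while the acceptance rule preserves the correct Boltzmann distribution at each temperature.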

  20. Chemical properties and hydrothermal processes on the first two directly sampled deep-sea eruptions (Invited)

    NASA Astrophysics Data System (ADS)

    Butterfield, D. A.; Resing, J. A.; Roe, K. K.; Christensen, M.; Embley, R. W.; Lupton, J. E.; Chadwick, W.

    2009-12-01

    To understand the effects of deep-sea volcanic eruptions on oceanic chemistry, on the ecology of hydrothermal vent communities, on microbial communities in the sub-seafloor biosphere, and on the alteration of oceanic lithosphere requires direct observation and sampling of active eruption sites. Known mid-ocean ridge eruptions have so far been too brief to observe and sample, but a nearly continuous eruption at NW Rota-1 submarine volcano in the Mariana arc (2004-2009) and a potentially long-term eruption at West Mata volcano in the NE Lau Basin (detected Nov. 2008) have provided unprecedented access to magma degassing and rapid water-rock reaction processes that may typify active submarine arc volcanism. How closely this resembles the hydrothermal processes associated with mid-ocean ridge volcanism remains to be seen. NW Rota-1 has a significantly higher output of a free gas phase, but based on initial observations of fluid chemistry and venting types, NW Rota-1 and W Mata have much in common. Active hydrothermal venting was found within a depth horizon encompassing the top 100 meters of the summit peak on both volcanoes (520 m at Rota; 1200 m at Mata). The dominant particulate and chemical plumes originate at active volcanic vents. The hydrothermal chemistry of these volcanic vents is dominated by the condensation of magmatic sulfur dioxide gas, its dissolution into seawater, and subsequent acid attack on volcanic rock. Disproportionation of SO2 to elemental sulfur, H2S, and sulfuric acid occurs. Percolation of hot, acidic fluids through volcaniclastic deposits results in rapid uptake of iron, aluminum, and other metals into solution. Chemical compositions and models indicate that continued water/rock reaction, cooling, and sub-surface mixing with seawater result in rising pH and precipitation of sulfur, alunite, anhydrite, iron sulfides, and iron oxyhydroxides (in order of increasing pH and decreasing temperature). Venting fluids sampled directly out of the