Science.gov

Sample records for acquisition sample processing

  1. Coring Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can autonomously drill into rock and capture core samples. The tool is designed to transfer cores in a sample tube to the IMSAH (Integrated Mars Sample Acquisition and Handling) SHEC (Sample Handling, Encapsulation, and Containerization) element without ever touching the pristine core sample during the transfer process.

  2. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory is a highly mechanized, rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sieves the samples into fine particles, and delivers small portions of the powder to two science instruments inside the rover. SA/SPaH uses 17 actuated degrees of freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system, so that some functionality is retained even with the loss of a degree of freedom. Intentional dynamic environments are created to move sample, while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight-hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally created dynamic environment, and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low-pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.

  3. An Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    NASA Astrophysics Data System (ADS)

    Beegle, L. W.; Anderson, R. C.; Hurowitz, J. A.; Jandura, L.; Limonadi, D.

    2012-12-01

    The Mars Science Laboratory (MSL) mission landed on Mars on August 5, 2012. The rover and its scientific payload are designed to identify and assess the habitability and the geological and environmental histories of Gale crater. Unraveling the geologic history of the region and assessing present and past habitability require an evaluation of the physical and chemical characteristics of the landing site; this includes an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem is the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed, separated into fine particles, and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for placing the two contact instruments, the Alpha Particle X-ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, a Dust Removal Tool (DRT) removes dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g., the Mast Cameras (MastCam) and the Chemistry and Camera instrument (ChemCam)). It is expected that the SA/SPaH system will have produced a scooped sample, and possibly a drilled sample, in the first 90 sols of the mission. Results from these activities and the ongoing testing program will be presented.

  4. Collecting Samples in Gale Crater, Mars; an Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    NASA Astrophysics Data System (ADS)

    Anderson, R. C.; Jandura, L.; Okon, A. B.; Sunshine, D.; Roumeliotis, C.; Beegle, L. W.; Hurowitz, J.; Kennedy, B.; Limonadi, D.; McCloskey, S.; Robinson, M.; Seybold, C.; Brown, K.

    2012-09-01

    The Mars Science Laboratory (MSL) mission, scheduled to land on Mars in the summer of 2012, consists of a rover and a scientific payload designed to identify and assess the habitability and the geological and environmental histories of Gale crater. Unraveling the geologic history of the region and assessing present and past habitability require an evaluation of the physical and chemical characteristics of the landing site; this includes an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem will be the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed, separated into fine particles, and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for placing the two contact instruments, the Alpha Particle X-ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, a Dust Removal Tool (DRT) removes dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g., the Mast Cameras (MastCam) and the Chemistry and Camera instrument (ChemCam)).

  5. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method is developed for evaluating the probability of a viable Earth microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission. A scenario is analyzed in which multiple core samples would be acquired using a rotary percussive coring tool deployed from an arm on a MER-class rover. The analysis is structured by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and it makes the analysis tractable by breaking the process down into small analyzable steps.
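    The Markov-chain bookkeeping described in this abstract can be illustrated with a small numeric sketch. All component names, initial counts, release probabilities, and transport probabilities below are hypothetical, chosen only to show the mechanics; they are not the mission analysis values.

```python
import numpy as np

# Hypothetical components in the sample chain (illustrative only).
components = ["drill_bit", "core_tube", "rover_arm"]

# Expected number of viable Earth microorganisms (VEMs) on each
# component at the start of the sampling process (made-up values).
n = np.array([2.0, 0.1, 5.0])

# p_release[i]: probability that a VEM leaves component i during one
# discrete time step (made-up values).
p_release = np.array([0.05, 0.01, 0.10])

# T[i, j]: probability that a released VEM is transported from
# component i to component j; each row sums to less than 1, with the
# remainder lost to the environment.
T = np.array([
    [0.0, 0.3, 0.1],
    [0.2, 0.0, 0.1],
    [0.1, 0.2, 0.0],
])

def step(n, p_release, T):
    """Advance the expected VEM counts by one sampling step."""
    released = n * p_release   # expected VEMs leaving each component
    retained = n - released    # expected VEMs staying in place
    arrived = released @ T     # expected VEMs arriving from elsewhere
    return retained + arrived

# Propagate the expectation through three sampling steps.
for _ in range(3):
    n = step(n, p_release, T)
print(n)
```

    Because each transport row sums to less than 1, the total expected VEM count decreases monotonically as some organisms are lost to the environment at every step.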

  6. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality of the System after being on the Surface for Two Years.

    NASA Astrophysics Data System (ADS)

    Beegle, L. W.; Anderson, R. C.; Abbey, W. J.

    2014-12-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system. SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. The Curiosity rover performed several acquisition and processing activities on solid samples during its first year of operation. Materials were processed and delivered to the two analytical instruments, Chemistry and Mineralogy (CheMin) and Sample Analysis at Mars (SAM), both of which require a specific particle size in the delivered material to perform the analyses that determine its mineralogy and geochemistry. In this presentation, the functionality of the system will be explained, along with the in-situ targets the system has acquired and the samples that were delivered.

  7. MSL's Widgets: Adding Robustness to Martian Sample Acquisition, Handling, and Processing

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Kennedy, Brett; Lin, Justin; DeGrosse, Patrick; Cady, Ian; Onufer, Nicholas; Sigel, Deborah; Jandura, Louise; Anderson, Robert; Katz, Ira; Slimko, Eric; Limonadi, Daniel

    2013-01-01

    Mars Science Laboratory's (MSL) Sample Acquisition, Sample Processing and Handling (SA-SPaH) system is one of the most ambitious terrain interaction and manipulation systems ever built and successfully used beyond Earth. Mars has a ruthless environment that has surprised many who have tried to explore there. The robustness widget program was implemented by the MSL project to help ensure the SA-SPaH system would be robust enough to withstand the surprises of this ruthless Martian environment. The robustness widget program was carried out under extreme schedule pressure and responsibility, but was accomplished with resounding success. This paper focuses on a behind-the-scenes look at MSL's robustness widgets: the particle fun zone, the wind guards, and the portioner pokers.

  8. Rockballer Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Giersch, Louis R.; Cook, Brant T.

    2013-01-01

    It would be desirable to acquire rock and/or ice samples that extend below the surface of the parent rock or ice in extraterrestrial environments such as the Moon, Mars, comets, and asteroids. Such samples would allow measurements to be made further back into the geologic history of the rock, providing critical insight into the history of the local environment and the solar system. Such samples could also be necessary for sample return mission architectures that would acquire samples from extraterrestrial environments for return to Earth for more detailed scientific investigation.

  9. Resource Prospector Instrumentation for Lunar Volatiles Prospecting, Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Captain, J.; Elphic, R.; Colaprete, A.; Zacny, Kris; Paz, A.

    2016-01-01

    the traverse path. The NS will map the water-equivalent hydrogen concentration as low as 0.5% by weight to an 80-centimeter depth as the rover traverses the lunar landscape. The NIR spectrometer will measure surficial H2O/OH as well as general mineralogy. When the prospecting instruments identify a potential volatile-rich area during the course of a traverse, the prospect is mapped out and the most promising location identified. An augering drill capable of sampling to a depth of 100 centimeters will excavate regolith for analysis. A quick assay of the drill cuttings will be made using an operations camera and the NIR spectrometer. With the water depth confirmed by this first augering activity, a regolith sample may be extracted for processing. The drill will deliver the regolith sample to a crucible that will be sealed and heated. Evolved volatiles will be measured by a gas chromatograph-mass spectrometer, and the water will be captured and photographed. RP is a solar-powered mission, which, given the polar location, translates to a relatively short mission duration on the order of 4-15 days. This short mission duration drives the concept of operations, instrumentation, and data analysis towards critical real-time analysis and decision support. Previous payload field tests have increased the fidelity of the hardware, software, and mission operations. Current activities include a mission-level field test to optimize interfaces between the payload and rover, as well as to better understand the interaction of the science and rover teams during the mission timeline. This paper will include the current status of the science instruments on the payload as well as the integrated field test occurring in the fall of 2015. The concept of operations will be discussed, including the real-time science and engineering decision-making process based on critical data from the instrumentation. The path to flight will be discussed, with the approach to this ambitious low-cost mission.

  10. Sample Acquisition Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Carle, Glenn C.; Stratton, David M.; Valentin, Jose R.; DeVincenzi, Donald (Technical Monitor)

    1999-01-01

    Exobiology flight experiments involve complex analyses conducted in environments far different from those encountered in terrestrial applications. A major part of the analytical challenge is often the selection, acquisition, delivery and, in some cases, processing of a sample suitable for the analytical requirements of the mission. The added complications of severely limited resources and sometimes rigid time constraints combine to make sample acquisition potentially a major obstacle to successful analyses. Potential samples come in a wide range, including planetary atmospheric gas and aerosols (from a wide variety of pressures), planetary soil or rocks, dust and ice particles streaming off of a comet, and cometary surface ice and rocks. Methods to collect and process samples are often mission specific, requiring continual development of innovative concepts and mechanisms. These methods must also maintain the integrity of the sample for the experimental results to be meaningful. We present here sample acquisition systems employed on past missions and proposed for future missions.

  11. Rotary Percussive Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Klein, K.; Badescu, M.; Haddad, N.; Shiraishi, L.; Walkemeyer, P.

    2012-01-01

    As part of a potential Mars Sample Return campaign, NASA is studying a sample caching mission to Mars, with a possible 2018 launch opportunity. As such, a Sample Acquisition Tool (SAT) has been developed in support of the Integrated Mars Sample Acquisition and Handling (IMSAH) architecture as it relates to the proposed Mars Sample Return (MSR) campaign. The tool allows for core generation and capture directly into a sample tube. In doing so, the sample tube becomes the fundamental handling element within the IMSAH sample chain, reducing the risk associated with sample contamination as well as the need to handle a sample of unknown geometry. The tool's functionality was verified using a proposed rock test suite that encompasses a series of rock types that have been used in the past to qualify Martian surface sampling hardware. The corresponding results have shown the tool can effectively generate, fracture, and capture rock cores while maintaining torque margins of no less than 50%, with an average power consumption of no greater than 90 W and a tool mass of less than 6 kg.

  12. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight into overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.
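    The hierarchical-state-machine idea mentioned above can be sketched minimally: an event unhandled by a substate bubbles up to its parent, so shared behaviors (such as fault responses) are defined once. The states and events below are hypothetical illustrations, not the actual CHIMRA/IC/DRT flight software.

```python
# Minimal hierarchical state machine sketch: unhandled events bubble
# up from a substate to its parent state.
class State:
    def __init__(self, name, parent=None, handlers=None):
        self.name = name
        self.parent = parent
        self.handlers = handlers or {}

    def handle(self, event):
        """Return the target state for `event`, consulting ancestors."""
        state = self
        while state is not None:
            if event in state.handlers:
                return state.handlers[event]
            state = state.parent
        return None  # event ignored everywhere in the hierarchy

# Hypothetical motion-control states: every motion substate inherits
# the 'fault' response from the shared parent state.
idle = State("idle")
safing = State("safing")
moving = State("moving", handlers={"stop": idle})
moving.parent = State("motion", handlers={"fault": safing})

assert moving.handle("stop") is idle      # handled by the substate
assert moving.handle("fault") is safing   # bubbles to the parent
assert moving.handle("ping") is None      # unhandled anywhere
```

    Defining fault transitions on the parent is what lets such a design "handle various fault protections" without repeating them in every substate.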

  13. Sample acquisition and instrument deployment

    NASA Technical Reports Server (NTRS)

    Boyd, Robert C.

    1995-01-01

    Progress is reported in developing the Sample Acquisition and Instrument Deployment (SAID) system, a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four-degree-of-freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. The system has been fabricated and tested in environmental chambers, and soil testing and robotic-control testing have been performed.

  14. The alteration of icy samples during sample acquisition

    NASA Astrophysics Data System (ADS)

    Mungas, G.; Bearman, G.; Beegle, L. W.; Hecht, M.; Peters, G. H.; Glucoft, J.; Strothers, K.

    2006-12-01

    Valid in situ scientific studies require both that samples be analyzed in as pristine a condition as possible and that any modification from the pristine to the sampled state be well understood. While samples with low to high ice concentration are critical for the study of astrobiology and geology, they pose problems for the sample acquisition, preparation and distribution (SPAD) systems upon which the analytical instruments depend. Most significant of the processes that occur during SPAD is sublimation or melting caused by thermal loading from drilling, coring, etc., as well as exposure to a dry, low-pressure ambient environment. These processes can alter the sample, as well as generate metastable liquid water that can refreeze in the sample transfer mechanisms, interfering with proper operation and creating cross-contamination. We have investigated and quantified the loss of volatiles such as H2O, CO, CO2, and organics contained within icy and powdered samples when acquired, processed and transferred. During development of the MSL rock crusher, for example, ice was observed to pressure-fuse and stick to the side even at -70 °C. We have investigated sublimation during sample acquisition at Martian temperature and pressure for samples with water/dirt ratios ranging from 10 to 100. Using the RASP that will be on Phoenix, we have measured sublimation of ice during excavation at Martian pressure and find that sublimation losses can range from 10 to 50 percent of the water. It is the thermal conductivity of the soil that determines local heat transport: how much of the sample acquisition energy is wicked away into the soil and how much goes into the sample. Modeling of sample acquisition methods requires measurement of these parameters. There is a two-phase model for thermal conductivity as a function of dirt/ice ratio, but it needed to be validated. We used an ASTM method for measuring thermal conductivity and implemented it in the laboratory.

  15. Optimization of LC-Orbitrap-HRMS acquisition and MZmine 2 data processing for nontarget screening of environmental samples using design of experiments.

    PubMed

    Hu, Meng; Krauss, Martin; Brack, Werner; Schulze, Tobias

    2016-11-01

    Liquid chromatography-high resolution mass spectrometry (LC-HRMS) is a well-established technique for nontarget screening of contaminants in complex environmental samples. Automatic peak detection is essential, but its performance has only rarely been assessed and optimized so far. With the aim of filling this gap, we used pristine water extracts spiked with 78 contaminants as a test case to evaluate and optimize chromatogram and spectral data processing. To assess whether data acquisition strategies have a significant impact on peak detection, three values of MS cycle time (CT) of an LTQ Orbitrap instrument were tested. Furthermore, the key parameter settings of the data processing software MZmine 2 were optimized to detect the maximum number of target peaks from the samples by the design of experiments (DoE) approach, and compared to a manual evaluation. The results indicate that a short CT significantly improves the quality of automatic peak detection, which means that full-scan acquisition without additional MS2 experiments is suggested for nontarget screening. MZmine 2 detected 75-100% of the peaks compared to manual peak detection at an intensity level of 10^5 in a validation dataset on both spiked and real water samples under optimal parameter settings. Finally, we provide an optimization workflow of MZmine 2 for LC-HRMS data processing that is applicable to environmental samples for nontarget screening. The results also show that the DoE approach is useful and effort-saving for optimizing data-processing parameters.
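    A parameter-optimization sweep of the kind used above can be sketched generically with a full-factorial design, the simplest DoE layout. The parameter names, levels, and scoring function below are placeholders, not MZmine 2's actual settings; a real study would run the processing pipeline on spiked samples and count recovered target peaks.

```python
from itertools import product

# Hypothetical processing parameters and candidate levels.
levels = {
    "noise_threshold": [1e4, 5e4, 1e5],
    "min_peak_width": [0.05, 0.10, 0.20],  # minutes
}

def peaks_detected(noise_threshold, min_peak_width):
    """Placeholder objective: stands in for running the pipeline on
    spiked extracts and counting how many of the 78 targets are found."""
    return 78 - noise_threshold / 2e4 - 50 * min_peak_width

# Evaluate every combination of levels and keep the best-scoring one.
best = max(
    (dict(zip(levels, combo)) for combo in product(*levels.values())),
    key=lambda p: peaks_detected(**p),
)
print(best)
```

    Formal DoE methods reduce the number of runs below this full grid (e.g., fractional-factorial designs), which matters when each evaluation means reprocessing a whole dataset.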

  16. Processes Asunder: Acquisition & Planning Misfits

    DTIC Science & Technology

    2009-03-26

    Strategy Research Project: Processes Asunder: Acquisition & Planning Misfits, by Chérie A. Smith, Department of the Army Civilian, U.S. Army War College.

  17. Peruvian Weapon System Acquisition Process

    DTIC Science & Technology

    1990-12-01

    process for a major program. The United States DOD Directive 5000.1 defines four distinct phases of the acquisition process: concept exploration, demon... Unified or Specified Command. 1. Concept Exploration Phase: The first phase for a major system is the concept exploration phase. During this phase... exploration phase progresses. Premature introduction of operating and support details may have a negative effect by closing out promising alternatives [Ref

  18. Improving the Acquisition and Management of Sample Curation Data

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department, and discusses how it improves on existing data management systems.

  19. Mars sample return: Site selection and sample acquisition study

    NASA Technical Reports Server (NTRS)

    Nickle, N. (Editor)

    1980-01-01

    Various vehicle and mission options were investigated for the continued exploration of Mars; the cost of a minimum sample return mission was estimated; options and concepts were synthesized into program possibilities; and recommendations for the next Mars mission were made to the Planetary Program Office. Specific sites and all relevant spacecraft and ground-based data were studied in order to determine: (1) the adequacy of presently available data for identifying landing sites for a sample return mission that would assure the acquisition of material from the most important geologic provinces of Mars; (2) the degree of surface mobility required to assure sample acquisition at these sites; (3) techniques to be used in the selection and drilling of rock samples; and (4) the degree of mobility required at the two Viking sites to acquire these samples.

  20. Planetary protection issues for Mars sample acquisition flight projects.

    PubMed

    Barengoltz, J B

    2000-01-01

    The planned NASA sample acquisition flight missions to Mars pose several interesting planetary protection issues. In addition to the usual forward contamination procedures for the adequate protection of Mars for the sake of future missions, there are reasons to ensure that the sample is not contaminated by terrestrial microbes from the acquisition mission. Recent recommendations by the Space Studies Board (SSB) of the National Research Council (United States) indicate that the scientific integrity of the sample is a planetary protection concern (SSB, 1997). Also, as a practical matter, a contaminated sample would interfere with the process for its release from quarantine after return for distribution to the interested scientists. These matters are discussed in terms of the first planned acquisition mission.

  1. Digital data acquisition and processing.

    PubMed

    Naivar, Mark A; Galbraith, David W

    2015-01-05

    A flow cytometer is made up of many different subsystems that work together to measure the optical properties of individual cells within a sample. The data acquisition system (also called the data system) is one of these subsystems, and it is responsible for converting the electrical signals from the optical detectors into list-mode data. This unit describes the inner workings of the data system, and provides insight into how the instrument functions as a whole. Some of the information provided in this unit is applicable to everyday use of these instruments, and, at minimum, should make it easier for the reader to assemble a specific data system. With the considerable advancement of electronics technology, it becomes possible to build an entirely functional data system using inexpensive hobbyist-level electronics. This unit covers both analog and digital data systems, but the primary focus is on the more prevalent digital data systems of modern flow cytometric instrumentation.

  2. Sample Acquisition and Instrument Deployment (SAID)

    NASA Technical Reports Server (NTRS)

    Boyd, Robert C.

    1994-01-01

    This report details the interim progress for contract NASW-4818, Sample Acquisition and Instrument Deployment (SAID), a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four-degree-of-freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. A passively braked shape-memory actuator with the ability to measure load has been developed. The wrist also contains a mechanism that locks the lid output to the bucket so that objects can be grasped and released for instrument deployment. The wrist actuator has been tested for operational power and mechanical functionality at Mars environmental conditions. The torque which the actuator can produce has been measured. Also, testing in Mars-analogous soils has been performed.

  3. Autonomous Surface Sample Acquisition for Planetary and Lunar Exploration

    NASA Astrophysics Data System (ADS)

    Barnes, D. P.

    2007-08-01

    Surface science sample acquisition is a critical activity within any planetary and lunar exploration mission, and our research is focused upon the design, implementation, experimentation and demonstration of an onboard autonomous surface sample acquisition capability for a rover equipped with a robotic arm upon which are mounted appropriate science instruments. Images captured by a rover stereo camera system can be processed using shape-from-stereo methods and a digital elevation model (DEM) generated. We have developed a terrain feature identification algorithm that can determine autonomously, from DEM data, suitable regions for instrument placement and/or surface sample acquisition. Once identified, surface normal data can be generated autonomously, which are then used to calculate an arm trajectory for instrument placement and sample acquisition. Once an instrument placement and sample acquisition trajectory has been calculated, a collision detection algorithm is required to ensure the safe operation of the arm during sample acquisition. We have developed a novel adaptive 'bounding spheres' approach to this problem. Once potential science targets have been identified, and these are within the reach of the arm and will not cause any undesired collision, the 'cost' of executing the sample acquisition activity is required. Such information, which includes power expenditure and duration, can be used to select the 'best' target from a set of potential targets. We have developed a science sample acquisition resource requirements calculation that utilises differential inverse kinematics methods to yield a high-fidelity result, thus improving upon simple 1st-order approximations. To test our algorithms, a new Planetary Analogue Terrain (PAT) Laboratory has been created that has a terrain region composed of Mars Soil Simulant-D from DLR Germany, and rocks that have been fully characterised in the laboratory. These have been donated by the UK Planetary Analogue Field Study
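    The core test behind a bounding-spheres collision check can be sketched as follows. This is a generic illustration of the technique, not the authors' implementation; the sphere centres and radii are made up.

```python
import math

def spheres_collide(c1, r1, c2, r2):
    """Two spheres intersect iff the distance between their centres
    is less than the sum of their radii."""
    return math.dist(c1, c2) < r1 + r2

# Hypothetical bounding spheres enclosing the arm links (centre, radius).
arm = [((0.0, 0.0, 0.5), 0.15), ((0.3, 0.0, 0.6), 0.12)]
# Hypothetical obstacle spheres around the rover body and terrain rocks.
obstacles = [((0.32, 0.05, 0.55), 0.10), ((1.0, 1.0, 0.0), 0.3)]

def arm_is_safe(arm, obstacles):
    """A trajectory step is safe only if no arm sphere intersects
    any obstacle sphere."""
    return not any(
        spheres_collide(c1, r1, c2, r2)
        for (c1, r1) in arm
        for (c2, r2) in obstacles
    )

print(arm_is_safe(arm, obstacles))  # → False: the second arm sphere hits an obstacle
```

    The appeal of the sphere-sphere test is its cost: a single distance comparison per pair, cheap enough to run at every step of a candidate arm trajectory; an adaptive scheme can then refine sphere sizes only where pairs come close.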

  4. Candidate Sample Acquisition System for the Rosetta Mission

    NASA Astrophysics Data System (ADS)

    Magnani, P. G.; Gerli, C.; Colombina, G.; Vielmo, P.

    1997-12-01

    The Comet Nucleus Sample Return (CNSR) mission is a cornerstone of ESA's scientific program. While Giotto and Vega 1 and 2 provided the first pictures of a comet nucleus, a much improved understanding of the nucleus and the processes on it will result from in-situ measurements and Earth-based analysis of material samples collected on the nucleus surface. The CNSR baseline mission foresees the landing and anchoring of a spacecraft on the comet nucleus surface, and the collection of the following three types of samples by means of a dedicated "Sample Acquisition System" (SAS): (1) a core sample gathered from the surface down to a maximum depth of 3 meters, to be cut into 0.5-m-long sections for storage; (2) a volatile material sample, to be gathered at the bottom of the core sample hole; and (3) a surface material sample, gathered from one or more locations on the surface. These samples will have to be placed in a storage canister in the capsule (to be returned to Earth) and preserved therein at a temperature not higher than 160 K. If on-board sensing instrumentation identifies comet nucleus features that do not allow a safe landing, a back-up system based on a "harpoon" sampler would be launched from the spacecraft hovering over the comet and recovered via a tether line; degraded sample quality would be accepted in this case (no surface or volatile samples, and limited core sampling depth).

  5. Auditory Processing Disorder and Foreign Language Acquisition

    ERIC Educational Resources Information Center

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  6. Generalized analog thresholding for spike acquisition at ultralow sampling rates

    PubMed Central

    He, Bryan D.; Wein, Alex; Varshney, Lav R.; Kusuma, Julius; Richardson, Andrew G.

    2015-01-01

    Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEA) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712
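As a toy illustration of the baseline analog-thresholding (AT) idea described above (not the generalized gAT method, whose closed-form reconstruction is derived in the paper), a comparator output subsampled at a low rate still localizes each spike to one low-rate sample period. The signal shape, rates, and threshold below are illustrative assumptions:

```python
import numpy as np

def at_spike_times(signal, fs_in, fs_out, threshold):
    """Plain analog thresholding (AT), simulated digitally.

    The comparator output (signal > threshold) is subsampled at fs_out,
    so each detected spike is localized to one fs_out sample period.
    """
    comparator = signal > threshold
    step = int(fs_in // fs_out)
    idx = np.flatnonzero(comparator[::step])
    return idx / fs_out  # spike times in seconds, quantized to 1/fs_out

# One second of a 30 kHz recording with two 1-ms-wide spikes
fs_in, fs_out = 30_000, 1_000
sig = np.zeros(fs_in)
for start in (6_000, 21_000):      # spikes at t = 0.2 s and t = 0.7 s
    sig[start:start + 30] = 1.0    # 30 samples = 1 ms at 30 kHz
times = at_spike_times(sig, fs_in, fs_out, threshold=0.5)
```

Note how the 30 kHz waveform never needs to be digitized: only the binary comparator output is sampled, which is what makes the hardware cost so low.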

  7. Processes Involved in Acquisition of Cognitive Skills.

    ERIC Educational Resources Information Center

    Christensen, Carol A.; Bain, John

    Processes involved in the acquisition of cognitive skills were studied through an investigation of the efficacy of initially encoding knowledge of a cognitive skill in either declarative or procedural form. Subjects were 80 university students. The cognitive skill, learning the steps to program a simulated video cassette recorder (VCR), was taught…

  8. Rotary Percussive Sample Acquisition Tool (SAT): Hardware Development and Testing

    NASA Technical Reports Server (NTRS)

    Klein, Kerry; Badescu, Mircea; Haddad, Nicolas; Shiraishi, Lori; Walkemeyer, Phillip

    2012-01-01

    In support of a potential Mars Sample Return (MSR) mission, an Integrated Mars Sample Acquisition and Handling (IMSAH) architecture has been proposed to provide a means for rover-based end-to-end sample capture and caching. A key enabling feature of the architecture is the use of a low-mass Sample Acquisition Tool (SAT) that is capable of drilling and capturing rock cores directly within a sample tube in order to maintain sample integrity and prevent contamination across the sample chain. This paper describes the development and testing of a low-mass rotary percussive SAT that has been shown to provide a means for core generation, fracture, and capture.

  9. Information Acquisition & Processing in Scanning Probe Microscopy

    SciTech Connect

    Kalinin, Sergei V; Jesse, Stephen; Proksch, Roger

    2008-01-01

    Much of the imaging and spectroscopy capabilities of the existing 20,000+ scanning probe microscopes worldwide rely on specialized data processing that links the microsecond (and sometimes faster) time scale of cantilever motion to the millisecond (and sometimes slower) time scale of image acquisition and feedback. In most SPMs, the cantilever is excited to oscillate sinusoidally and the time-averaged amplitude and/or phase are used as imaging or control signals. Traditionally, the step of converting the rapid motion of the cantilever into an amplitude or phase is performed by phase-sensitive homodyne or phase-locked loop detection. The emergence of fast configurable data processing electronics in the last several years has allowed the development of non-sinusoidal data acquisition and processing methods. Here, we briefly review the principles and limitations of phase-sensitive detectors and discuss some of the emergent technologies based on rapid spectroscopic measurements in the frequency and time domains.

  10. Acquisition of data by whole sample enrichment, real-time polymerase chain reaction for development of a process risk model for Salmonella and chicken parts

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Process risk models predict consumer exposure and response to pathogens in food produced by specific scenarios. A process risk model for Salmonella and chicken parts was developed that consisted of four unit operations (pathogen events): 1) meal preparation (contamination); 2) cooking (death); 3) s...

  11. Acquisition and Retaining Granular Samples via a Rotating Coring Bit

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Badescu, Mircea; Sherrit, Stewart

    2013-01-01

    This device takes advantage of the centrifugal forces generated when a coring bit is rotated: granular sample introduced into the bit while it is spinning adheres to the internal wall of the bit, where it compacts itself against the wall. The bit can be specially designed to increase the effectiveness of regolith capture while turning and penetrating the subsurface. The bit teeth can be oriented such that they direct the regolith toward the bit axis during the rotation of the bit. The bit can be designed with an internal flute that directs the regolith upward inside the bit; the teeth and flute can be implemented in the same bit. The bit can also be designed with an internal spiral into which the various particles wedge. In another implementation, the bit can be designed to collect regolith primarily from a specific depth. For that implementation, the bit can be designed such that when turning one way, the teeth guide the regolith outward from the bit, and when turning in the opposite direction, the teeth guide the regolith inward into the bit's interior. This mechanism can be implemented with or without an internal flute. The device is based on the use of a spinning coring bit (hollow interior) as a means of retaining a granular sample; acquisition is done by inserting the bit into the subsurface of a regolith, soil, or powder. To demonstrate the concept, a commercial drill and a coring bit were used. The bit was turned and inserted into soil contained in a bucket. While spinning the bit (at speeds of 600 to 700 RPM), the drill was lifted and the soil was retained inside the bit. To confirm retention, the drill was turned horizontally, and the acquired soil remained inside the bit.
The basic theory behind the process of retaining unconsolidated mass that can be acquired by the centrifugal forces of the bit is determined by noting that in order to stay inside the interior of the bit, the
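The truncated retention condition above presumably reduces to a friction balance. As an illustrative sketch (the particle mass m, effective wall radius r, and friction coefficient μ are assumed quantities, not values from the source), a grain pressed against the spinning wall stays inside the bit when wall friction supports its weight:

```latex
N = m\,\omega^{2} r \quad \text{(normal force from rotation at angular rate } \omega\text{)}
\mu N \ge m g
\;\Longrightarrow\;
\omega \ge \sqrt{\frac{g}{\mu r}}
```

At the reported 600 to 700 RPM (ω ≈ 63–73 rad/s) and an assumed wall radius of 1 cm, ω²r is roughly 40–53 m/s², i.e. about 4–5 g, so a friction coefficient above roughly 0.2 would suffice under this simple model.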

  12. Major system acquisitions process (A-109)

    NASA Technical Reports Server (NTRS)

    Saric, C.

    1991-01-01

    The major system examined is a combination of elements (hardware, software, facilities, and services) that function together to produce the capabilities required to fulfill a mission need. The system acquisition process is a sequence of activities beginning with documentation of a mission need and ending with introduction of the major system into operational use or other successful achievement of program objectives. It is concluded that the A-109 process makes sense and provides a systematic, integrated management approach, along with appropriate management-level involvement and innovative 'best ideas' from the private sector, in satisfying mission needs.

  13. IWTU Process Sample Analysis Report

    SciTech Connect

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June–August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process; none of these samples were radioactive. The samples were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, during nonradioactive startup of the IWTU.

  14. 28. Perimeter acquisition radar building room #302, signal process and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. Perimeter acquisition radar building room #302, signal process and analog receiver room - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  15. Subsurface Samples: Collection and Processing

    SciTech Connect

    Long, Philip E.; Griffin, W. Timothy; Phelps, Tommy J.

    2002-12-01

    Microbiological data, interpretation, and conclusions from subsurface samples ultimately depend on the quality and representative character of the samples. Subsurface samples for environmental microbiology ideally contain only the microbial community and geochemical properties that are representative of the subsurface environment from which the sample was taken. To that end, sample contamination by exogenous microorganisms or chemical constituents must be eliminated or minimized, and sample analyses need to begin before changes in the microbial community or geochemical characteristics occur. This article presents sampling methods and sample processing techniques for collecting representative samples from a range of subsurface environments. Factors that should be considered when developing a subsurface sampling program are discussed, including potential benefits, costs, and limitations enabling researchers to evaluate the techniques that are presented and match them to their project requirements. Methods and protocols to address coring, sampling, processing and quality assessment issues are presented.

  16. Active sample acquisition system for micro-penetrators

    NASA Astrophysics Data System (ADS)

    Voorhees, Chris; Potsaid, Benjamin

    1998-05-01

    This paper summarizes the design and development of a sub-surface sample acquisition system for use in micro-penetrators. The system was developed for flight use under NASA's New Millennium Program, Deep Space 2 project. The system goal is to acquire approximately 100 mg of Martian sub-surface soil and return it to the inside of the micro-penetrator for analysis to determine the presence of water. Various passive and active sampling techniques that were tested during the development cycle are described. After significant testing, a side bore drill mechanism was chosen for development for use in the flight penetrators. The design, development, and testing of the mechanism elements are outlined, with particular emphasis placed on actuator development, drill stem design, impact testing, and mechanism testing in various soil types. The other system elements, a pyrotechnically actuated door mechanism to seal the sample and an impact restraint mechanism, are also described.

  17. Cryogenic Liquid Sample Acquisition System for Remote Space Applications

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul; Trainer, Melissa; Wegel, Don; Hawk, Douglas; Melek, Tony; Johnson, Christopher; Amato, Michael; Galloway, John

    2013-01-01

    There is a need to autonomously acquire cryogenic hydrocarbon liquid samples from remote planetary locations, such as the lakes of Titan, for instruments such as mass spectrometers. Several problems had to be solved relative to collecting the right amount of cryogenic liquid sample into a warmer spacecraft, such as not allowing the sample to boil off or fractionate too early; controlling the intermediate and final pressures within carefully designed volumes; designing for various particulates and viscosities; designing to thermal-, mass-, and power-limited spacecraft interfaces; and reducing risk. Prior art inlets for similar instruments in spaceflight were designed primarily for atmospheric gas sampling and are not useful for this front-end application. These cryogenic liquid sample acquisition system designs for remote space applications allow for remote, autonomous, controlled sample collection of a range of challenging cryogenic sample types. The design can control the size of the sample, prevent fractionation, control pressures at various stages, and accommodate various liquid sample levels. It is capable of collecting repeated samples autonomously in the difficult low-temperature conditions often found in planetary missions, and of collecting samples of difficult types, such as cryogenic hydrocarbon (methane, ethane, and propane) mixtures with solid particulates as found on Titan. The design, with a warm actuated valve, is compatible with various spacecraft thermal and structural interfaces. The design uses controlled volumes, heaters, inlet and vent tubes, a cryogenic valve seat, inlet screens, temperature and cryogenic liquid sensors, seals, and vents to accomplish its task.

  18. Acquisition by Processing Theory: A Theory of Everything?

    ERIC Educational Resources Information Center

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  19. Crosscutting Development- EVA Tools and Geology Sample Acquisition

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.

  20. Autonomous sample selection and acquisition for planetary exploration

    NASA Astrophysics Data System (ADS)

    Pugh, S.; Barnes, D.

    2007-08-01

    Thanks to rapid advances in planetary robotics and scientific instruments, data can now be gathered on the surface of Mars far more quickly than it can be relayed to Earth. Pauses in activity have to be introduced so that this additional data can be transmitted back to Earth. These pauses represent an inefficiency in the overall mission value, as value is calculated by dividing the amount of useful scientific data returned by the cost. Scientific data with little value wastes valuable transmission time and thus reduces the overall cost efficiency of the mission. Pauses in activity also have to be introduced during periods when communication with Earth is impossible. A great deal of research is currently being undertaken to limit this time wastage through an autonomous sample selection and acquisition system. Such a system could initially select high-value science targets and then continue its exploration while only useful data is relayed to Earth for further study, thus increasing mission value. This paper presents a review of the research currently being undertaken to aid in producing an autonomous sample selection and acquisition system for planetary exploration.

  1. A Risk Management Model for the Federal Acquisition Process.

    DTIC Science & Technology

    1999-06-01

    risk management in the acquisition process. This research explains the Federal Acquisition Process and each of the 78 tasks to be completed by the CO...and examines the concepts of risk and risk management. This research culminates in the development of a model that identifies prevalent risks in the...contracting professionals is used to gather opinions, ideas, and practical applications of risk management in the acquisition process, and refine the model.

  2. Collecting cometary soil samples? Development of the ROSETTA sample acquisition system

    NASA Technical Reports Server (NTRS)

    Coste, P. A.; Fenzi, M.; Eiden, Michael

    1993-01-01

    In the reference scenario of the ROSETTA CNSR mission, the Sample Acquisition System is mounted on the Comet Lander. Its tasks are to acquire three kinds of cometary samples and to transfer them to the Earth Return Capsule. Operations are to be performed in vacuum and microgravity, on a probably rough and dusty surface, in a largely unknown material, at temperatures on the order of 100 K. The concept and operation of the Sample Acquisition System are presented. The design of the prototype corer and surface sampling tool, and of the equipment for testing them at cryogenic temperatures in ambient conditions and in vacuum in various materials representing cometary soil, are described. Results of recent preliminary tests performed in a low-temperature thermal vacuum in a cometary analog ice-dust mixture are provided.

  3. Acquisition by Processing: A Modular Perspective on Language Development

    ERIC Educational Resources Information Center

    Truscott, John; Smith, Mike Sharwood

    2004-01-01

    The paper offers a model of language development, first and second, within a processing perspective. We first sketch a modular view of language, in which competence is embodied in the processing mechanisms. We then propose a novel approach to language acquisition (Acquisition by Processing Theory, or APT), in which development of the module occurs…

  4. Feasibility Study of Commercial Markets for New Sample Acquisition Devices

    NASA Technical Reports Server (NTRS)

    Brady, Collin; Coyne, Jim; Bilen, Sven G.; Kisenwether, Liz; Miller, Garry; Mueller, Robert P.; Zacny, Kris

    2010-01-01

    The NASA Exploration Systems Mission Directorate (ESMD) and Penn State technology commercialization project was designed to assist in the maturation of a NASA SBIR Phase III technology. The project was funded by NASA's ESMD Education group with oversight from the Surface Systems Office at NASA Kennedy Space Center in the Engineering Directorate. Two Penn State engineering student interns managed the project with support from Honeybee Robotics and NASA Kennedy Space Center. The objective was to find an opportunity to integrate SBIR-developed Regolith Extractor and Sampling Technology as the payload for the future Lunar Lander or Rover missions. The team was able to identify two potential Google Lunar X Prize organizations with considerable interest in utilizing regolith acquisition and transfer technology.

  5. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  6. Dynamic Acquisition and Retrieval Tool (DART) for Comet Sample Return : Session: 2.06.Robotic Mobility and Sample Acquisition Systems

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Bonitz, Robert; Kulczycki, Erick; Aisen, Norman; Dandino, Charles M.; Cantrell, Brett S.; Gallagher, William; Shevin, Jesse; Ganino, Anthony; Haddad, Nicolas; Walkemeyer, Phillip; Backes, Paul; Shiraishi, Lori

    2013-01-01

    The 2011 Decadal Survey for planetary science released by the National Research Council of the National Academies identified Comet Surface Sample Return (CSSR) as one of five high-priority potential New Frontiers-class missions in the next decade. The main objectives of the research described in this publication are to: develop a concept for an end-to-end system for collecting and storing a comet sample to be returned to Earth; design, fabricate, and test a prototype Dynamic Acquisition and Retrieval Tool (DART) capable of collecting a 500 cc sample in a canister and ejecting the canister at a predetermined speed; and identify a set of simulants with physical properties at room temperature that suitably match the physical properties of the comet surface as it would be sampled. We propose the use of a dart that would be launched from the spacecraft to impact and penetrate the comet surface. After collecting the sample, the sample canister would be ejected at a speed greater than the comet's escape velocity and captured by the spacecraft, packaged into a return capsule, and returned to Earth. The dart would be composed of an inner tube or sample canister, an outer tube, a decelerator, a means of capturing and retaining the sample, and a mechanism to eject the canister with the sample for later rendezvous with the spacecraft. One of the significant unknowns is the physical properties of the comet surface. Based on new findings from the recent Deep Impact comet encounter mission, we have limited our search for sampling solutions to materials with 10 to 100 kPa shear strength in loose or consolidated form. As the possible range of comet surface temperatures also differs significantly from room temperature, and testing at conditions other than room temperature can become resource-intensive, we sought sample simulants with physical properties at room temperature similar to the expected physical properties of the comet surface material. The chosen
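The canister-ejection requirement above is modest because comet escape velocities are tiny. As an illustrative calculation (the nucleus mass and radius below are assumed round numbers, not mission values):

```latex
v_{\mathrm{esc}} = \sqrt{\frac{2GM}{R}}
\approx \sqrt{\frac{2 \times (6.67\times 10^{-11}\ \mathrm{m^{3}\,kg^{-1}\,s^{-2}}) \times (10^{13}\ \mathrm{kg})}{2\times 10^{3}\ \mathrm{m}}}
\approx 0.8\ \mathrm{m\,s^{-1}}
```

So for a nucleus of order 10^13 kg and 2 km radius, any ejection speed comfortably above about 1 m/s would clear the body.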

  7. Self-adaptive sampling rate data acquisition in JET's correlation reflectometer

    SciTech Connect

    Arcas, G. de; Lopez, J. M.; Ruiz, M.; Barrera, E.; Vega, J.; Fonseca, A.

    2008-10-15

    Data acquisition systems with self-adaptive sampling rate capabilities have been proposed as a solution to reduce the sheer amount of data collected in every discharge of present fusion devices. This paper discusses the design of such a system for use in the KG8B correlation reflectometer at JET. The system, which is based on the ITMS platform, continuously adapts the sample rate during the acquisition depending on the signal bandwidth. Data are acquired continuously at the expected maximum sample rate and transferred to a memory buffer in the host processor; thereafter the rest of the process is based on software. Data are read from the memory buffer in blocks, and an intelligent decimation algorithm is applied to each block. The decimation algorithm determines the signal bandwidth of each block in order to choose the optimum sample rate for that block, and from there the decimation factor to be used. Memory buffers are used to adapt the throughput of the three main software modules (data acquisition, processing, and storage) following a typical producer-consumer architecture. The system optimizes the amount of data collected while maintaining the same information. Design issues are discussed and results of performance evaluation are presented.
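The bandwidth-driven, per-block decimation described above can be sketched as follows. This is a hedged illustration, not the JET/ITMS implementation: the 99%-energy bandwidth criterion, block length, and factor cap are assumptions.

```python
import numpy as np
from scipy.signal import decimate

def adaptive_decimate(block, fs, energy_frac=0.99, max_factor=10):
    """Choose a decimation factor from one block's estimated bandwidth.

    The bandwidth is taken as the frequency below which `energy_frac`
    of the block's spectral energy lies; the block is then decimated by
    the largest integer factor whose reduced Nyquist rate still covers
    that bandwidth.
    """
    power = np.abs(np.fft.rfft(block)) ** 2
    freqs = np.fft.rfftfreq(len(block), d=1.0 / fs)
    cumulative = np.cumsum(power) / np.sum(power)
    bandwidth = freqs[np.searchsorted(cumulative, energy_frac)]
    factor = 1
    while factor < max_factor and fs / (2 * (factor + 1)) > bandwidth:
        factor += 1
    return (decimate(block, factor) if factor > 1 else block), factor

# A 50 Hz tone sampled at 10 kHz is heavily oversampled:
fs = 10_000
t = np.arange(4096) / fs
reduced, factor = adaptive_decimate(np.sin(2 * np.pi * 50 * t), fs)
```

With real reflectometry data the factor would vary block to block; storing the chosen factor alongside each block preserves the time base for later reconstruction.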

  8. Electro-Optic Data Acquisition and Processing.

    DTIC Science & Technology

    Methods for the analysis of electro-optic relaxation data are discussed. Emphasis is on numerical methods using high-speed computers. A data acquisition system using a minicomputer for data manipulation is described. The relationship of the results obtained here to other possible uses is given. (Author)

  9. Dual Learning Processes in Interactive Skill Acquisition

    ERIC Educational Resources Information Center

    Fu, Wai-Tat; Anderson, John R.

    2008-01-01

    Acquisition of interactive skills involves the use of internal and external cues. Experiment 1 showed that when actions were interdependent, learning was effective with and without external cues in the single-task condition but was effective only with the presence of external cues in the dual-task condition. In the dual-task condition, actions…

  10. Implicit and Explicit Cognitive Processes in Incidental Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Ender, Andrea

    2016-01-01

    Studies on vocabulary acquisition in second language learning have revealed that a large amount of vocabulary is learned without an overt intention, in other words, incidentally. This article investigates the relevance of different lexical processing strategies for vocabulary acquisition when reading a text for comprehension among 24 advanced…

  11. System of acquisition and processing of images of dynamic speckle

    NASA Astrophysics Data System (ADS)

    Vega, F.; Torres, C.

    2015-01-01

    In this paper we show the design and implementation of a system for the capture and analysis of dynamic speckle. The device consists of a USB camera, an isolated lighting system for imaging, a 633 nm, 10 mW laser pointer as the coherent light source, a diffuser, and a laptop for video processing. The equipment enables the acquisition and storage of video and also computes different statistical descriptors (global activity accumulation vector, activity accumulation matrix, cross-correlation vector, autocorrelation coefficient, Fujii matrix, etc.). The equipment is designed so that it can be taken directly to the site of the biological sample under study, and it is currently being used in research projects within the group.
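One of the listed descriptors, the Fujii matrix, can be computed from a frame stack roughly as follows. This is a minimal sketch using the standard Fujii weighting, not necessarily the exact variant implemented in this equipment:

```python
import numpy as np

def fujii_matrix(frames, eps=1e-12):
    """Fujii activity map of a dynamic-speckle image sequence.

    frames: array of shape (n_frames, height, width) of intensities.
    Accumulates the absolute difference of consecutive frames weighted
    by their local sum; bright output pixels mark high speckle activity.
    """
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(frames[1:] - frames[:-1])
    sums = frames[1:] + frames[:-1] + eps  # eps guards against 0/0
    return (diffs / sums).sum(axis=0)

# Synthetic check: a fluctuating region scores higher than a static one
rng = np.random.default_rng(1)
stack = np.ones((20, 8, 8))
stack[:, :, 4:] += rng.uniform(0, 1, size=(20, 8, 4))  # right half active
fm = fujii_matrix(stack)
```

The sum-weighting makes the map insensitive to overall illumination level, which matters for a portable device used in the field.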

  12. FY 79 Software Acquisition Process Model Task. Revision 1

    DTIC Science & Technology

    1980-07-01

    This final report on the FY 79 Project 5220 Software Acquisition Process Model Task (522F) presents the approach taken to process model definition...plan for their incorporation and application in successive process model versions. The report contains diagrams that represent the Full-Scale

  13. Input and Input Processing in Second Language Acquisition.

    ERIC Educational Resources Information Center

    Alcon, Eva

    1998-01-01

    Analyzes second-language learners' processing of linguistic data within the target language, focusing on input and intake in second-language acquisition and factors and cognitive processes that affect input processing. Input factors include input simplification, input enhancement, and interactional modifications. Individual learner differences…

  14. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

    An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  15. Sampling uncertainty evaluation for data acquisition board based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ge, Leyi; Wang, Zhongyu

    2008-10-01

    Evaluating the sampling uncertainty of a data acquisition board is a difficult problem in the field of signal sampling. This paper first analyzes the sources of data acquisition board sampling uncertainty, then introduces a simulation theory for evaluating data acquisition board sampling uncertainty based on the Monte Carlo method, and puts forward a model relating sampling uncertainty results, sample numbers, and simulation times. For different sample numbers and different signal scopes, the authors establish a random sampling uncertainty evaluation program for a PCI-6024E data acquisition board to execute the simulation. The results of the proposed Monte Carlo simulation method are in good agreement with the GUM ones, demonstrating the validity of the Monte Carlo method.
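A generic version of such a Monte Carlo evaluation can be sketched as follows. The 12-bit resolution, 10 V full scale, and quantization-plus-Gaussian-noise error model are illustrative assumptions, not the paper's model of the PCI-6024E:

```python
import numpy as np

def mc_sampling_uncertainty(n_samples=1000, n_trials=5000,
                            full_scale=10.0, bits=12,
                            noise_sd=1e-3, seed=0):
    """Monte Carlo estimate of the standard uncertainty of a sampled mean.

    Each trial draws a random DC level within the board's range, adds
    Gaussian noise, quantizes to the board's LSB, and records the error
    of the sample mean; the standard deviation of those errors over all
    trials is reported as the sampling uncertainty.
    """
    rng = np.random.default_rng(seed)
    lsb = full_scale / 2 ** bits
    errors = np.empty(n_trials)
    for i in range(n_trials):
        true_value = rng.uniform(-full_scale / 2, full_scale / 2)
        noisy = true_value + rng.normal(0.0, noise_sd, n_samples)
        quantized = np.round(noisy / lsb) * lsb
        errors[i] = quantized.mean() - true_value
    return np.std(errors)

u = mc_sampling_uncertainty()  # standard uncertainty in volts
```

Repeating this over a grid of `n_samples` and `n_trials` values yields the kind of relation between sample number, simulation count, and uncertainty that the paper models.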

  16. Finding Discipline in an Agile Acquisition Process

    DTIC Science & Technology

    2011-05-18

    of technology to deployment. Documentation of processes with compliance audits, ensuring that processes are followed. Financial performance...deployment. Deltas (use case deferrals, shortfalls, test deficiencies) are in the domain-relevant language of end users and decision makers, which avoids...Bottom line: when we speak of discipline, we are advocating the creation of a more disciplined mechanism (structures + processes) to describe user

  17. Sample Acquisition and Caching architecture for the Mars Sample Return mission

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Chu, P.; Cohen, J.; Paulsen, G.; Craft, J.; Szwarc, T.

    This paper presents a Mars Sample Return (MSR) Sample Acquisition and Caching (SAC) study developed for three rover platforms: MER, MER+, and MSL. The study took into account 26 SAC requirements provided by the NASA Mars Exploration Program Office. For this SAC architecture, we gave the reduction of mission risk greater priority than mass or volume. For this reason, we selected a “One Bit per Core” approach. The enabling technology for this architecture is Honeybee Robotics' “eccentric tubes” core breakoff approach. The breakoff approach allows the drill bits to be relatively small in diameter and in turn lightweight. Hence, the bits could be returned to Earth with the cores inside them with only a modest increase in the total returned mass, but a significant decrease in complexity. Having dedicated bits allows a reduction in the number of core transfer steps and actuators. It also alleviates the bit-life problem, eliminates cross contamination, and aids in hermetic sealing. Added advantages are faster drilling time, lower power, lower energy, and lower Weight on Bit (which reduces arm preload requirements). Drill bits are based on the BigTooth bit concept, which allows reuse of the same bit multiple times, if necessary. The proposed SAC consists of a 1) Rotary-Percussive Core Drill, 2) Bit Storage Carousel, 3) Cache, 4) Robotic Arm, and 5) Rock Abrasion and Brushing Bit (RABBit), which is deployed using the Drill. The system also includes PreView bits (for viewing of cores prior to caching) and Powder bits for acquisition of regolith or cuttings. The SAC total system mass is less than 22 kg for MER- and MER+-size rovers and less than 32 kg for the MSL-size rover.

  18. ACQUISITION OF REPRESENTATIVE GROUND WATER QUALITY SAMPLES FOR METALS

    EPA Science Inventory

    R.S. Kerr Environmental Research Laboratory (RSKERL) personnel have evaluated sampling procedures for the collection of representative, accurate, and reproducible ground water quality samples for metals for the past four years. Intensive sampling research at three different field...

  19. Compressive Video Acquisition, Fusion and Processing

    DTIC Science & Technology

    2010-12-14

    different views of the independent motions of 2 toy koalas along individual 1-D paths, yielding a 2-D combined parameter space. This data suffers...from real-world artifacts such as fluctuations in illumination conditions and variations in the pose of the koalas; further, the koalas occlude one... [Figure 38: raw images and random projections from Cameras 1-4 mapped to a joint manifold. (top) Sample images of 2 koalas moving along]

  20. Auditory Processing Disorders: Acquisition and Treatment

    ERIC Educational Resources Information Center

    Moore, David R.

    2007-01-01

    Auditory processing disorder (APD) describes a mixed and poorly understood listening problem characterised by poor speech perception, especially in challenging environments. APD may include an inherited component, and this may be major, but studies reviewed here of children with long-term otitis media with effusion (OME) provide strong evidence…

  1. Acquisitions. ERIC Processing Manual, Section II.

    ERIC Educational Resources Information Center

    Sundstrom, Grace, Ed.

    Rules and guidelines are provided for the process of acquiring documents to be considered for inclusion in the ERIC database. The differing responsibilities of the Government, the ERIC Clearinghouses, and the ERIC Facility are delineated. The various methods by which documentary material can be obtained are described and preferences outlined.…

  2. An Auditable Performance Based Software Acquisition Process

    DTIC Science & Technology

    2010-04-28

    All Rights Reserved SPG SSTC 2010 Inspections - Peer Reviews • Over time, each term has become ambiguous • Many times the two terms are used...interchangeably Stewart-Priven believe: • Inspections are a rigorous form of Peer Reviews • Peer Reviews are not necessarily Inspections – Peer ...Inspection tool reports for process conformance and Computerized Inspection Tools 2 Perform gap analysis and map project’s Inspection (or Peer-Review) capabilities

  3. Budgeting and Acquisition Business Process Reform

    DTIC Science & Technology

    2007-11-07

    reform issues. He has authored more than one hundred journal articles and book chapters on topics including national defense budgeting and policy...programming and budgeting cycles, while still preserving the decisions made in the on-year cycle through the off-year by limiting reconsideration of...POM were eliminated and replaced by a process of longer-term budgeting. In traditional budgeting, budget submitting offices (BSOs) have to answer

  4. SALTSTONE PROCESSING FACILITY TRANSFER SAMPLE

    SciTech Connect

    Cozzi, A.; Reigel, M.

    2010-08-04

    On May 19, 2010, the Saltstone Production Facility inadvertently transferred 1800 gallons of untreated waste from the salt feed tank (SFT) to Vault 4. During shutdown, approximately 70 gallons of the material was left in the Saltstone hopper. A sample of the slurry in the hopper was sent to Savannah River National Laboratory (SRNL) for analysis of density, pH, and the eight Resource Conservation and Recovery Act (RCRA) metals. The sample was hazardous for chromium, mercury, and pH. The sample received from the Saltstone hopper was examined visually while sample aliquots were taken and while the sample was allowed to settle. The sample contains solids that settle in approximately 20 minutes (Figure 3-1). A floating layer forms on top of the supernate during settling and disperses when the sample is agitated (Figure 3-2). The untreated waste inadvertently transferred from the SFT to Vault 4 was toxic for chromium and mercury, and the pH of the sample is at the regulatory limit.

  5. 75 FR 62069 - Federal Acquisition Regulation; Sudan Waiver Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-07

    ... Federal Acquisition Regulation; Sudan Waiver Process AGENCIES: Department of Defense (DoD), General..., Prohibition on contracting with entities that conduct restricted business operations in Sudan, to add specific... on awarding a contract to a contractor that conducts business in Sudan should be waived....

  6. Reading Acquisition Enhances an Early Visual Process of Contour Integration

    ERIC Educational Resources Information Center

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched…

  7. Low Cost Coherent Doppler Lidar Data Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce W.; Koch, Grady J.

    2003-01-01

    The work described in this paper details the development of a low-cost, short-development time data acquisition and processing system for a coherent Doppler lidar. This was done using common laboratory equipment and a small software investment. This system provides near real-time wind profile measurements. Coding flexibility created a very useful test bed for new techniques.

  8. Thermal mapping and trends of Mars analog materials in sample acquisition operations using experimentation and models

    NASA Astrophysics Data System (ADS)

    Szwarc, Timothy; Hubbard, Scott

    2014-09-01

    The effects of atmosphere, ambient temperature, and geologic material were studied experimentally and using a computer model to predict the heating undergone by Mars rocks during rover sampling operations. Tests were performed on five well-characterized and/or Mars analog materials: Indiana limestone, Saddleback basalt, kaolinite, travertine, and water ice. Eighteen tests were conducted to 55 mm depth using a Mars Sample Return prototype coring drill, with each sample containing six thermal sensors. A thermal simulation was written to predict the complete thermal profile within each sample during coring and this model was shown to be capable of predicting temperature increases with an average error of about 7%. This model may be used to schedule power levels and periods of rest during actual sample acquisition processes to avoid damaging samples or freezing the bit into icy formations. Maximum rock temperature increase is found to be modeled by a power law incorporating rock and operational parameters. Energy transmission efficiency in coring is found to increase linearly with rock hardness and decrease by 31% at Mars pressure.
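    The power-law relationship reported above can be recovered by linear regression in log-log space. The sketch below is a minimal illustration of that procedure on synthetic data; the variable `x`, the coefficients, and the noise level are placeholders, not the paper's actual rock or operational parameters.

```python
import numpy as np

# Fit dT = a * x**b by regressing log(dT) on log(x).
# x, a_true, b_true and the 2% noise are illustrative assumptions.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 50.0, 30)           # e.g. a drilling parameter (arbitrary units)
a_true, b_true = 2.5, 0.7
dT = a_true * x**b_true * rng.normal(1.0, 0.02, x.size)  # multiplicative noise

b_fit, log_a_fit = np.polyfit(np.log(x), np.log(dT), 1)
a_fit = np.exp(log_a_fit)
print(round(a_fit, 2), round(b_fit, 2))
```

Regressing in log space turns the power law into a straight line, so `np.polyfit` with degree 1 suffices; the slope estimates the exponent and the intercept the prefactor.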

  9. Science of Test Measurement Accuracy - Data Sampling and Filter Selection during Data Acquisition

    DTIC Science & Technology

    2015-06-01

    sampling rate, aliasing, filtering, Butterworth, Chebyshev, Bessel, PSD and Bode plots...and PSD plots. SCIENCE OF TEST Measurement Accuracy - Data Sampling and Filter Selection during Data Acquisition David Kidman Air...Use both Bode and PSD plots to evaluate filter and sample rate effects prior to implementation Summary Reference: The Scientist and Engineer’s
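    The sampling-rate and aliasing concern named in the keywords can be demonstrated in a few lines: a tone above the Nyquist frequency folds back to an alias at |fs - f|. A minimal sketch, with illustrative frequencies not taken from the report:

```python
import numpy as np

# A 60 Hz tone sampled at 100 Hz (below the required 120 Hz)
# appears at the alias frequency |100 - 60| = 40 Hz.
fs, f = 100.0, 60.0
n = 1000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f * t)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1 / fs)
peak = freqs[np.argmax(spectrum)]
print(peak)  # spectral peak lands at 40.0 Hz, not 60 Hz
```

An anti-aliasing filter ahead of the digitizer, of the Butterworth/Chebyshev/Bessel families the report compares, is what prevents this fold-back.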

  10. Evaluation of high performance data acquisition boards for simultaneous sampling of fast signals from PET detectors

    NASA Astrophysics Data System (ADS)

    Judenhofer, Martin S.; Pichler, Bernd J.; Cherry, Simon R.

    2005-01-01

    Detectors used for positron emission tomography (PET) provide fast, randomly distributed signals that need to be digitized for further processing. One possibility is to sample the signals at the peak initiated by a trigger from a constant fraction discriminator (CFD). For PET detectors, simultaneous acquisition of many channels is often important. To develop and evaluate novel PET detectors, a flexible, relatively low cost and high performance laboratory data acquisition (DAQ) system is therefore required. The use of dedicated DAQ systems, such as multi-channel analysers (MCAs) or continuous sampling boards at high rates, is expensive. This work evaluates the suitability of well-priced peripheral component interconnect (PCI)-based 8-channel DAQ boards (PD2-MFS-8 2M/14 and PD2-MFS-8-500k/14, United Electronic Industries Inc., Canton, MA, USA) for signal acquisition from novel PET detectors. A software package was developed to access the boards, measure basic board parameters, and to acquire, visualize, and analyse energy spectra and position profiles from block detectors. The performance tests showed that the board's input linearity is >99.2% and the standard deviation is <9 mV at 10 V for constant signals. Synchronous sampling of multiple channels and external synchronization of additional boards are possible at rates up to 240 kHz per channel. Signals with rise times as fast as 130 ns (<2 V amplitude) can be acquired without slew rate effects. However, for signals with amplitudes of up to 5 V, a rise time slower than 250 ns is required. The measured energy resolution of a lutetium oxyorthosilicate (LSO)-photomultiplier tube (PMT) detector with a 22Na source was 14.9% (FWHM) at 511 keV and is slightly better than the result obtained with a high-end single channel MCA (8000A, Amptek, USA) using the same detector (16.8%). The crystals (1.2 × 1.2 × 12 mm3) within a 9 × 9 LSO block detector could be clearly separated in an acquired position profile. Thus, these boards are
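    The 14.9% figure quoted above is the full width at half maximum (FWHM) of the photopeak expressed as a percentage of its 511 keV position. A minimal sketch of that computation on a synthetic Gaussian photopeak; the width is chosen to reproduce roughly 14.9% and is illustrative only:

```python
import numpy as np

# Synthesize a Gaussian 511 keV photopeak and estimate FWHM
# from its histogram (FWHM = 2.355 * sigma for a Gaussian).
rng = np.random.default_rng(1)
peak_keV = 511.0
sigma = peak_keV * 0.149 / 2.355          # sigma giving ~14.9% FWHM
events = rng.normal(peak_keV, sigma, 200_000)

counts, edges = np.histogram(events, bins=200, range=(300, 700))
centers = (edges[:-1] + edges[1:]) / 2
half = counts.max() / 2
above = centers[counts >= half]
fwhm = above.max() - above.min()
resolution = 100 * fwhm / centers[counts.argmax()]
print(round(resolution, 1))  # close to the assumed 14.9
```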

  11. Evidence on the Effect of DoD Acquisition Policy and Process on Cost Growth of Major Defense Acquisition Programs

    DTIC Science & Technology

    2015-05-13

    Evidence on the Effect of DoD Acquisition Policy and Process on Cost Growth of Major Defense Acquisition Programs Naval Postgraduate School 12th... • There are no obvious candidates. • The paper (Appendix B) provides evidence that PAUC growth is not systematically influenced by changes in budget

  12. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
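    The representativeness problem described above can be illustrated with a toy simulation: when residues are carried by a small fraction of particles, tiny test portions give much noisier results than larger ones. The particle counts and the 1% contamination rate below are assumptions for the sketch, not values from the paper.

```python
import numpy as np

# Lot of 1,000,000 particles, 1% of which carry residue (assumed).
rng = np.random.default_rng(42)
lot = rng.random(1_000_000) < 0.01

def measured_fraction(portion_particles, trials=2000):
    # Draw many random test portions and report the measured
    # residue fraction of each.
    draws = rng.choice(lot, size=(trials, portion_particles))
    return draws.mean(axis=1)

small = measured_fraction(100)       # micro test portion
large = measured_fraction(10_000)    # conventional larger portion
print(small.std() > 5 * large.std()) # small portions are far noisier
```

The spread shrinks roughly with the square root of portion size, which is why mass reduction without homogenization undermines an otherwise sophisticated analytical method.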

  13. The CINCS (Commanders-in-Chief) and the Acquisition Process

    DTIC Science & Technology

    1988-09-01

    weight to joint views in the PPB and acquisition processes. The current Joint Program Analysis Memorandum (JPAM) is considered in OSD as a pulling...the senior warfighters who can best judge the ultimate use of weapons. In the final analysis, that acceptance should rest on the capabilities of the...program could be a restructured Joint Program Analysis Memorandum which could serve as the Chairman’s Program Assessment Memorandum. Short of this

  14. Autonomous site selection and instrument positioning for sample acquisition

    NASA Astrophysics Data System (ADS)

    Shaw, A.; Barnes, D.; Pugh, S.

    The European Space Agency Aurora Exploration Program aims to establish a European long-term programme for the exploration of Space, culminating in a human mission to space in the 2030 timeframe. Two flagship missions, namely Mars Sample Return and ExoMars, have been proposed as recognised steps along the way. The ExoMars Rover is the first of these flagship missions and includes a rover carrying the Pasteur Payload, a mobile exobiology instrumentation package, and the Beagle 2 arm. The primary objective is the search for evidence of past or present life on Mars, but the payload will also study the evolution of the planet and the atmosphere, look for evidence of seismological activity and survey the environment in preparation for future missions. The operation of rovers in unknown environments is complicated, and requires large resources not only on the planet but also in ground based operations. Currently, this can be very labour intensive, and costly, if large teams of scientists and engineers are required to assess mission progress, plan mission scenarios, and construct a sequence of events or goals for uplink. Furthermore, the constraints in communication imposed by the time delay involved over such large distances, and the line-of-sight required, make autonomy paramount to mission success, affording the ability to operate in the event of communications outages and to be opportunistic with respect to scientific discovery. As part of this drive to reduce mission costs and increase autonomy, the Space Robotics group at the University of Wales, Aberystwyth is researching methods of autonomous site selection and instrument positioning, directly applicable to the ExoMars mission. The site selection technique used builds on the geometric reasoning algorithms used previously for localisation and navigation [Shaw 03]. It is proposed that a digital elevation model (DEM) of the local surface, generated during traverse and without interaction from ground based operators, can be

  15. Sampling and sample processing in pesticide residue analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Proper sampling and sample processing in pesticide residue analysis of food and soil has always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agroc...

  16. Mosaic acquisition and processing for optical-resolution photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Shao, Peng; Shi, Wei; Chee, Ryan K. W.; Zemp, Roger J.

    2012-08-01

    In optical-resolution photoacoustic microscopy (OR-PAM), data acquisition time is limited by both laser pulse repetition rate (PRR) and scanning speed. Optical scanning offers high speed but a limited field of view determined by ultrasound transducer sensitivity. In this paper, we propose a hybrid optical and mechanical-scanning OR-PAM system with mosaic data acquisition and processing. The system employs fast-scanning mirrors and a diode-pumped, nanosecond-pulsed, Ytterbium-doped, 532-nm fiber laser with PRR up to 600 kHz. Data from a sequence of image mosaic patches is acquired systematically, at predetermined mechanical scanning locations, with optical scanning. After all imaging locations are covered, a large panoramic scene is generated by stitching the mosaic patches together. Our proposed system is shown to be at least 20 times faster than previously reported OR-PAM systems.
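    The stitching step described above can be sketched as a grid assembly of patches. This is a minimal non-overlapping mosaic with placeholder patch sizes and grid layout, not the actual OR-PAM registration pipeline:

```python
import numpy as np

# Patches acquired at predetermined mechanical-scan positions are
# assembled into one panorama. Sizes are illustrative.
patch_h, patch_w = 64, 64
grid_rows, grid_cols = 3, 4

rng = np.random.default_rng(0)
patches = [[rng.random((patch_h, patch_w)) for _ in range(grid_cols)]
           for _ in range(grid_rows)]

# Non-overlapping stitch: place each patch at its grid position.
panorama = np.block(patches)
print(panorama.shape)  # (192, 256)
```

A real system would additionally register overlapping patch borders before blending; `np.block` only covers the ideal case of perfectly aligned, abutting patches.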

  17. The logical syntax of number words: theory, acquisition and processing.

    PubMed

    Musolino, Julien

    2009-04-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. Cognition, 93, 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Experiments at the semantics-pragmatics interface. Cognition, 86, 253-282; Hurewitz, F., Papafragou, A., Gleitman, L., Gelman, R. (2006). Asymmetries in the acquisition of numbers and quantifiers. Language Learning and Development, 2, 76-97; Huang, Y. T., Snedeker, J., Spelke, L. (submitted for publication). What exactly do numbers mean?]. Specifically, these studies have shown that data from experimental investigations of child language can be used to illuminate core theoretical issues in the semantic and pragmatic analysis of number terms. In this article, I extend this approach to the logico-syntactic properties of number words, focusing on the way numerals interact with each other (e.g. Three boys are holding two balloons) as well as with other quantified expressions (e.g. Three boys are holding each balloon). On the basis of their intuitions, linguists have claimed that such sentences give rise to at least four different interpretations, reflecting the complexity of the linguistic structure and syntactic operations involved. Using psycholinguistic experimentation with preschoolers (n=32) and adult speakers of English (n=32), I show that (a) for adults, the intuitions of linguists can be verified experimentally, (b) by the age of 5, children have knowledge of the core aspects of the logical syntax of number words, (c) in spite of this knowledge, children nevertheless differ from adults in systematic ways, (d) the differences observed between children and adults can be accounted for on the basis of an independently motivated, linguistically-based processing model [Geurts, B. (2003). Quantifying kids. Language

  18. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, John E.; Dinsmore, Stanley R.; Chandler, Edward W.

    1986-01-01

    A four-port disc valve for sampling erosive, high temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fired faceplates defining flow passageways positioned to be alternatively in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for a smooth and precise operation when used under harsh process conditions.

  19. Space science technology: In-situ science. Sample Acquisition, Analysis, and Preservation Project summary

    NASA Technical Reports Server (NTRS)

    Aaron, Kim

    1991-01-01

    The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.

  20. PET/CT for radiotherapy: image acquisition and data processing.

    PubMed

    Bettinardi, V; Picchio, M; Di Muzio, N; Gianolli, L; Messa, C; Gilardi, M C

    2010-10-01

    This paper focuses on acquisition and processing methods in positron emission tomography/computed tomography (PET/CT) for radiotherapy (RT) applications. The recent technological evolutions of PET/CT systems are described. Particular emphasis is dedicated to the tools needed for patient positioning and immobilization, to be used in PET/CT studies as well as during RT treatment sessions. The effect of organ and lesion motion due to the patient's respiration on PET/CT imaging is discussed. Breathing protocols proposed to minimize PET/CT spatial mismatches in relation to respiratory movements are illustrated. The respiratory gated (RG) 4D-PET/CT techniques, developed to measure and compensate for organ and lesion motion, are then introduced. Finally, a description is provided of different acquisition and data processing techniques, implemented with the aim of improving: i) image quality and quantitative accuracy of PET images, and ii) target volume definition and treatment planning in RT, by using specific and personalised motion information.
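    The respiratory-gated (RG) 4D binning mentioned above amounts to assigning each event to a phase bin of the breathing cycle. A minimal sketch, assuming a fixed 4 s respiratory period and 8 gates (both illustrative choices, not values from the paper):

```python
import numpy as np

# Assign each acquisition event to one of n_bins respiratory gates
# based on its phase within an assumed fixed breathing period.
period, n_bins = 4.0, 8
rng = np.random.default_rng(0)
event_times = np.sort(rng.uniform(0, 60, 10_000))

phase = (event_times % period) / period           # 0..1 within the cycle
bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
counts = np.bincount(bins, minlength=n_bins)
print(counts.sum())  # every event lands in exactly one gate
```

In practice the phase comes from a measured respiratory trace (belt, spirometer, or camera) rather than a fixed period, and each gate is reconstructed into its own 3D volume.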

  1. Reading acquisition enhances an early visual process of contour integration.

    PubMed

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched in age, socioeconomic and cultural characteristics, on a contour integration task known to depend on early visual processing. Stimuli consisted of a closed egg-shaped contour made of disconnected Gabor patches, within a background of randomly oriented Gabor stimuli. Subjects had to decide whether the egg was pointing left or right. Difficulty was varied by jittering the orientation of the Gabor patches forming the contour. Contour integration performance was lower in illiterates than in both ex-illiterate and literate controls. We argue that this difference in contour perception must reflect a genuine difference in visual function. According to this view, the intensive perceptual training that accompanies reading acquisition also improves early visual abilities, suggesting that the impact of literacy on the visual system is more widespread than originally proposed.

  2. Directed Sample Interrogation Utilizing an Accurate Mass Exclusion-Based Data-Dependent Acquisition Strategy (AMEx)

    PubMed Central

    Rudomin, Emily L.; Carr, Steven A.; Jaffe, Jacob D.

    2009-01-01

    The ability to perform thorough sampling is of critical importance when using mass spectrometry to characterize complex proteomic mixtures. A common approach is to re-interrogate a sample multiple times by LC-MS/MS. However, the conventional data-dependent acquisition methods that are typically used in proteomics studies will often redundantly sample high-intensity precursor ions while failing to sample low-intensity precursors entirely. We describe a method wherein the masses of successfully identified peptides are used to generate an accurate mass exclusion list such that those precursors are not selected for sequencing during subsequent analyses. We performed multiple concatenated analytical runs to sample a complex cell lysate, using either accurate mass exclusion-based data-dependent acquisition (AMEx) or standard data-dependent acquisition, and found that utilization of AMEx on an ESI-Orbitrap instrument significantly increases the total number of validated peptide identifications relative to a standard DDA approach. The additional identified peptides represent precursor ions that exhibit low signal intensity in the sample. Increasing the total number of peptide identifications augmented the number of proteins identified, as well as improved the sequence coverage of those proteins. Together, these data indicate that using AMEx is an effective strategy to improve the characterization of complex proteomic mixtures. PMID:19344186

  3. Directed sample interrogation utilizing an accurate mass exclusion-based data-dependent acquisition strategy (AMEx).

    PubMed

    Rudomin, Emily L; Carr, Steven A; Jaffe, Jacob D

    2009-06-01

    The ability to perform thorough sampling is of critical importance when using mass spectrometry to characterize complex proteomic mixtures. A common approach is to reinterrogate a sample multiple times by LC-MS/MS. However, the conventional data-dependent acquisition methods that are typically used in proteomics studies will often redundantly sample high-intensity precursor ions while failing to sample low-intensity precursors entirely. We describe a method wherein the masses of successfully identified peptides are used to generate an accurate mass exclusion list such that those precursors are not selected for sequencing during subsequent analyses. We performed multiple concatenated analytical runs to sample a complex cell lysate, using either accurate mass exclusion-based data-dependent acquisition (AMEx) or standard data-dependent acquisition, and found that utilization of AMEx on an ESI-Orbitrap instrument significantly increases the total number of validated peptide identifications relative to a standard DDA approach. The additional identified peptides represent precursor ions that exhibit low signal intensity in the sample. Increasing the total number of peptide identifications augmented the number of proteins identified, as well as improved the sequence coverage of those proteins. Together, these data indicate that using AMEx is an effective strategy to improve the characterization of complex proteomic mixtures.
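    The core of the AMEx strategy described above is a ppm-tolerance exclusion test against previously identified precursor masses. A minimal sketch, with illustrative m/z values and a 10 ppm tolerance (the paper's actual tolerance is not stated here):

```python
# Build an exclusion test from already-identified precursor m/z values
# so the next run skips them and samples lower-intensity precursors.
def build_exclusion_filter(identified_mz, ppm_tol=10.0):
    identified = sorted(identified_mz)
    def is_excluded(mz):
        # Excluded if within ppm_tol of any identified mass.
        return any(abs(mz - m) / m * 1e6 <= ppm_tol for m in identified)
    return is_excluded

excluded = build_exclusion_filter([500.2651, 750.3982])
candidates = [500.2650, 500.9000, 750.4050, 640.3311]
to_sequence = [mz for mz in candidates if not excluded(mz)]
print(to_sequence)  # only previously unseen precursors remain
```

A production implementation would use a sorted array with binary search (or an instrument-vendor exclusion-list API) rather than a linear scan, but the tolerance logic is the same.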

  4. Predator Acquisition Program Transition from Rapid to Standard Processes

    DTIC Science & Technology

    2012-06-08

    the first Advanced Concept Technology Demonstration to transition into the Defense Acquisition System. When it did, it operated within the Air Force’s... ACRONYMS: ACTD Advanced Concept Technology Demonstration; AF Air Force; DAB Defense Acquisition Board; DARO Defense

  5. A needle-free technique for interstitial fluid sample acquisition using a lorentz-force actuated jet injector.

    PubMed

    Chang, Jean H; Hogan, N Catherine; Hunter, Ian W

    2015-08-10

    We present a novel method of quickly acquiring dermal interstitial fluid (ISF) samples using a Lorentz-force actuated needle-free jet injector. The feasibility of the method is first demonstrated on post-mortem porcine tissue. The jet injector is used to first inject a small volume of physiological saline to breach the skin, and the back-drivability of the actuator is utilized to create negative pressure in the ampoule and collect ISF. The effect of the injection and extraction parameters on sample dilution and extracted volumes is investigated. A simple finite element model is developed to demonstrate why this acquisition method results in faster extractions than conventional sampling methods. Using this method, we are able to collect a sample that contains up to 3.5% ISF in 3.1 s from post-mortem skin. The trends revealed from experimentation on post-mortem skin are then used to identify the parameters for a live animal study. The feasibility of the acquisition process is successfully demonstrated using live rats; the process is shown to extract samples that have been diluted by a factor of 111-125.

  6. Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process

    NASA Technical Reports Server (NTRS)

    Guiltinan, J.

    1976-01-01

    Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.

  7. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1984-08-16

    This is a patent for a disc-type, four-port sampling valve for service with erosive high temperature process streams. Inserts and liners of α-silicon carbide, respectively in the faceplates and in the sampling cavities, limit erosion while providing lubricity for a smooth and precise operation. 1 fig.

  8. Experimental Determination of GPR Groundwave Sampling Depth as a Function of Data Acquisition Parameters

    NASA Astrophysics Data System (ADS)

    Crist, T. L.; Benda, A.; Grote, K. R.

    2010-12-01

    , then the plot was sprinkled with water for one hour, and more GPR data were acquired. GPR data acquisition continued to be interspersed with sprinkling over the plot area in one hour increments until the TDR probes no longer detected a change in soil moisture with time and the soil appeared to be saturated. The soil was then allowed to dry, and GPR data were acquired periodically throughout the drying period. GPR data were collected using an adjustable sled system and multi-channel adapter which allowed simultaneous data acquisition with four frequencies. The four frequencies were pulled in common-offset mode along a traverse in the center of the plot area. Then, the antenna offset was changed for each frequency, and the traverse was repeated. A total of five different antenna offsets for each frequency was used each time GPR data were acquired. Data processing is ongoing, but preliminary comparisons of the data acquired with different frequencies and antenna offsets confirm that the sampling depth is frequency dependent under both wetting and drying conditions and show that antenna separation does not appear to significantly affect the sampling depth.

  9. Disc valve for sampling erosive process streams

    DOEpatents

    Mrochek, J.E.; Dinsmore, S.R.; Chandler, E.W.

    1986-01-07

    A four-port disc valve is described for sampling erosive, high temperature process streams. A rotatable disc defining opposed first and second sampling cavities rotates between fired faceplates defining flow passageways positioned to be alternatively in axial alignment with the first and second cavities. Silicon carbide inserts and liners composed of α-silicon carbide are provided in the faceplates and in the sampling cavities to limit erosion while providing lubricity for a smooth and precise operation when used under harsh process conditions. 1 fig.

  10. A prototype data acquisition and processing system for Schumann resonance measurements

    NASA Astrophysics Data System (ADS)

    Tatsis, Giorgos; Votis, Constantinos; Christofilakis, Vasilis; Kostarakis, Panos; Tritakis, Vasilis; Repapis, Christos

    2015-12-01

    In this paper, a cost-effective prototype data acquisition system specifically designed for Schumann resonance measurements, together with an adequate signal processing method, is described in detail. The implemented system captures the magnetic component of the Schumann resonance signal, using a magnetic antenna, at much higher sampling rates than the Nyquist rate for efficient signal improvement. In order to obtain the characteristics of the individual resonances of the SR spectrum, new and efficient software was developed. The processing techniques used in this software are analyzed thoroughly below. The system's performance and operation are evaluated using preliminary measurements taken in the region of Northwest Greece.
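    The peak-extraction step described above can be sketched by locating spectral maxima of an oversampled signal. The sampling rate, amplitudes, and noise level below are assumptions, not the prototype's settings; only the nominal frequencies of the first Schumann modes (~7.8, 14.3, 20.8 Hz) come from the literature.

```python
import numpy as np

# Synthesize 60 s of signal containing the first three Schumann
# modes in noise, then find the dominant spectral peak.
fs = 512.0                                # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
signal = sum(a * np.sin(2 * np.pi * f * t)
             for a, f in [(1.0, 7.8), (0.6, 14.3), (0.4, 20.8)])
signal += rng.normal(0, 0.5, t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
band = (freqs > 5) & (freqs < 25)         # restrict to the SR band
peak = freqs[band][np.argmax(spectrum[band])]
print(round(peak, 1))  # strongest mode near 7.8 Hz
```

Real SR processing would average many windows (e.g. Welch's method) and fit Lorentzian line shapes to each mode to extract center frequency, amplitude, and Q-factor; the single-FFT peak pick above only shows the core idea.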

  11. Troubleshooting digital macro photography for image acquisition and the analysis of biological samples.

    PubMed

    Liepinsh, Edgars; Kuka, Janis; Dambrova, Maija

    2013-01-01

    For years, image acquisition and analysis have been an important part of life science experiments to ensure the adequate and reliable presentation of research results. Since the development of digital photography and digital planimetric methods for image analysis approximately 20 years ago, new equipment and technologies have emerged, which have increased the quality of image acquisition and analysis. Different techniques are available to measure the size of stained tissue samples in experimental animal models of disease; however, the most accurate method is digital macro photography with software that is based on planimetric analysis. In this study, we described the methodology for the preparation of infarcted rat heart and brain tissue samples before image acquisition, digital macro photography techniques and planimetric image analysis. These methods are useful in the macro photography of biological samples and subsequent image analysis. In addition, the techniques that are described in this study include the automated analysis of digital photographs to minimize user input and exclude the risk of researcher-generated errors or bias during image analysis.

  12. Acquisition and Post-Processing of Immunohistochemical Images.

    PubMed

    Sedgewick, Jerry

    2017-01-01

Augmentation of digital images is almost always a necessity in order to obtain a reproduction that matches the appearance of the original. However, that augmentation can mislead if it is done incorrectly and not within reasonable limits. When procedures are in place for ensuring that originals are archived, and image manipulation steps reported, scientists not only follow good laboratory practices, but avoid ethical issues associated with post-processing, and protect their labs from any future allegations of scientific misconduct. Also, when procedures are in place for correct acquisition of images, the extent of post-processing is minimized or eliminated. These procedures include white balancing (for brightfield images), keeping tonal values within the dynamic range of the detector, frame averaging to eliminate noise (typically in fluorescence imaging), use of the highest bit depth when a choice is available, flatfield correction, and archiving of the image in a non-lossy format (not JPEG). When post-processing is necessary, the commonly used applications for correction include Photoshop and ImageJ, but a free program (GIMP) can also be used. Corrections to images include scaling the bit depth to higher and lower ranges, removing color casts from brightfield images, setting brightness and contrast, reducing color noise, reducing "grainy" noise, conversion of pure colors to grayscale, conversion of grayscale to colors typically used in fluorescence imaging, correction of uneven illumination (flatfield correction), merging color images (fluorescence), and extending the depth of focus. These corrections are explained in step-by-step procedures in the chapter that follows.
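Flatfield correction, mentioned twice above, is simple enough to sketch: divide the acquired image by a flat-field reference frame (an image of a blank field under the same optics) and rescale by the flat's mean so intensities stay in the original range. This is a minimal numpy illustration with synthetic data, not the chapter's actual procedure.

```python
import numpy as np

# Synthetic example: a uniform specimen imaged under uneven illumination.
illum = np.linspace(0.5, 1.0, 64)[None, :] * np.ones((64, 64))  # darker on the left
specimen = np.full((64, 64), 100.0)
raw = specimen * illum                 # acquired (shaded) image
flat = illum.copy()                    # flat-field reference frame, same optics

corrected = raw * flat.mean() / flat   # flat-field correction
# the shading is removed: every pixel returns to the same value
```

In practice the flat frame is itself averaged over several exposures and dark-frame subtracted before use.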

  13. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  14. Chapter A5. Processing of Water Samples

    USGS Publications Warehouse

    Wilde, Franceska D.; Radtke, Dean B.; Gibs, Jacob; Iwatsubo, Rick T.

    1999-01-01

The National Field Manual for the Collection of Water-Quality Data (National Field Manual) describes protocols and provides guidelines for U.S. Geological Survey (USGS) personnel who collect data used to assess the quality of the Nation's surface-water and ground-water resources. This chapter addresses methods to be used in processing water samples to be analyzed for inorganic and organic chemical substances, including the bottling of composite, pumped, and bailed samples and subsamples; sample filtration; solid-phase extraction for pesticide analyses; sample preservation; and sample handling and shipping. Each chapter of the National Field Manual is published separately and revised periodically. Newly published and revised chapters will be announced on the USGS Home Page on the World Wide Web under 'New Publications of the U.S. Geological Survey.' The URL for this page is http://water.usgs.gov/lookup/get?newpubs.

  15. Quality evaluation of processed clay soil samples

    PubMed Central

    Steiner-Asiedu, Matilda; Harrison, Obed Akwaa; Vuvor, Frederick; Tano-Debrah, Kwaku

    2016-01-01

Introduction: This study assessed the microbial quality of clay samples sold in two of the major Ghanaian markets. Methods: The study was a cross-sectional evaluation of processed clay and of the effects its consumption has on the nutrition of consumers in the political capital of Ghana. The items examined were processed clay soil samples. Results: Staphylococcus spp. and fecal coliforms, including Klebsiella, Escherichia, Shigella, and Enterobacter spp., were isolated from the clay samples. Samples from the Kaneshie market in Accra recorded the highest total viable count (6.5 log cfu/g) and staphylococcal count (5.8 log cfu/g). For fecal coliforms, Madina market samples had the highest count (6.5 log cfu/g) and also recorded the highest levels of yeasts and moulds. For Koforidua, the total viable count was highest in samples from the Zongo market (6.3 log cfu/g). Central market samples had the highest counts of fecal coliforms (4.6 log cfu/g) and of yeasts and moulds (6.5 log cfu/g). The "Small" market recorded the highest staphylococcal count (6.2 log cfu/g). The water activity of the clay samples was low, ranging between 0.65±0.01 and 0.66±0.00 for samples collected from Koforidua and Accra, respectively. Conclusion: The clay samples were found to contain Klebsiella, Escherichia, Enterobacter, and Shigella spp., Staphylococcus spp., and yeasts and moulds. These have health implications when the clay is consumed. PMID:27642456

  16. Sample Acquisition and Analytical Chemistry Challenges to Verifying Compliance to Aviators Breathing Oxygen (ABO) Purity Specification

    NASA Technical Reports Server (NTRS)

    Graf, John

    2015-01-01

NASA has been developing and testing two different types of oxygen separation systems. One type uses pressure-swing technology; the other uses a solid electrolyte electrochemical oxygen separation cell. Both development systems have been subjected to long-term testing and to performance testing under a variety of environmental and operational conditions. Testing these two systems revealed that measuring the product purity of oxygen and determining whether an oxygen separation device meets Aviator's Breathing Oxygen (ABO) specifications is a subtle and sometimes difficult analytical chemistry job. Verifying the product purity of cryogenically produced oxygen presents a different set of analytical chemistry challenges. This presentation will describe some of the sample acquisition and analytical chemistry challenges presented by verifying oxygen produced by an oxygen separator, and by verifying oxygen produced by cryogenic separation processes. The primary contaminant that causes gas samples to fail to meet ABO requirements is water. The maximum amount of water vapor allowed is 7 ppmv. The principal challenge of verifying oxygen produced by an oxygen separator is that it is produced relatively slowly, and at comparatively low temperatures. A short-term failure that occurs for just a few minutes in the course of a one-week run could cause an entire tank to be rejected. Continuous monitoring of oxygen purity and water vapor could identify problems as soon as they occur. Long-term oxygen separator tests were instrumented with an oxygen analyzer and with a hygrometer, a GE Moisture Monitor Series 35. This hygrometer uses an aluminum oxide sensor. The user's manual does not report this, but long-term exposure to pure oxygen causes the aluminum oxide sensor head to bias dry. Oxygen product that exceeded the 7 ppm specification was improperly accepted, because the sensor had biased. The bias is permanent; exposure to air does not cause the sensor to
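Dew-point hygrometers like the one described report a temperature, so checking the 7 ppmv water limit requires converting a dew/frost-point reading into a mole fraction. A rough conversion via the Magnus approximation for saturation vapor pressure is sketched below; the constants are a common textbook approximation over water, not values from the GE instrument, so treat the output as indicative only.

```python
import math

def ppmv_water(dewpoint_c: float, pressure_kpa: float = 101.325) -> float:
    """Approximate water vapor content in ppmv from a dew-point reading,
    using the Magnus formula for saturation vapor pressure (over water)."""
    es_kpa = 0.61094 * math.exp(17.625 * dewpoint_c / (dewpoint_c + 243.04))
    return es_kpa / pressure_kpa * 1e6

# drier gas -> lower dew point -> lower ppmv; the 7 ppmv ABO limit sits
# far below everyday dew points, in the -60 to -70 degC region at 1 atm
```

A real verification would use the saturation curve over ice at these temperatures and correct for line pressure, both of which shift the numbers noticeably.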

  17. The Logical Syntax of Number Words: Theory, Acquisition and Processing

    ERIC Educational Resources Information Center

    Musolino, Julien

    2009-01-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. "Cognition 93", 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Scalar…

  18. Mars sampling strategy and aeolian processes

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald

    1988-01-01

It is critical that the geological context of planetary samples (both in situ analyses and returned samples) be well known and documented. Apollo experience showed that this goal is often difficult to achieve even for a planet on which surficial processes are relatively restricted. On Mars, the variety of present and past surface processes is much greater than on the Moon, and establishing the geological context of samples will be much more difficult. In addition to impact cratering, Mars has been modified by running water, periglacial activity, wind, and other processes, all of which have the potential for profoundly affecting the geological integrity of potential samples. Aeolian, or wind, processes are ubiquitous on Mars. In the absence of liquid water on the surface, aeolian activity dominates the present surface, as documented by frequent dust storms (both local and global), landforms such as dunes, and variable features, i.e., albedo patterns that change their size, shape, and position with time in response to the wind.

  19. Surface studies of plasma processed Nb samples

    SciTech Connect

    Tyagi, Puneet V.; Doleans, Marc; Hannah, Brian S.; Afanador, Ralph; Stewart, Stephen; Mammosser, John; Howell, Matthew P; Saunders, Jeffrey W; Degraff, Brian D; Kim, Sang-Ho

    2015-01-01

Contaminants present at the top surface of superconducting radio frequency (SRF) cavities can act as field emitters and restrict the cavity accelerating gradient. A room-temperature in-situ plasma processing technology for SRF cavities, aiming to clean hydrocarbons from the inner surface of cavities, has recently been developed at the Spallation Neutron Source (SNS). Surface studies of the plasma-processed Nb samples by secondary ion mass spectrometry (SIMS) and scanning Kelvin probe (SKP) showed that the Ne/O2 plasma processing is very effective at removing carbonaceous contaminants from the top surface and improves the surface work function by 0.5 to 1.0 eV.

  20. Phase lock acquisition for sampled data PLLs using the sweep technique

    NASA Technical Reports Server (NTRS)

    Aguirre, S.; Brown, D. H.; Hurd, W. J.

    1986-01-01

Simulation results of the swept-acquisition performance of residual carrier phase-locked loops (PLLs) are reported. The loops investigated are sampled-data counterparts of the continuous-time type II and type III loops currently in use in Deep Space Network receivers. It was found that sweep rates of 0.2 B_L^2 to 0.4 B_L^2 Hz/s can be used, depending on the loop parameters and loop signal-to-noise ratio (SNR), where B_L is the one-sided loop noise bandwidth. Type III loops are shown to be less reliable than type II loops for acquisition using this technique, especially at low SNRs.

  1. Respirable coal mine dust sample processing

    SciTech Connect

    Raymond, L.D.; Tomb, T.F.; Parobeck, P.S.

    1987-01-01

The Federal Coal Mine Health and Safety Act of 1969 established mandatory dust standards for coal mines. Regulatory requirements for complying with the provisions of the Act were prescribed in Title 30, Code of Federal Regulations, Parts 70 and 71, which were published in the Federal Register on April 3, 1970, and March 28, 1972, respectively. These standards and the sampling requirements imposed on coal mine operators, along with a description of the laboratory established to process respirable coal mine dust samples collected in accordance with these requirements, were published in a MESA Informational Report (MESA, the acronym for the Mining Enforcement and Safety Administration, was changed to MSHA, the acronym for the Mine Safety and Health Administration, in 1977). These standards and regulatory requirements continued under the Federal Mine Safety and Health Act of 1977 until November 1980, when major regulatory revisions were made in the operator's dust sampling program. This paper describes the changes in the respirable coal mine dust sampling program and the equipment and procedures used by MSHA to process respirable coal mine dust samples collected in accordance with regulatory requirements. 10 figs., 1 tab.

  2. One GigaSample Per Second Data Acquisition using Available Gate Array Technology

    NASA Technical Reports Server (NTRS)

    Wagner, K.W.

    1999-01-01

    A new National Aeronautics and Space Administration instrument forced demanding requirements upon its altimeter digitizer system. Eight-bit data would be generated at a rate of one billion samples per second. NASA had never before attempted to capture such high-speed data in the radiation, low-power, no-convective-cooling, limited-board-area environment of space. This presentation describes how the gate array technology available at the time of the design was used to implement this one gigasample per second data acquisition system
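The record does not detail the digitizer architecture, but the standard way to capture a stream faster than the available logic is to demultiplex it into parallel lanes that each run at a fraction of the sample rate, buffering each lane independently. The toy sketch below shows the bookkeeping for an assumed 8-way split (lane count and rates are illustrative, not NASA's design):

```python
import numpy as np

LANES = 8  # e.g. split 1 GS/s into 8 lanes of 125 MS/s each (assumed)
samples = np.arange(32, dtype=np.uint8)    # stand-in for the 8-bit ADC stream
lanes = samples.reshape(-1, LANES).T       # lane k receives samples k, k+8, k+16, ...
restored = lanes.T.reshape(-1)             # re-interleaving recovers the original order
```

In hardware the same reshaping is done by clock-phased registers inside the gate array, with the downstream memory seeing eight slow parallel buses instead of one fast one.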

  3. Oral processing of two milk chocolate samples.

    PubMed

    Carvalho-da-Silva, Ana Margarida; Van Damme, Isabella; Taylor, Will; Hort, Joanne; Wolf, Bettina

    2013-02-26

    Oral processing of two milk chocolates, identical in composition and viscosity, was investigated to understand the textural behaviour. Previous studies had shown differences in mouthcoating and related attributes such as time of clearance from the oral cavity to be most discriminating between the samples. Properties of panellists' saliva, with regard to protein concentration and profile before and after eating the two chocolates, were included in the analysis but did not reveal any correlation with texture perception. The microstructure of the chocolate samples following oral processing, which resembled an emulsion as the chocolate phase inverts in-mouth, was clearly different and the sample that was found to be more mouthcoating appeared less flocculated after 20 chews. The differences in flocculation behaviour were mirrored in the volume based particle size distributions acquired with a laser diffraction particle size analyser. The less mouthcoating and more flocculated sample showed a clear bimodal size distribution with peaks at around 40 and 500 μm, for 10 and 20 chews, compared to a smaller and then diminishing second peak for the other sample following 10 and 20 chews, respectively. The corresponding mean particle diameters after 20 chews were 184 ± 23 and 141 ± 10 μm for the less and more mouthcoating samples, respectively. Also, more of the mouthcoating sample had melted after both 10 and 20 chews (80 ± 8% compared to 72 ± 10% for 20 chews). Finally, the friction behaviour between a soft and hard surface (elastopolymer/steel) and at in-mouth temperature was investigated using a commercial tribology attachment on a rotational rheometer. Complex material behaviour was revealed. Observations included an unusual increase in friction coefficient at very low sliding speeds, initially overlapping for both samples, to a threefold higher value for the more mouthcoating sample. This was followed by a commonly observed decrease in friction coefficient with

  4. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  5. 77 FR 2682 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-19

    ... Regulation Supplement; DoD Voucher Processing AGENCY: Defense Acquisition Regulations System, Department of Defense (DoD). ACTION: Proposed rule. SUMMARY: DoD is proposing to amend the Defense Federal Acquisition Regulation Supplement (DFARS) to update DoD's voucher processing procedures and better accommodate the use...

  6. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, John R.

    1997-01-01

    A method and apparatus for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register.
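The mask-bit scheme above works because of how IEEE-754 single precision is laid out: if the prestored upper bits encode the pattern for 2^23, OR-ing a small integer into the mantissa field yields a valid float equal to 2^23 plus that integer, so no conversion instruction is needed on the data path. A minimal sketch of the idea (the patent's exact mask values are not given, so a plain IEEE-754 bias of 2^23 is assumed here):

```python
import struct

MASK = 0x4B000000  # IEEE-754 bit pattern of 2^23 = 8388608.0 (sign 0, exponent 150)

def masked_word_to_float(sample14: int) -> float:
    """Concatenate prestored mask bits with a 14-bit sample so the combined
    32-bit word is already a valid float, as in the record's DMA scheme."""
    word = MASK | (sample14 & 0x3FFF)
    return struct.unpack("<f", struct.pack("<I", word))[0]

# the float value is 8388608.0 + sample, so subtracting the bias
# recovers the raw ADC count in floating point
```

The floating-point processor then only needs one subtraction per sample to remove the 8388608.0 bias, which is far cheaper than an integer-to-float conversion on many DSP pipelines.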

  7. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, J.R.

    1997-02-11

    A method and apparatus are disclosed for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register. 15 figs.

  8. Mechanical Alteration And Contamination Issues In Automated Subsurface Sample Acquisition And Handling

    NASA Astrophysics Data System (ADS)

    Glass, B. J.; Cannon, H.; Bonaccorsi, R.; Zacny, K.

    2006-12-01

The Drilling Automation for Mars Exploration (DAME) project's purpose is to develop and field-test drilling automation and robotics technologies for projected use in missions in the 2011-15 period. DAME includes control of the drilling hardware, and state estimation of the hardware, of the lithology being drilled, and of the hole. A sister drill was constructed for the Mars Analog Río Tinto Experiment (MARTE) project and demonstrated automated core handling and string changeout in 2005 drilling tests at Río Tinto, Spain. DAME focused instead on the problem of controlling the drill while actively drilling without getting stuck. Together, the DAME and MARTE projects demonstrate a fully automated robotic drilling capability, including hands-off drilling, adjustment to different strata and downhole conditions, recovery from drilling faults (binding, choking, etc.), drill string changeouts, core acquisition and removal, and sample handling and conveyance to in-situ instruments. The 2006 top-level goal of the DAME drilling in-situ tests was to verify and demonstrate a capability for hands-off automated drilling at an Arctic Mars-analog site. There were three sets of 2006 test goals, all of which were exceeded during the July 2006 field season. The first was to demonstrate the recognition, while drilling, of at least three of the six known major fault modes for the DAME planetary-prototype drill, and to employ the correct recovery or safing procedure in response. The second set of 2006 goals was to operate for three or more hours autonomously, hands-off. And the third 2006 goal was to exceed 3 m depth into the frozen breccia and permafrost with the DAME drill (it had not gone further than 2.2 m previously). Five of six faults were detected and corrected, there were 43 hours of hands-off drilling (including a 4-hour sequence with no human presence nearby), and 3.2 m was the total depth. And ground-truth drilling used small commercial drilling equipment in parallel in

  9. Nano-Scale Sample Acquisition Systems for Small Class Exploration Spacecraft

    NASA Astrophysics Data System (ADS)

    Paulsen, G.

    2015-12-01

The paradigm for space exploration is changing. Large and expensive missions are very rare, and the space community is turning to smaller, lighter, and less expensive missions that can still accomplish significant exploration. Such missions are also within reach of commercial efforts such as the Google Lunar X Prize teams that are developing small-scale lunar missions. Recent commercial endeavors such as Planet Labs, Inc. and Skybox Imaging, Inc. show that there are new benefits and business models associated with the miniaturization of space hardware. The Nano-Scale Sample Acquisition System includes the NanoDrill for capture of small rock cores and PlanetVac for capture of surface regolith. These two systems are part of the ongoing effort to develop "Micro Sampling" systems for deployment by small spacecraft with limited payload capacities. The ideal applications include prospecting missions to the Moon and asteroids. The NanoDrill is a rotary-percussive coring drill that captures cores 7 mm in diameter and up to 2 cm long. The drill weighs less than 1 kg and can capture a core from a 40 MPa strength rock within a few minutes, with less than 10 W of power and less than 10 N of preload. The PlanetVac is a pneumatic regolith acquisition system that can capture a surface sample in a touch-and-go maneuver. These sampling systems were integrated within the footpads of a commercial quadcopter for testing. As such, they could also be used by geologists on Earth to explore difficult-to-reach locations.

  10. Weapon Acquisition Program Outcomes and Efforts to Reform DOD’s Acquisition Process

    DTIC Science & Technology

    2016-05-09

    growth while in production. This represents concurrency, which can be caused by many factors, and is a contributor to cost growth. 9. As measured against...portfolio, were also in the 2005 portfolio and represent 80 percent of the portfolio’s current total acquisition cost or over $1.1 of the $1.4 trillion...continues a trend we have seen for the past decade. • When measured from first full estimates the total estimated cost of the portfolio has grown by over

  11. Facilities and the Air Force Systems Acquisition Process.

    DTIC Science & Technology

    1985-05-01

to provide essential facilities by system Initial Operational Capability (IOC). And secondly, since the systems acquisition process is event...funds exclusively for systems acquisition. This change will remove the current military construction calendar constraint and allow facilities to be

  12. DEVELOPMENT OF MARKETABLE TYPING SKILL--SENSORY PROCESSES UNDERLYING ACQUISITION.

    ERIC Educational Resources Information Center

    WEST, LEONARD J.

The project attempted to provide further data on the dominant hypothesis about the sensory mechanisms underlying skill acquisition in typewriting. In so doing, it proposed to furnish a basis for important correctives to such conventional instructional procedures as touch typing. Specifically, the hypothesis has been that kinesthesis is not…

  13. Developmental Stages in Receptive Grammar Acquisition: A Processability Theory Account

    ERIC Educational Resources Information Center

    Buyl, Aafke; Housen, Alex

    2015-01-01

    This study takes a new look at the topic of developmental stages in the second language (L2) acquisition of morphosyntax by analysing receptive learner data, a language mode that has hitherto received very little attention within this strand of research (for a recent and rare study, see Spinner, 2013). Looking at both the receptive and productive…

  14. Developmental Processes and Stages in the Acquisition of Cardinality.

    ERIC Educational Resources Information Center

    Bermejo, Vicente; Lago, M. Oliva

    1990-01-01

    Cardinality responses are affected by both the direction and nature of the elements in the counting sequence. Error analysis suggests six stages in the acquisition of cardinality. Although there appears to be a developmental dependency between counting and cardinality, this relationship is not significant in all cases. (RH)

  15. Processing Protocol for Soil Samples Potentially ...

    EPA Pesticide Factsheets

    Method Operating Procedures This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection at 14 spores/gram of soil for loamy and sandy soils.

  16. Inverse Process Analysis for the Acquisition of Thermophysical Data

    SciTech Connect

    Jay Frankel; Adrian Sabau

    2004-10-31

    One of the main barriers in the analysis and design of materials processing and industrial applications is the lack of accurate experimental data on the thermophysical properties of materials. To date, the measurement of most of these high-temperature thermophysical properties has often been plagued by temperature lags that are inherent in measurement techniques. These lags can be accounted for with the appropriate mathematical models, reflecting the experimental apparatus and sample region, in order to deduce the desired measurement as a function of true sample temperature. Differential scanning calorimeter (DSC) measurements are routinely used to determine enthalpies of phase change, phase transition temperatures, glass transition temperatures, and heat capacities. In the aluminum, steel, and metal casting industries, predicting the formation of defects such as shrinkage voids, microporosity, and macrosegregation is limited by the data available on fraction solid and density evolution during solidification. Dilatometer measurements are routinely used to determine the density of a sample at various temperatures. An accurate determination of the thermophysical properties of materials is needed to achieve accuracy in the numerical simulations used to improve or design new material processes. In most of the instruments used to measure properties, the temperature is changed according to instrument controllers and there is a nonhomogeneous temperature distribution within the instrument. Additionally, the sample temperature cannot be measured directly: temperature data are collected from a thermocouple that is placed at a different location than that of the sample, thus introducing a time lag. The goal of this project was to extend the utility, quality and accuracy of two types of commercial instruments -a DSC and a dilatometer - used for thermophysical property measurements in high-temperature environments. In particular, the quantification of solid fraction and

  17. Possible overlapping time frames of acquisition and consolidation phases in object memory processes: a pharmacological approach

    PubMed Central

    Akkerman, Sven; Blokland, Arjan

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when administered within 2 min after the acquisition trial. Likewise, both PDE5-I and PDE4-I reversed the scopolamine deficit model when administered within 2 min after the learning trial. PDE5-I was effective up to 45 min after the acquisition trial and PDE4-I was effective when administered between 3 and 5.5 h after the acquisition trial. Taken together, our study suggests that acetylcholine, cGMP, and cAMP are all involved in acquisition processes and that cGMP and cAMP are also involved in early and late consolidation processes, respectively. Most important, these pharmacological studies suggest that acquisition processes continue for some time after the learning trial where they share a short common time frame with early consolidation processes. Additional brain concentration measurements of the drugs suggest that these acquisition processes can continue up to 4–6 min after learning. PMID:26670184

  18. Science Process Skill as a Predictor of Acquisition of Knowledge Among Preservice Teachers.

    ERIC Educational Resources Information Center

    Flehinger, Lenore Edith

    This study discusses the relationships between the level of science process skills and the degree of acquisition of new science knowledge. Participants included 257 preservice teachers enrolled in an elementary science methods course. A test, Test of Oceanographic Knowledge, was designed and used to define the level of knowledge acquisition. Level…

  19. Computer-Aided Process and Tools for Mobile Software Acquisition

    DTIC Science & Technology

    2013-04-01

    file-based runtime verification. A case study of formally specifying, validating, and verifying a set of requirements for an iPhone application that...Center for Strategic and International Studies The Making of a DoD Acquisition Lead System Integrator (LSI) Paul Montgomery, Ron Carlson, and John...code against the execution trace of the mobile apps using log file-based runtime verification. A case study of formally specifying, validating, and

  20. The Use of Small Business Administration Section 8(a) Contractors in Automatic Data Processing Acquisitions.

    DTIC Science & Technology

    2007-11-02

    OFFICE OF THE INSPECTOR GENERAL THE USE OF SMALL BUSINESS ADMINISTRATION SECTION 8(a) CONTRACTORS IN AUTOMATIC DATA PROCESSING ACQUISITIONS...CENTER Department of Defense WASHINGTON D.C. 20301-7100 The following acronyms are used in this report. ADP Automatic Data Processing...Of The Inspector General: The Use Of Small Business Administration Section 8(a) Contractors in Automatic Data Processing Acquisitions Corporate

  1. Short pulse acquisition by low sampling rate with phase-coded sequence in lidar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Xu, Jiajia; Lv, Wentao; Yang, Xiaocheng

    2016-11-01

    The requirement of high range resolution makes it impractical to collect every returned laser pulse, owing to the limited response speed of imaging detectors. This paper proposes a phase-coded sequence acquisition method for signal preprocessing. For demonstration, the system employs an N-bit m-sequence, with the detector controlled to accumulate N+1 bits of the echo signal in order to deduce a single returned laser pulse. An indoor experiment achieved 2 μs resolution with a sampling period of 28 μs by employing a 15-bit m-sequence. This method shows the potential to improve the detection of narrow laser pulses with detectors at a low frame rate, especially for imaging lidar systems. It also allows a lidar system to improve its range resolution with available detectors of restricted performance.
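
    The decoding in such a scheme rests on the sharp circular autocorrelation of an m-sequence: a peak of N at zero lag and -1 at every other lag, which is what lets a single pulse position be deduced from the accumulated echoes. Below is a minimal sketch of generating the 15-bit sequence used in the demonstration and checking that property; this is an illustrative reconstruction, and all names are ours, not the authors' code.

```python
def lfsr_msequence(n_bits, taps):
    """Generate one period of a maximal-length sequence with a Fibonacci LFSR."""
    state = [1] * n_bits                  # any nonzero seed works
    seq = []
    for _ in range(2 ** n_bits - 1):      # period of an m-sequence: 2^n - 1
        out = state[-1]
        seq.append(out)
        fb = 0
        for t in taps:                    # taps given as 1-based stage numbers
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return seq

def circular_autocorrelation(seq):
    """Circular autocorrelation of the bipolar (+/-1) version of seq."""
    b = [2 * s - 1 for s in seq]
    n = len(b)
    return [sum(b[i] * b[(i + k) % n] for i in range(n)) for k in range(n)]

m = lfsr_msequence(4, [4, 3])             # 15-bit m-sequence, as in the experiment
ac = circular_autocorrelation(m)          # ac[0] == 15, every other lag == -1
```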

  2. Software interface and data acquisition package for the LakeShore cryogenics vibrating sample magnetometer

    SciTech Connect

    O'Dell, B.H.

    1995-11-01

    A software package was developed to replace the software provided by LakeShore for their model 7300 vibrating sample magnetometer (VSM). Several problems with the original software's functionality led this group to seek a new software package. The new software provides many features that were unsupported in the LakeShore software, including a more functional step mode, a point-averaging mode, vector moment measurements, and calibration for field offset. The developed software drives the VSM through a menu-driven graphical user interface and bypasses the VSM's onboard processor, leaving control of the VSM entirely to the software. The source code for this software is readily available to anyone. By having the source, the experimentalist has full control of data acquisition and can add routines specific to their experiment.

  3. Data acquisition and processing platform in the real-time distance measurement system with dual-comb lasers

    NASA Astrophysics Data System (ADS)

    Ni, Kai; Wang, Lanlan; Zhou, Qian; Li, Xinghui; Dong, Hao; Wang, Xiaohao

    2016-11-01

    The real-time distance measurement system with dual femtosecond comb lasers combines time-of-flight and interferometric measurement. It offers wide range, high accuracy, and fast speed, at a rate of about 10,000 pts/s. Such a distance measurement system requires a dedicated high-performance data acquisition and processing hardware platform. This paper introduces the dedicated platform of the developed absolute distance measurement system. The platform is divided into three parts according to function. The first part is the data acquisition module, whose main function is A/D conversion. In this part we designed a sampling clock adjustment module to help the A/D converter sample accurately. The sampling clock adjustment module accepts a reference clock input of up to 250 MHz, derived from the same femtosecond laser source as the optical measurement system, and generates an output clock for the A/D converter that can be delayed by up to 20 ns with a resolution of 714 ps. The data acquisition module converts the analog laser pulse signal to a digital signal with 14-bit resolution at a maximum sample rate of 250 MSPS. The second part is the data processing and storage module, consisting of FPGA and DDR3 modules. The FPGA module calculates the measured distance from the 16-bit digital sampling signal provided by the data acquisition module, and the DDR3 module implements sampling-data caching. The final part is the data transmission and peripheral interface module, based on three DB9 ports and USB 2.0, which allows the platform to be debugged from a PC and to communicate with a host machine. We tested the system in real time on a dedicated test bench. The measurement range is 0 to 3 meters and the measurement deviation is less than 10 μm.

  4. The acquisition process of musical tonal schema: implications from connectionist modeling

    PubMed Central

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical ‘scale’ sensitivity early and ‘harmony’ sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music. PMID:26441725

  5. An Analysis of the Support Equipment Acquisition Process and Methods Designed to Reduce Acquisition Leadtime

    DTIC Science & Technology

    1991-09-01

    AFIT/GLM/LSY/91S-68 AN ANALYSIS OF THE SUPPORT EQUIPMENT...a specific function or purpose (21:10). Contractor Furnished Equipment (CFE). Items acquired or manufactured directly by the contractor for use in the...process, or output: 1) inputs are those items or resources used by the system which allow it to function; 2) processes or transforms inputs into outputs

  6. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... selection process for procurements not to exceed the simplified acquisition threshold. References to FAR 36... 48 Federal Acquisition Regulations System 5 (2010-10-01): Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section...

  7. Automated collection and processing of environmental samples

    DOEpatents

    Troyer, Gary L.; McNeece, Susan G.; Brayton, Darryl D.; Panesar, Amardip K.

    1997-01-01

    For monitoring an environmental parameter such as the level of nuclear radiation, at distributed sites, bar coded sample collectors are deployed and their codes are read using a portable data entry unit that also records the time of deployment. The time and collector identity are cross referenced in memory in the portable unit. Similarly, when later recovering the collector for testing, the code is again read and the time of collection is stored as indexed to the sample collector, or to a further bar code, for example as provided on a container for the sample. The identity of the operator can also be encoded and stored. After deploying and/or recovering the sample collectors, the data is transmitted to a base processor. The samples are tested, preferably using a test unit coupled to the base processor, and again the time is recorded. The base processor computes the level of radiation at the site during exposure of the sample collector, using the detected radiation level of the sample, the delay between recovery and testing, the duration of exposure and the half life of the isotopes collected. In one embodiment, an identity code and a site code are optically read by an image grabber coupled to the portable data entry unit.
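
    The half-life arithmetic described in this record (back-computing the radiation level from the measured sample activity and the recovery-to-testing delay) reduces to a one-line decay correction. The sketch below is illustrative only; the function name and interface are ours, not the patent's.

```python
def rate_at_recovery(measured_rate, delay, half_life):
    # Back-correct a measured count rate for radioactive decay that
    # occurred between collector recovery and laboratory testing.
    # `delay` and `half_life` must be expressed in the same time units.
    return measured_rate * 2.0 ** (delay / half_life)

# Example: a sample tested one half-life after recovery had twice the
# measured count rate at the moment of recovery.
corrected = rate_at_recovery(50.0, 8.0, 8.0)  # -> 100.0
```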

  8. Data-Driven Sampling Matrix Boolean Optimization for Energy-Efficient Biomedical Signal Acquisition by Compressive Sensing.

    PubMed

    Wang, Yuhao; Li, Xin; Xu, Kai; Ren, Fengbo; Yu, Hao

    2017-04-01

    Compressive sensing is widely used in biomedical applications, and the sampling matrix plays a critical role on both quality and power consumption of signal acquisition. It projects a high-dimensional vector of data into a low-dimensional subspace by matrix-vector multiplication. An optimal sampling matrix can ensure accurate data reconstruction and/or high compression ratio. Most existing optimization methods can only produce real-valued embedding matrices that result in large energy consumption during data acquisition. In this paper, we propose an efficient method that finds an optimal Boolean sampling matrix in order to reduce the energy consumption. Compared to random Boolean embedding, our data-driven Boolean sampling matrix can improve the image recovery quality by 9 dB. Moreover, in terms of sampling hardware complexity, it reduces the energy consumption by 4.6× and the silicon area by 1.9× over the data-driven real-valued embedding.
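
    The energy argument in this record is easy to see in code: a Boolean sampling matrix turns each compressive measurement into a plain sum of selected samples (adders only, no multipliers). The sketch below uses a random Boolean matrix, i.e. the baseline the authors improve upon; all names are illustrative.

```python
import random

def boolean_measure(B, x):
    # Each measurement is the sum of the samples selected by a 0/1 row:
    # no multiplications are needed, which is the source of the energy
    # saving relative to a real-valued sampling matrix.
    return [sum(xj for bij, xj in zip(row, x) if bij) for row in B]

random.seed(0)
n, m = 16, 6                              # signal length, number of measurements
B = [[random.randint(0, 1) for _ in range(n)] for _ in range(m)]  # random Boolean matrix
x = [random.random() for _ in range(n)]   # stand-in biomedical signal
y = boolean_measure(B, x)                 # compressed measurement vector, length m
```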

  9. Learning (Not) to Predict: Grammatical Gender Processing in Second Language Acquisition

    ERIC Educational Resources Information Center

    Hopp, Holger

    2016-01-01

    In two experiments, this article investigates the predictive processing of gender agreement in adult second language (L2) acquisition. We test (1) whether instruction on lexical gender can lead to target predictive agreement processing and (2) how variability in lexical gender representations moderates L2 gender agreement processing. In a…

  10. System safety management lessons learned from the US Army acquisition process

    SciTech Connect

    Piatt, J.A.

    1989-05-01

    The Assistant Secretary of the Army for Research, Development and Acquisition directed the Army Safety Center to provide an audit of the causes of accidents and safety-of-use restrictions on recently fielded systems by tracking residual hazards back through the acquisition process. The objective was to develop "lessons learned" that could be applied to the acquisition process to minimize mishaps in fielded systems. System safety management lessons learned are defined as Army practices or policies, derived from past successes and failures, that are expected to be effective in eliminating or reducing specific systemic causes of residual hazards. They are broadly applicable and supportive of the Army structure and acquisition objectives. Pacific Northwest Laboratory (PNL) was given the task of conducting an independent, objective appraisal of the Army's system safety program in the context of the Army materiel acquisition process by focusing on four fielded systems which are products of that process. These systems included the Apache helicopter, the Bradley Fighting Vehicle (BFV), the Tube Launched, Optically Tracked, Wire Guided (TOW) Missile and the High Mobility Multipurpose Wheeled Vehicle (HMMWV). The objective of this study was to develop system safety management lessons learned associated with the acquisition process. The first step was to identify residual hazards associated with the selected systems. Since it was impossible to track all residual hazards through the acquisition process, certain well-known, high-visibility hazards were selected for detailed tracking. These residual hazards illustrate a variety of systemic problems. Systemic or process causes were identified for each residual hazard and analyzed to determine why they exist. System safety management lessons learned were developed to address related systemic causal factors. 29 refs., 5 figs.

  11. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis, and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process - acquisition, processing, and analysis - is handled by a separate component, so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.

  12. Evaluation of the Pre-Milestone I Acquisition Logistics Process at the Aeronautical Systems Center

    DTIC Science & Technology

    1994-09-01

    schedule, and performance was developed to reduce risks and assure specified performances. All acquisition programs are based on identified needs...Engineering and Manufacturing Development: emphasizes risk management; a promising approach is translated into a stable, predictable, cost-effective design...shortcomings or cost overruns [31:7-8]. The acquisition process is an incremental development commitment phased so that the associated risk is continually

  13. An Information Processing Approach to Skill Acquisition: Perception and Timing.

    ERIC Educational Resources Information Center

    Rothstein, Anne L.

    In order to understand learners and players in relation to environments typically found in sport, it is necessary to first understand the individual as an information processor who must sample information from the environment, interpret it, organize or select an appropriate motor response, and execute that response. One of the most difficult…

  14. High-throughput data acquisition and processing for real-time x-ray imaging

    NASA Astrophysics Data System (ADS)

    Vogelgesang, Matthias; Rota, Lorenzo; Ardila Perez, Luis E.; Caselle, Michele; Chilingaryan, Suren; Kopmann, Andreas

    2016-10-01

    With ever-increasing data rates due to stronger light sources and better detectors, X-ray imaging experiments conducted at synchrotron beamlines face bandwidth and processing limitations that inhibit efficient workflows and prevent real-time operation. We propose an experiment platform comprised of programmable hardware and optimized software to lift these limitations and make beamline setups future-proof. The hardware consists of an FPGA-based data acquisition system with custom logic for data pre-processing and a PCIe data connection for transmission of currently up to 6.6 GB/s. Moreover, the accompanying firmware supports pushing data directly into GPU memory using AMD's DirectGMA technology without crossing system memory first. The GPUs are used to pre-process projection data and reconstruct final volumetric data with OpenCL faster than is possible with CPUs alone. Besides more efficient use of resources, this enables a real-time preview of a reconstruction for early quality assessment of both the experiment setup and the investigated sample. The entire system is designed in a modular way that allows swapping of all components, e.g. replacing our custom FPGA camera with a commercial system while still reconstructing data with GPUs. Finally, every component is accessible through a low-level C library or a high-level Python interface, in order to integrate these components into any legacy environment.

  15. The Effects of Processing Instruction and Its Components on the Acquisition of Gender Agreement in Italian

    ERIC Educational Resources Information Center

    Benati, Alessandro

    2004-01-01

    This paper reports an experimental investigation of the relative effects of processing instruction, structured input activities and explicit information on the acquisition of gender agreement in Italian adjectives. Subjects were divided into three groups: the first received processing instruction; the second group structured input only; the third…

  16. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  17. Acquisition of Basic Science Process Skills among Malaysian Upper Primary Students

    ERIC Educational Resources Information Center

    Ong, Eng Tek; Ramiah, Puspa; Ruthven, Kenneth; Salleh, Sabri Mohd; Yusuff, Nik Azmah Nik; Mokhsein, Siti Eshah

    2015-01-01

    This study aims to determine whether there are significant differences in the acquisition of basic science process skills by gender, school location and by grade levels among upper primary school students. Using an established 36-item Basic Science Process Skills test that assesses the skills of observing, communicating, classifying, measuring,…

  18. Executive and Phonological Processes in Second-Language Acquisition

    ERIC Educational Resources Information Center

    Engel de Abreu, Pascale M. J.; Gathercole, Susan E.

    2012-01-01

    This article reports a latent variable study exploring the specific links among executive processes of working memory, phonological short-term memory, phonological awareness, and proficiency in first (L1), second (L2), and third (L3) languages in 8- to 9-year-olds experiencing multilingual education. Children completed multiple L1-measures of…

  19. 76 FR 68037 - Federal Acquisition Regulation; Sudan Waiver Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... Regulation; Sudan Waiver Process AGENCIES: Department of Defense (DoD), General Services Administration (GSA... prohibition on contracting with entities that conduct restricted business operations in Sudan. This rule adds... that conducts restricted business operations in Sudan. The rule also describes the consultation...

  20. Accelerating COTS Middleware Acquisition: The i-Mate Process

    SciTech Connect

    Liu, Anna; Gorton, Ian

    2003-03-05

    Most major organizations now use some commercial-off-the-shelf middleware components to run their businesses. Key drivers behind this growth include ever-increasing Internet usage and the ongoing need to integrate heterogeneous legacy systems to streamline business processes. As organizations do more business online, they need scalable, high-performance software infrastructures to handle transactions and provide access to core systems.

  1. Human Processing of Knowledge from Texts: Acquisition, Integration, and Reasoning

    DTIC Science & Technology

    1979-06-01

    or partial ordering of all constituent elements (Barclay, 1973; Foos, Smith, Sabol, & Mynatt, 1976; Hayes-Roth & Hayes-Roth, 1975; Potts, 1972, 1977...variables. Language and Speech, 1966, 217-227. Foos, P. W., Smith, K. H., Sabol, M. A., and Mynatt, B. T. Constructive processes in simple linear-order

  2. Are Changes in Acquisition Policy and Process and in Funding Climate Associated With Cost Growth?

    DTIC Science & Technology

    2015-04-30

    Annual Acquisition Research Symposium, Wednesday Sessions, Volume I: Are Changes in Acquisition Policy and Process and in Funding Climate ...acquisition policy and process. Changes in funding climate, however, are found to have a large influence on PAUC growth. These findings have three...Informed Change - 11 - Are Changes in Acquisition Policy and Process and in Funding Climate Associated With Cost Growth? David McNicol—joined the DoD

  3. Evidence on the Effect of DoD Acquisition Policy and Process and Funding Climate on Cancellations of Major Defense Acquisitions Programs

    DTIC Science & Technology

    2015-05-01

    growth and changes over time in acquisition policies and processes, after taking funding climate into account. McN-W (2014) found that there is not a...reasonable summary conclusion is that neither acquisition policy and process changes nor changes in funding climate have had much, if any, effect on... changes in funding climate are strongly associated with cancellations. They arguably have been crucial. DoD force structure, the capabilities that

  4. Acquisition times of carrier tracking sampled data phase-locked loops

    NASA Technical Reports Server (NTRS)

    Aguirre, S.

    1986-01-01

    Phase acquisition times of the type II and III loops typical of the Advanced Receiver are studied by computer simulation when the loops are disturbed by Gaussian noise. Reliable estimates are obtained by running 5000 trials for each combination of loop signal-to-noise ratio (SNR) and frequency offset. The probabilities of acquisition are shown versus time from the start of acquisition for various loop SNRs and frequency offsets. For frequency offsets smaller than one-fourth of the loop bandwidth and for loop SNRs of 10 dB and higher, the loops acquire with probability 0.99 within 2.5/B_L for type II loops and within 7/B_L for type III loops.
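
    The Monte Carlo procedure in this record (many independent noisy trials per operating point, counting the fraction that acquire) can be illustrated with a toy first-order sampled loop. This cartoon is not a model of the type II/III Advanced Receiver loops; every name and parameter below is our own assumption.

```python
import math
import random

def acquisition_fraction(n_trials, n_steps, gain, noise_std, phase0, seed=1):
    # Fraction of independent trials in which a toy first-order sampled
    # phase loop (sinusoidal phase detector plus additive Gaussian phase
    # noise) ends within 0.25 rad of lock after n_steps updates.
    rng = random.Random(seed)
    acquired = 0
    for _ in range(n_trials):
        phi = phase0
        for _ in range(n_steps):
            phi -= gain * math.sin(phi)       # loop correction toward lock
            phi += rng.gauss(0.0, noise_std)  # phase noise per sample
        if abs(phi) < 0.25:
            acquired += 1
    return acquired / n_trials

# One operating point: 1000 trials, initial phase error of 2 rad
frac = acquisition_fraction(1000, 200, 0.2, 0.05, 2.0)
```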

  5. An Acquisition Guide for Executives

    EPA Pesticide Factsheets

    This guide covers the following subjects: What is Acquisition?; Purpose and Primary Functions of the Agency’s Acquisition System; Key Organizations in Acquisitions; Legal Framework; Key Players in Acquisitions; Acquisition Process; and Acquisition Thresholds.

  6. Effect of Intermittent Reinforcement on Acquisition and Retention in Delayed Matching-to-Sample in Pigeons

    ERIC Educational Resources Information Center

    Grant, Douglas S.

    2011-01-01

    Experiments 1 and 2 involved independent groups that received primary reinforcement after a correct match with a probability of 1.0, 0.50 or 0.25. Correct matches that did not produce primary reinforcement produced a conditioned reinforcer. Both experiments revealed little evidence that acquisition or retention was adversely affected by use of…

  7. Characteristics of Marijuana Acquisition among a National Sample of Adolescent Users

    ERIC Educational Resources Information Center

    King, Keith A.; Merianos, Ashley L.; Vidourek, Rebecca A.

    2016-01-01

    Background: Because marijuana is becoming more accessible and perceived norms of use are becoming increasingly more favorable, research is needed to understand characteristics of marijuana acquisition among adolescents. Purpose: The study purpose was to examine whether sources and locations where adolescent users obtain and use marijuana differed…

  8. Missing Sample Recovery for Wireless Inertial Sensor-Based Human Movement Acquisition.

    PubMed

    Kim, Kyoung Jae; Agrawal, Vibhor; Gaunaurd, Ignacio; Gailey, Robert S; Bennett, Christopher L

    2016-11-01

    This paper presents a novel, practical, and effective routine to reconstruct missing samples from a time-domain sequence of wirelessly transmitted IMU data during high-level mobility activities. Our work extends previous approaches involving empirical mode decomposition (EMD)-based and auto-regressive (AR) model-based interpolation algorithms in two aspects: 1) we utilized a modified sifting process for signal decomposition into a set of intrinsic mode functions with missing samples, and 2) we expanded previous AR modeling for recovery of audio signals to exploit the quasi-periodic characteristics of lower-limb movement during the modified Edgren side step test. To verify the improvements provided by the proposed extensions, a comparison study with traditional interpolation methods, such as cubic spline interpolation, AR model-based interpolation, and EMD-based interpolation, is also made via simulation with real inertial signals recorded during high-speed movement. The evaluation was based on two performance criteria: the Euclidean distance and the Pearson correlation coefficient between the original signal and the reconstructed signal. The experimental results show that the proposed method improves upon traditional interpolation methods used in recovering missing samples.
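
    As a point of reference for the kind of comparison study this record describes, the simplest traditional baseline (filling a gap from its nearest known neighbours) fits in a few lines. This is plain linear interpolation, far cruder than the cubic spline, AR, and EMD methods the paper evaluates; the names are illustrative, and interior gaps are assumed (the first and last samples must be present).

```python
def interpolate_gaps(sig):
    # Fill runs of None by linear interpolation between the nearest
    # known samples on each side. Assumes sig[0] and sig[-1] are known.
    out = list(sig)
    known = [i for i, v in enumerate(out) if v is not None]
    for i, v in enumerate(out):
        if v is None:
            left = max(k for k in known if k < i)
            right = min(k for k in known if k > i)
            t = (i - left) / (right - left)
            out[i] = out[left] * (1 - t) + out[right] * t
    return out

filled = interpolate_gaps([0.0, None, None, 3.0])  # -> [0.0, 1.0, 2.0, 3.0]
```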

  9. Quantitative modal determination of geological samples based on X-ray multielemental map acquisition.

    PubMed

    Cossio, Roberto; Borghi, Alessandro; Ruffini, Raffaella

    2002-04-01

    Multielemental X-ray maps collected by a remote scanning system of the electron beam are processed by a dedicated software program performing accurate modal determination of geological samples. The classification of different mineral phases is based on elemental concentrations. The software program Petromod loads the maps into a database and computes a matrix consisting of numerical values proportional to the elemental concentrations. After an initial calibration, the program can perform the chemical composition calculated on the basis of a fixed number of oxygens for a selected area. In this way, it is possible to identify all the mineral phases occurring in the sample. Up to three elements can be selected to calculate the modal percentage of the identified mineral. An automated routine scans the whole set of maps and assigns each pixel that satisfies the imposed requirements to the selected phase. Repeating this procedure for every mineral phase occurring in the mapped area, a modal distribution of the rock-forming minerals can be performed. The final output consists of a digitized image, which can be further analyzed by common image analysis software, and a table containing the calculated modal percentages. The method is here applied to a volcanic and a metamorphic rock sample.
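
    The pixel-classification step described in this record (up to three elemental windows per phase, each matching pixel assigned to a phase, modal percentages from the pixel counts) can be sketched as follows. The rule format and all names are our own illustration of the idea behind Petromod, not its actual interface.

```python
def modal_percentages(maps, phase_rules):
    # maps: element symbol -> 2-D grid of X-ray counts (one map per element)
    # phase_rules: phase name -> list of (element, lo, hi) windows; a pixel
    # is assigned to the first phase whose windows it all satisfies (up to
    # three elements per phase, as in the method described above).
    rows = len(next(iter(maps.values())))
    cols = len(next(iter(maps.values()))[0])
    counts = {phase: 0 for phase in phase_rules}
    for r in range(rows):
        for c in range(cols):
            for phase, rules in phase_rules.items():
                if all(lo <= maps[el][r][c] <= hi for el, lo, hi in rules):
                    counts[phase] += 1
                    break
    total = rows * cols
    return {phase: 100.0 * n / total for phase, n in counts.items()}

# Toy 2x2 example with hypothetical concentration windows for two phases
maps = {"Si": [[50, 50], [50, 10]], "Ca": [[0, 40], [5, 40]]}
rules = {"quartz": [("Si", 40, 60), ("Ca", 0, 10)],
         "calcite": [("Ca", 30, 50)]}
pct = modal_percentages(maps, rules)  # -> {'quartz': 50.0, 'calcite': 50.0}
```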

  10. Learning and Individual Differences: An Ability/Information-Processing Framework for Skill Acquisition. Final Report.

    ERIC Educational Resources Information Center

    Ackerman, Phillip L.

    A program of theoretical and empirical research focusing on the ability determinants of individual differences in skill acquisition is reviewed. An integrative framework for information-processing and cognitive ability determinants of skills is reviewed, along with principles for ability-skill relations. Experimental manipulations were used to…

  11. Individual Variation in Infant Speech Processing: Implications for Language Acquisition Theories

    ERIC Educational Resources Information Center

    Cristia, Alejandrina

    2009-01-01

    To what extent does language acquisition recruit domain-general processing mechanisms? In this dissertation, evidence concerning this question is garnered from the study of individual differences in infant speech perception and their predictive value with respect to language development in early childhood. In the first experiment, variation in the…

  12. High resolution x-ray medical sequential image acquisition and processing system based on PCI interface

    NASA Astrophysics Data System (ADS)

    Lu, Dongming; Chen, Qian; Gu, Guohua

    2003-11-01

    In the field of medical application, it is of great importance to adopt digital image processing technique. Based on the characteristics of medical image, we introduced the digital image processing method to the X-ray imaging system, and developed a high resolution x-ray medical sequential image acquisition and processing system that employs image enhancer and CCD. This system consists of three basic modules, namely sequential image acquisition, data transfer and system control, and image processing. Under the control of FPGA (Field Programmable Gate Array), images acquired by the front-end circuit are transmitted to a PC through high speed PCI bus, and then optimized by the image processing program. The software kits, which include PCI Device Driver and Image Processing Package, are developed with Visual C++ Language based on Windows OS. In this paper, we present a general introduction to the principle and the operating procedure of X-ray Sequential Image Acquisition and Processing System, with special emphasis on the key issues of the hardware design. In addition, the context, principle, status quo and the digitizing trend of X-ray Imaging are explained succinctly. Finally, the preliminary experimental results are shown to demonstrate that the system is capable of achieving high quality X-ray sequential images.

  13. Optimized list-mode acquisition and data processing procedures for ACS2 based PET systems.

    PubMed

    Langner, Jens; Bühler, Paul; Just, Uwe; Pötzsch, Christian; Will, Edmund; van den Hoff, Jörg

    2006-01-01

    PET systems using the acquisition control system version 2 (ACS2), e.g. the ECAT Exact HR PET scanner series, offer a rather restricted list-mode functionality. For instance, typical transfers of acquisition data consume a considerable amount of time. This represents a severe obstacle to the utilization of potential advantages of list-mode acquisition. In our study, we have developed hardware and software solutions which do not only allow for the integration of list-mode into routine procedures, but also improve the overall runtime stability of the system. We show that our methods are able to speed up the transfer of the acquired data to the image reconstruction and processing workstations by a factor of up to 140. We discuss how this improvement allows for the integration of list-mode-based post-processing methods such as an event-driven movement correction into the data processing environment, and how list-mode is able to improve the overall flexibility of PET investigations in general. Furthermore, we show that our methods are also attractive for conventional histogram-mode acquisition, due to the improved stability of the ACS2 system.

  14. Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process

    NASA Technical Reports Server (NTRS)

    Macko, Joseph A., Jr.

    2000-01-01

    The United States Army Materiel Command, Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. It has established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources, and prioritizes pollution prevention (P2) efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as alternatives to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) at the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process used to evaluate new chemicals and materials prior to acceptance as alternatives. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC missions.

  15. SPARTACUS - A new system of data acquisition and processing for ultrasonic examination

    NASA Astrophysics Data System (ADS)

    Benoist, Ph.; Cartier, F.; Chapius, N.; Pincemaille, G.

    SPARTACUS, a novel data acquisition and processing system for ultrasonic examination, was developed to overcome a limitation of conventional equipment: techniques for characterization, sizing, or improving the signal-to-noise ratio (SNR) by means of information processing could not be employed, because the complete form of the HF signal, and hence its frequency content, was not accessible. In acquisition mode, SPARTACUS records all waveforms continuously in digital form, at a rate compatible with industrial requirements. In processing mode, SPARTACUS offers extensive processing and imaging capabilities, making it possible to devise an analytical method adapted to a specific problem, so that the industrial operator has a tool capable of automating diagnosis in complex testing situations.

  16. Modular Automated Processing System (MAPS) for analysis of biological samples.

    SciTech Connect

    Gil, Geun-Cheol; Chirica, Gabriela S.; Fruetel, Julia A.; VanderNoot, Victoria A.; Branda, Steven S.; Schoeniger, Joseph S.; Throckmorton, Daniel J.; Brennan, James S.; Renzi, Ronald F.

    2010-10-01

    We have developed a novel modular automated processing system (MAPS) that enables reliable, high-throughput analysis as well as sample-customized processing. The system comprises a set of independent modules that carry out individual sample processing functions: cell lysis, protein concentration (based on hydrophobic, ion-exchange, and affinity interactions), interferent depletion, buffer exchange, and enzymatic digestion of proteins of interest. Taking advantage of its unique capacity for enclosed processing of intact bioparticulates (viruses, spores) and complex serum samples, we have used MAPS for analysis of BSL1 and BSL2 samples to identify specific protein markers through integration with the portable microChemLab™ and MALDI.

  17. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses, again after their first semester of instruction, and again after their second, for a total of 10 months of instruction. The study aimed to characterize the transition from modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps, right-hemispheric recruitment was observed, with increasing left-lateralization, a pattern similar to that of native signers and L2 learners of spoken language; however, specialization for sign language processing, with activation in the inferior parietal lobule (i.e., angular gyrus), was observed even for late learners. As such, the present study is the first to track L2 acquisition in sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing.

  18. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-05-01

    Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes. Charles K. Pickar, Naval Postgraduate School; Raymond D... Report date: May 2015. ...Information Technology and Business Process Redesign | MIT Sloan Management Review. Retrieved from http://sloanreview.mit.edu

  19. A high speed data acquisition and processing system for real time data analysis and control

    NASA Astrophysics Data System (ADS)

    Ferron, J. R.

    1992-11-01

    A high-speed data acquisition system that is closely coupled with a high-speed digital processor is described. Data acquisition at a rate of 40 million 14-bit data values per second is possible simultaneously with data processing at a rate of 80 million floating-point operations per second. This is achieved by coupling a commercially available VME-format single-board computer based on the Intel i860 microprocessor with a custom-designed first-in, first-out (FIFO) memory circuit that transfers data at high speed to the processor board memory. Parallel processing to achieve increased computation speed is easily implemented because the data can be transferred simultaneously to multiple processor boards. Possible applications include high-speed process control and real-time data reduction. A specific example is described in which this hardware is used to implement a feedback control system for 18 parameters that uses 100 input signals and achieves a 100 μs cycle time.
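    As a rough software analogy for the acquire-then-compute cycle described above (the FIFO contents, block size, and proportional gains below are invented for illustration, not taken from the paper):

```python
from collections import deque

# Toy model of a FIFO-coupled feedback cycle: the digitizer side appends
# samples to a FIFO; the processor side drains a block per control cycle
# and applies a simple proportional control law. All values are made up.

def acquire_block(fifo, n):
    """Pop up to n samples from the FIFO in arrival order."""
    return [fifo.popleft() for _ in range(min(n, len(fifo)))]

def control_cycle(fifo, gains, block=4):
    """One cycle: read a block of input signals, return actuator commands."""
    inputs = acquire_block(fifo, block)
    return [g * x for g, x in zip(gains, inputs)]

fifo = deque([1.0, 2.0, 3.0, 4.0, 5.0])   # samples written by the digitizer
commands = control_cycle(fifo, gains=[0.5, 0.5, 0.5, 0.5])
print(commands)    # [0.5, 1.0, 1.5, 2.0]
print(len(fifo))   # 1 sample left for the next cycle
```

    In the real system the FIFO is a hardware circuit feeding i860 processor boards; here it only illustrates how acquisition is decoupled from processing.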

  20. Development of a safeguards data acquisition system for the process monitoring of a simulated reprocessing facility

    SciTech Connect

    Wachter, J.W.

    1986-01-01

    As part of the Consolidated Fuel Reprocessing Program of the Fuel Recycle Division at the Oak Ridge National Laboratory (ORNL), an Integrated Process Demonstration (IPD) facility has been constructed for development of reprocessing plant technology. Through the use of cold materials, the IPD facility provides for the integrated operation of the major equipment items of the chemical-processing portion of a nuclear fuel reprocessing plant. The equipment, processes, and the extensive use of computers in data acquisition and control are prototypical of future reprocessing facilities and provide a unique test-bed for nuclear safeguards demonstrations. The data acquisition and control system consists of several microprocessors that communicate with one another and with a host minicomputer over a common data highway. At intervals of a few minutes, a "snapshot" is taken of the process variables, and the data are transmitted to a safeguards computer and minicomputer work station for analysis. This paper describes this data acquisition system and the data-handling procedures leading to microscopic process monitoring for safeguards purposes.

  1. Multibeam Sonar Backscatter Data Acquisition and Processing: Guidelines and Recommendations from the GEOHAB Backscatter Working Group

    NASA Astrophysics Data System (ADS)

    Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.

    2015-12-01

    Backscatter data acquired with multibeam sonars are now commonly used for the remote geological interpretation of the seabed. The systems' hardware and software and the associated processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, calibration is poor or absent, and processing methods are incompletely understood and documented. A workshop organized at the 2013 annual meeting of GeoHab (a community of geoscientists and biologists around the topic of marine habitat mapping) was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the topics of acquisition, processing, and interpretation of data. The GeoHab Backscatter Working Group (BSWG) was subsequently created with the purpose of documenting and synthesizing the state of the art in sensors and techniques available today and proposing methods for best practice in the acquisition and processing of backscatter data. Two years later, the resulting document "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations" was completed [1]. The document provides: an introduction to backscatter measurements by seafloor-mapping sonars; a background on the physical principles of sonar backscatter; a discussion of users' needs from a wide spectrum of community end-users; a review of backscatter measurement; an analysis of best practices in data acquisition; a review of data processing principles with details on present software implementations; and, finally, a synthesis and key recommendations. This presentation reviews the BSWG mandate and structure and the development of this document. It details the contents of the various chapters, the document's recommendations to sonar manufacturers, operators, data-processing software developers, and end-users, and its implications for the marine geology community.
[1] Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines

  2. Valve For Extracting Samples From A Process Stream

    NASA Technical Reports Server (NTRS)

    Callahan, Dave

    1995-01-01

    The valve for extracting samples from a process stream includes a cylindrical body bolted to the pipe that contains the stream. An opening in the valve body is matched and sealed against an opening in the pipe. The valve can be used to sample process streams in a variety of facilities, including cement plants, plants that manufacture and reprocess plastics, oil refineries, and pipelines.

  3. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    PubMed

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition (DIA) strategies have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well documented in the literature: DDA is typically applied for deep discovery, while DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  4. Bacterial counts associated with poultry processing at different sampling times.

    PubMed

    Geornaras, I; von Holy, A

    2000-01-01

    Aerobic plate counts, Enterobacteriaceae counts and Pseudomonas counts were performed on neck skin samples from six processing steps in a poultry abattoir at three different sampling times. Sampling time 1 was shortly after start-up of processing operations, time 2 after a tea break which was preceded by a cold water rinse-down of equipment surfaces, and time 3 before shut-down. No significant differences (P > 0.05) in microbial numbers of neck skin samples were observed between the three sampling times at the six sampling sites. At this particular processing plant, therefore, sampling at any time of the processing shift would not lead to significantly different bacterial counts on neck skins. The lowest aerobic plate counts, over all three sampling times, were obtained for neck skins sampled after spray washing, and the highest for neck skins sampled after packaging. This indicated the efficacy of the washing step in reducing microbial contamination, but also subsequent re-contamination of carcasses. Despite the Pseudomonas counts of neck skins being lower than the Enterobacteriaceae counts at the beginning of processing, packaging of carcasses resulted in Pseudomonas counts that were higher than the Enterobacteriaceae counts.

  5. Revised sampling campaigns to provide sludge for treatment process testing

    SciTech Connect

    PETERSEN, C.A.

    1999-02-18

    The purpose of this document is to review the impact on the sludge sampling campaigns planned for FY 1999 of the recent decision to delete any further sludge sampling in the K West Basin. Requirements for sludge sample material for sludge treatment process testing are reviewed. Options are discussed for obtaining the required volume of sample material, and an optimized plan for obtaining this sludge is summarized.

  6. Distributed real time data processing architecture for the TJ-II data acquisition system

    SciTech Connect

    Ruiz, M.; Barrera, E.; Lopez, S.; Machon, D.; Vega, J.; Sanchez, E.

    2004-10-01

    This article describes the performance of a new architecture model that has been developed for the TJ-II data acquisition system in order to increase its real-time data processing capabilities. The current model consists of several PXI (PCI eXtensions for Instrumentation) standard chassis, each with various digitizers. In that architecture, data processing capability is restricted to the PXI controller's own performance, and the controller must share its CPU resources between data processing and data acquisition tasks. In the new model, a distributed data processing architecture has been developed. The solution adds one or more processing cards to each PXI chassis. This way it is possible to plan how to distribute the processing of all acquired signals among the processing cards and the available resources of the PXI controller. This model allows scalability of the system: more or fewer processing cards can be added based on the requirements of the system. The processing algorithms are implemented in LabVIEW (from National Instruments), providing efficient and time-saving application development compared with other solutions.
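    The planning step described above, distributing acquired signals across the added processing cards and the PXI controller, might be sketched as a simple round-robin assignment; the resource and channel names below are hypothetical, not taken from the TJ-II system:

```python
# Hypothetical round-robin plan: assign each acquired signal to one of the
# available processing resources (the PXI controller plus add-on cards).

def distribute(signals, workers):
    """Return a dict mapping each worker to its assigned signal names."""
    plan = {w: [] for w in workers}
    for i, s in enumerate(signals):
        plan[workers[i % len(workers)]].append(s)
    return plan

workers = ["controller", "card-1", "card-2"]
signals = ["ch%d" % n for n in range(7)]
plan = distribute(signals, workers)
print(plan["controller"])   # ['ch0', 'ch3', 'ch6']
```

    A real plan would weight assignments by each card's capacity; round-robin is only the simplest placeholder policy.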

  7. How to crack nuts: acquisition process in captive chimpanzees (Pan troglodytes) observing a model.

    PubMed

    Hirata, Satoshi; Morimura, Naruki; Houki, Chiharu

    2009-10-01

    Stone tool use for nut cracking consists of placing a hard-shelled nut onto a stone anvil and then cracking the shell open by pounding it with a stone hammer to get to the kernel. We investigated the acquisition of tool use for nut cracking in a group of captive chimpanzees to clarify what kind of understanding of the tools and actions leads to the acquisition of this type of tool use in the presence of a skilled model. A human experimenter trained a male chimpanzee until he mastered the use of a hammer and anvil stone to crack open macadamia nuts. He was then put in a nut-cracking situation together with his group mates, who were naïve to this tool use; we did not have a control group without a model. The results showed that the process of acquisition could be broken down into several steps, including recognition of applying pressure to the nut, emergence of the use of a combination of three objects, emergence of the hitting action, using a tool for hitting, and hitting the nut. The chimpanzees recognized these different components separately and practiced them one after another. They gradually united these factors in their behavior, leading to their first success. Their behavior did not clearly improve immediately after observing successful nut cracking by a peer, but observation of a skilled group member seemed to have a gradual, long-term influence on the acquisition of nut cracking by naïve chimpanzees.

  8. Summary of the activities of the subgroup on data acquisition and processing

    SciTech Connect

    Connolly, P.L.; Doughty, D.C.; Elias, J.E.

    1981-01-01

    A data acquisition and handling subgroup consisting of approximately 20 members met during the 1981 ISABELLE summer study. Discussions were led by members of the BNL ISABELLE Data Acquisition Group (DAG), with lively participation from outside users. Particularly large contributions were made by representatives of BNL experiments 734, 735, and the MPS, as well as the Fermilab Colliding Detector Facility and the SLAC LASS Facility. In contrast to the 1978 study, the subgroup did not divide its activities into investigations of various individual detectors, but instead attempted to review the current state of the art in the data acquisition, trigger processing, and data handling fields. A series of meetings first reviewed individual pieces of the problem, including the status of the Fastbus Project, the Nevis trigger processor, the SLAC 168/E and 3081/E emulators, and efforts within DAG. Additional meetings dealt with questions involved in specifying and building complete data acquisition systems. For any given problem, a series of possible solutions was proposed by the members of the subgroup. In general, any given solution had both advantages and disadvantages, and there was never any consensus on which approach was best. However, there was agreement that certain problems could only be handled by systems of a given power or greater. What is given here is a review of the various solutions with their associated powers, costs, advantages, and disadvantages.

  9. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also optimize the operational time and use of consumables.
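    The three trigger conditions described above can be sketched as a simple threshold classifier; the threshold value, pH readings, and function name below are hypothetical, not taken from the actual instrument:

```python
# Hypothetical mapping from an inline conductivity/pH reading to one of the
# three preparation states the abstract describes. Thresholds are invented.

LOW_CONDUCTIVITY = 50.0    # µS/cm, assumed cutoff for the "neutral" state

def process_state(conductivity_uS, pH):
    """Classify the system state from a conductivity probe reading."""
    if conductivity_uS < LOW_CONDUCTIVITY:
        return "neutralized"    # system flushed with de-ionized water
    return "acidified" if pH < 7.0 else "basic"

print(process_state(10.0, 7.0))     # neutralized
print(process_state(900.0, 1.5))    # acidified
print(process_state(700.0, 12.0))   # basic
```

    In an automated run, each state transition would trigger the next protocol step, which is the role the non-contact probes play in the system above.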

  10. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering the fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must compare favorably in terms of software lifecycle costs to other means of automation, such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  11. An Analysis of the Acquisition Process at the End of the Fiscal Year.

    DTIC Science & Technology

    1981-12-01

    "cradle to grave" process; acquisition, the contracting officer’s role, is steps 4-10, or from definition of purchase requirement through contract... The role of the contracting officer is that of acquiring items and services to support the defense mission. The role of the requisitioner is to complete... major commands. Also the study of obligational rates of multi-year funds may prove to be enlightening as to the role which Congressional control in the

  12. Implementing Electronic Data Interchange (EDI) with Small Business Suppliers in the Pre-Award Acquisition Process

    DTIC Science & Technology

    1993-06-01

    initiative " Electronic Commerce through EDI." Consistent with the DoD initiative to implement EDI with industry, participation of small businesses in the pre...paperwork associated with the pre-award acquisition process, electronic commerce is being integrated with EDI through electronic bulletin boards...This thesis will explore the issues surrounding DoD’s successfully implementing the use of Electronic Commerce / Electronic Data Interchange (EC/EDI

  13. Instrumental improvements and sample preparations that enable reproducible, reliable acquisition of mass spectra from whole bacterial cells

    PubMed Central

    Alusta, Pierre; Buzatu, Dan; Williams, Anna; Cooper, Willie-Mae; Tarasenko, Olga; Dorey, R Cameron; Hall, Reggie; Parker, W Ryan; Wilkes, Jon G

    2015-01-01

    Rationale: Rapid sub-species characterization of pathogens is required for timely responses in outbreak situations. Pyrolysis mass spectrometry (PyMS) has the potential to be used for this purpose. Methods: However, in order to make PyMS practical for traceback applications, certain improvements related to spectrum reproducibility and data acquisition speed were required. The main objectives of this study were to facilitate fast detection (<30 min to analyze 6 samples, including preparation) and sub-species-level bacterial characterization based on pattern recognition of mass spectral fingerprints acquired from whole cells volatilized and ionized at atmospheric pressure. An AccuTOF DART mass spectrometer was re-engineered to permit ionization of low-volatility bacteria by means of Plasma Jet Ionization (PJI), in which an electric discharge, and, by extension, a plasma beam, impinges on sample cells. Results: Instrumental improvements and spectral acquisition methodology are described. Performance of the re-engineered system was assessed using a small challenge set comprised of assorted bacterial isolates differing in identity by varying amounts. In general, the spectral patterns obtained allowed differentiation of all samples tested, including those of the same genus and species but different serotypes. Conclusions: Fluctuations of ±15% in bacterial cell concentrations did not substantially compromise replicate spectra reproducibility. © 2015 National Center for Toxicological Research. Rapid Communications in Mass Spectrometry published by John Wiley & Sons Ltd. PMID:26443394

  14. Sample Handling and Processing on Mars for Future Astrobiology Missions

    NASA Technical Reports Server (NTRS)

    Beegle, Luther; Kirby, James P.; Fisher, Anita; Hodyss, Robert; Saltzman, Alison; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2011-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low-concentration organic molecules that may identify extraterrestrial life. Sample processing for analytical instruments is time-, resource-, and manpower-consuming in terrestrial laboratories. Every step in this laborious process will have to be automated for in situ life detection. We have developed, and are currently demonstrating, an automated wet chemistry preparation system that can operate autonomously on Earth and is designed to operate under Martian ambient conditions. This will enable a complete wet chemistry laboratory as part of future missions. Our system, the Automated Sample Processing System (ASPS), receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species, and delivers sample to multiple instruments for analysis (including analysis of non-organic soluble species).

  15. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    NASA Astrophysics Data System (ADS)

    Schickert, Martin

    2015-03-01

    Ultrasonic testing systems using transducer arrays and the SAFT (Synthetic Aperture Focusing Technique) reconstruction allow for imaging the internal structure of concrete elements. At one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting to detect and localize objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized which differ in terms of the measuring and computational effort and the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm which is implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (TFM/FMC), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained at test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS is used for the measurements which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results of the parameters image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of application characteristics of the SAFT variants.
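    The single-channel SAFT reconstruction named above is, at its core, a delay-and-sum operation. A minimal sketch follows; the geometry, wave speed, and sampling rate are arbitrary example values, not parameters of the FLEXUS system:

```python
import math

# Minimal monostatic delay-and-sum SAFT sketch in 2-D. For each image
# pixel, sum every A-scan's sample at that pixel's round-trip delay.

C = 1.0      # wave speed (units/s), arbitrary
FS = 100.0   # sampling rate (samples/s), arbitrary

def saft_pixel(x, z, scans):
    """Focus at pixel (x, z): scans maps transducer x-position -> A-scan."""
    total = 0.0
    for xt, ascan in scans.items():
        delay = 2.0 * math.hypot(x - xt, z) / C   # go-and-return time
        idx = int(round(delay * FS))
        if 0 <= idx < len(ascan):
            total += ascan[idx]
    return total

# Synthetic data: a point reflector at (0.5, 0.4) puts a unit echo in each
# A-scan at that scan position's own round-trip delay.
reflector = (0.5, 0.4)
scans = {}
for xt in [0.0, 0.25, 0.5, 0.75, 1.0]:
    ascan = [0.0] * 200
    d = 2.0 * math.hypot(reflector[0] - xt, reflector[1]) / C
    ascan[int(round(d * FS))] = 1.0
    scans[xt] = ascan

# Focusing at the true reflector position collects all five echoes:
print(saft_pixel(0.5, 0.4, scans))   # 5.0
```

    The C-SAFT/FMC variants discussed above generalize this by summing over all transmitter-receiver pairs rather than a single virtual transducer per position.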

  16. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    SciTech Connect

    Schickert, Martin

    2015-03-31

    Ultrasonic testing systems using transducer arrays and the SAFT (Synthetic Aperture Focusing Technique) reconstruction allow for imaging the internal structure of concrete elements. At one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting to detect and localize objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized which differ in terms of the measuring and computational effort and the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm which is implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (TFM/FMC), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained at test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS is used for the measurements which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results of the parameters image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of application characteristics of the SAFT variants.

  17. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  18. Cycles in finite samples and cumulative processes of higher orders

    NASA Astrophysics Data System (ADS)

    Klemeš, Vít; Klemeš, Ivo

    1988-01-01

    The process formed by a sequence of cumulative departures from the mean or from some other constant (residual mass curve, cusum chart) is a popular tool for the representation and analysis of time series in many sciences, for example, in hydrology, climatology, economics, and game theory. In these and other natural and social sciences, similar cumulative processes also often arise naturally; examples include fluctuations of storage in a dam with a constant release rate, lake levels, volume of glaciers, biomass, inventories, and bank accounts. Moreover, many natural, economic, and other phenomena may represent, or contain, components of cumulative processes of higher orders, i.e., cumulative processes of cumulative processes. In this paper we show that for a sample {y_t^(0)} ≡ {x_t} of any finite size N, the pure cumulative process of nth order, y_t^(n) ≡ Σ_{i=1}^{t} (y_i^(n-1) − μ^(n-1)), where μ^(n-1) is the sample mean of {y_t^(n-1)} and t = 1, 2, …, N, converges for n → ∞ to a sine wave with a period equal to an integral fraction of the sample size N. This happens for any initial sample {y_t^(0)}, and the convergence is of exponential order. For samples from most stochastic as well as deterministic processes, the period of the limiting sine wave is equal to the sample size N. This behavior is demonstrated by examples involving samples from various processes, ranging from pure random series to various deterministic series, and including time series of some natural processes such as streamflow, lake levels, and glacier volumes. The paper includes a demonstration of the effects on the rate of convergence of noise superimposed on the sample mean and of error in its value, and a discussion of some practical implications of the phenomenon described; it brings together some aspects of the work of Slutzky (1937), Hurst (1951), and Yule (1926).
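    A minimal numerical sketch of the iteration defined above; the starting sample and the number of orders are arbitrary, and each iterate is rescaled only to keep the numbers bounded:

```python
# Sketch of the nth-order cumulative process from the abstract:
# y_t(n) = sum over i=1..t of (y_i(n-1) - mean of y(n-1)).

def cumulative(y):
    """Residual mass curve: cumulative departures from the sample mean."""
    mu = sum(y) / len(y)
    out, s = [], 0.0
    for v in y:
        s += v - mu
        out.append(s)
    return out

y = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]   # y(0): any finite sample
for _ in range(40):                             # iterate to higher orders
    y = cumulative(y)
    peak = max(abs(v) for v in y)
    y = [v / peak for v in y]                   # rescale each order

# Departures from the mean sum to zero, so every iterate ends at ~0; the
# normalized shape itself settles toward a sine-like wave (not asserted).
assert abs(cumulative(y)[-1]) < 1e-9
```

    Plotting successive normalized iterates of any starting sample shows the exponential-order convergence toward the sinusoidal shape the paper describes.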

  19. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    PubMed

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR.
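
    The Nyquist condition invoked in this abstract is easy to illustrate: sampled at rate f_s, a sinusoid above f_s/2 is indistinguishable from a lower-frequency alias. A minimal numpy sketch (a generic illustration with arbitrary frequencies, not NMR code):

```python
import numpy as np

fs = 10.0                        # sampling rate (arbitrary units)
n = np.arange(32)                # sample indices
t = n / fs                       # uniform sampling times; Nyquist = fs/2 = 5

high = np.sin(2 * np.pi * 9.0 * t)            # tone above Nyquist
alias = np.sin(2 * np.pi * (9.0 - fs) * t)    # its -1 "Hz" alias

# the two tones are identical at every sampling instant
print(np.allclose(high, alias))  # True
```

    This is why uniform sampling must keep the interval at or below 1/(2·bandwidth), and why nonuniform schemes need non-Fourier estimators to resolve the resulting ambiguity.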

  20. Measuring Acquisition Workforce Quality through Dynamic Knowledge and Performance: An Exploratory Investigation to Interrelate Acquisition Knowledge with Process Maturity

    DTIC Science & Technology

    2013-10-08

    [Front-matter residue from the report's list of figures: Figure 5, Combined Score–PCO Relationship; Figure 6, Organization T Score–PCO Relationship; Figure 7, Organization R Score–PCO Relationship.] …stocks from two DoD contracting centers, including Procuring Contracting Officer (PCO) assignment and Defense Acquisition Workforce Improvement Act (DAWIA

  1. Hardware System for Real-Time EMG Signal Acquisition and Separation Processing during Electrical Stimulation.

    PubMed

    Hsueh, Ya-Hsin; Yin, Chieh; Chen, Yan-Hong

    2015-09-01

    The study aimed to develop a real-time electromyography (EMG) signal acquisition and processing device that can acquire signals during electrical stimulation. Since the electrical stimulation output can affect EMG signal acquisition, the EMG signal transmission and processing method had to be modified to integrate the two elements into one system. The whole system was designed in a user-friendly and flexible manner. For EMG signal processing, the system applied an Altera Field Programmable Gate Array (FPGA) as its core to process the hybrid EMG signal in real time and output the isolated signal efficiently. Power spectral density was used to evaluate the accuracy of signal processing, and cross correlation showed that the delay of real-time processing was only 250 μs.
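
    The cross-correlation delay measurement mentioned in this abstract can be sketched in a few lines of numpy (an offline illustration, not the FPGA implementation; the 40 kHz sampling rate and the test signal are assumptions): the lag of the correlation peak between input and output gives the processing delay.

```python
import numpy as np

fs = 40_000                        # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
x = rng.standard_normal(2048)      # stand-in for the raw EMG signal

delay = 10                         # 10 samples = 250 us at 40 kHz
y = np.concatenate([np.zeros(delay), x[:-delay]])   # delayed output

# peak of the full cross-correlation gives the lag in samples
xcorr = np.correlate(y, x, mode="full")
lag = int(np.argmax(xcorr)) - (len(x) - 1)
print(lag, lag / fs * 1e6)         # 10 samples -> 250.0 microseconds
```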

  2. Double Shell Tank (DST) Process Waste Sampling Subsystem Specification

    SciTech Connect

    RASMUSSEN, J.H.

    2000-05-03

    This specification establishes the performance requirements and provides references to the requisite codes and standards to be applied to the Double-Shell Tank (DST) Process Waste Sampling Subsystem which supports the first phase of Waste Feed Delivery.

  3. APNEA list mode data acquisition and real-time event processing

    SciTech Connect

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data.
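
    The post-processing described here, counting timestamped list-mode events into short and long bins after a trigger, can be sketched as follows (a simplified illustration; the timestamps and bin widths are hypothetical, not APNEA parameters):

```python
import numpy as np

# list-mode record: event timestamps (us) relative to a trigger at t = 0
timestamps = np.array([2.5, 7.1, 18.0, 45.0, 120.0, 3.0e5, 1.2e6, 4.0e6])

# short bins: eight 25-us bins; long bins: five 1-second bins (in us)
short_edges = np.arange(0, 225, 25)          # 0..200 us
long_edges = np.arange(0, 5e6 + 1, 1e6)      # 0..5 s

short_counts, _ = np.histogram(timestamps, bins=short_edges)
long_counts, _ = np.histogram(timestamps, bins=long_edges)

print(short_counts.sum(), long_counts.sum())  # 5 events early, 8 overall
```

    Re-binning the same raw record with different edges is what lets the analyst "determine the optimum time bins" after the fact, as the abstract notes.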

  4. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.
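
    Fault trees of the kind used in this work combine basic-event probabilities through AND/OR gates. A minimal evaluation sketch (the event names and probabilities are hypothetical, not taken from the LOMA system):

```python
# minimal fault-tree evaluation for independent basic events:
# OR gate: P = 1 - prod(1 - p_i);  AND gate: P = prod(p_i)

def p_or(*ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

# hypothetical top event: leachate overflow occurs if the pump fails
# AND (heavy rainfall OR drainage blockage) occurs
p_pump_failure = 0.05
p_heavy_rain = 0.20
p_blockage = 0.10

p_top = p_and(p_pump_failure, p_or(p_heavy_rain, p_blockage))
print(round(p_top, 4))   # 0.05 * (1 - 0.8 * 0.9) = 0.014
```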

  5. Automated system for acquisition and image processing for the control and monitoring boned nopal

    NASA Astrophysics Data System (ADS)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of an acquisition and image-processing system that controls the removal of thorns from the nopal vegetable (Opuntia ficus-indica) in an automated machine using pulses from an Nd:YAG laser. The areolas, the areas on the bark of the nopal where thorns grow, are located by applying segmentation algorithms to images obtained by a CCD. Once the positions of the areolas are known, their coordinates are sent to a motor system that steers the laser to each areola and removes the thorns. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the acquisition, preprocessing, segmentation, recognition, and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
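
    The segmentation step described here, locating bright areolas in a CCD frame and emitting their coordinates, can be illustrated with a minimal threshold-and-centroid sketch in numpy (a toy synthetic image, not the paper's actual algorithm):

```python
import numpy as np

# synthetic 64x64 grayscale frame: dark bark with one bright areola
img = np.full((64, 64), 30, dtype=np.uint8)
img[20:26, 40:46] = 220          # bright 6x6 areola patch

# global threshold separates areola pixels from the bark
mask = img > 128

# centroid of the segmented region -> coordinate for the galvo system
rows, cols = np.nonzero(mask)
centroid = (float(rows.mean()), float(cols.mean()))
print(centroid)                  # (22.5, 42.5)
```

    A real frame would need per-region labeling to build the full coordinate table, but the threshold-then-centroid pattern is the core of the pipeline.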

  6. Autonomous Closed-Loop Tasking, Acquisition, Processing, and Evaluation for Situational Awareness Feedback

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Dan; Cappelaere, Pat

    2016-01-01

    This presentation describes the closed-loop satellite autonomy methods used to connect users with the assets on Earth Observing-1 (EO-1) and similar satellites. The base layer is a distributed architecture based on the Goddard Mission Services Evolution Concept (GMSEC), so each asset remains under independent control. Situational awareness is provided by a middleware layer through a common Application Programming Interface (API) to GMSEC components developed at GSFC. Users set up their own tasking requests and receive views into immediate past acquisitions in their area of interest and into future acquisition feasibilities across all assets. Automated notifications via pub/sub feeds are returned to users, containing published links to image footprints, algorithm results, and full data sets. Theme-based algorithms are available on demand for processing.

  7. Monitoring of Extraction Efficiency by a Sample Process Control Virus Added Immediately Upon Sample Receipt.

    PubMed

    Ruhanya, Vurayai; Diez-Valcarce, Marta; D'Agostino, Martin; Cook, Nigel; Hernández, Marta; Rodríguez-Lázaro, David

    2015-12-01

    When analysing food samples for enteric viruses, a sample process control virus (SPCV) must be added at the commencement of the analytical procedure, to verify that the analysis has been performed correctly. Samples can on occasion arrive at the laboratory late in the working day or week. The analyst may consequently have insufficient time to commence and complete the complex procedure, and the samples must consequently be stored. To maintain the validity of the analytical result, it will be necessary to consider storage as part of the process, and the analytical procedure as commencing on sample receipt. The aim of this study was to verify that an SPCV can be recovered after sample storage, and thus indicate the effective recovery of enteric viruses. Two types of samples (fresh and frozen raspberries) and two types of storage (refrigerated and frozen) were studied using Mengovirus vMC0 as SPCV. SPCV recovery was not significantly different (P > 0.5) regardless of sample type or duration of storage (up to 14 days at -20 °C). Accordingly, samples can be stored without a significant effect on the performance of the analysis. The results of this study should assist the analyst by demonstrating that they can verify that viruses can be extracted from food samples even if samples have been stored.

  8. Fault recognition depending on seismic acquisition and processing for application to geothermal exploration

    NASA Astrophysics Data System (ADS)

    Buness, H.; von Hartmann, H.; Rumpel, H.; Krawczyk, C. M.; Schulz, R.

    2011-12-01

    Fault systems offer a large potential for deep hydrothermal energy extraction, and most existing and planned projects rely on the enhanced permeability assumed to be connected with them. The target depth of hydrothermal exploration in Germany is on the order of 3–5 km, to ensure economic operation despite moderate temperature gradients. 3D seismics is the most appropriate geophysical method to image fault systems at these depths, but it is also one of the most expensive, constituting a significant part of total project costs, so its application was (and is) discussed. Cost reduction can in principle be achieved by sparse acquisition; however, the decreased fold inevitably leads to a decreased S/N ratio. To overcome this problem, the application of the CRS (Common Reflection Surface) method has been proposed: the CRS stacking operator inherently includes more traces than the conventional NMO/DMO stacking operator, and hence a better S/N ratio can be achieved. We tested this approach using existing 3D seismic datasets from the two most important hydrothermal provinces in Germany, the Upper Rhine Graben (URG) and the German Molasse Basin (GMB). To simulate a sparse acquisition, we reduced the amount of data to a quarter and a half, respectively, and reprocessed the data, including new velocity analysis and residual static corrections. In the URG, using the variance cube as the basis for a horizon-bound window amplitude analysis was successful for detecting small faults that would hardly be recognized in seismic sections. In both regions, CRS processing undoubtedly improved the imaging of small faults in the complete as well as the reduced versions of the datasets. However, CRS processing could not compensate for the loss of resolution due to the reduction associated with the simulated sparse acquisition, and hence smaller faults became undetectable. The decision for a sparse acquisition therefore depends on the scope of the survey.

  9. Multichannel optical signal processing using sampled fiber Bragg gratings

    NASA Astrophysics Data System (ADS)

    Zhang, Guiju; Wang, Chinhua; Zhu, Xiaojun

    2008-12-01

    Sampled and linearly chirped fiber Bragg gratings provide multiple wavelength responses and linear group delays (constant dispersions) within each wavelength channel. We show that sampled and chirped fiber Bragg gratings can be used to perform multiwavelength signal processing. In particular, we demonstrate, by numerical simulation, their use for performing a real-time Fourier transform (RTFT) and pulse repetition rate multiplication (PRRM) simultaneously over multiple wavelength channels. To demonstrate how sampled fiber Bragg gratings perform multichannel optical signal processing, a 9-channel sampled fiber grating with 100 GHz channel spacing was designed, and the effect of ripples in both the amplitude and the group delay of each channel on the performance of the signal processing was examined and discussed.

  10. Processing strategies and software solutions for data-independent acquisition in mass spectrometry.

    PubMed

    Bilbao, Aivett; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Hopfgartner, Gérard; Müller, Markus; Lisacek, Frédérique

    2015-03-01

    Data-independent acquisition (DIA) offers several advantages over data-dependent acquisition (DDA) schemes for characterizing complex protein digests analyzed by LC-MS/MS. In contrast to the sequential detection, selection, and analysis of individual ions during DDA, DIA systematically parallelizes the fragmentation of all detectable ions within a wide m/z range regardless of intensity, thereby providing broader dynamic range of detected signals, improved reproducibility for identification, better sensitivity, and accuracy for quantification, and, potentially, enhanced proteome coverage. To fully exploit these advantages, composite or multiplexed fragment ion spectra generated by DIA require more elaborate processing algorithms compared to DDA. This review examines different DIA schemes and, in particular, discusses the concepts applied to and related to data processing. Available software implementations for identification and quantification are presented as comprehensively as possible and examples of software usage are cited. Processing workflows, including complete proprietary frameworks or combinations of modules from different open source data processing packages are described and compared in terms of software availability and usability, programming language, operating system support, input/output data formats, as well as the main principles employed in the algorithms used for identification and quantification. This comparative study concludes with further discussion of current limitations and expectable improvements in the short- and midterm future.

  11. Determinants of famous name processing speed: age of acquisition versus semantic connectedness.

    PubMed

    Smith-Spark, James H; Moore, Viv; Valentine, Tim

    2013-02-01

    The age of acquisition (AoA) and the amount of biographical information known about celebrities have been independently shown to influence the processing of famous people. In this experiment, we investigated the facilitative contribution of both factors to famous name processing. Twenty-four mature adults participated in a familiarity judgement task, in which the names of famous people were grouped orthogonally by AoA and by the number of bits of biographical information known about them (number of facts known; NoFK). Age of acquisition was found to have a significant effect on both reaction time (RT) and accuracy of response, but NoFK did not. The RT data also revealed a significant AoA×NoFK interaction. The amount of information known about a celebrity played a facilitative role in the processing of late-acquired, but not early-acquired, celebrities. Once AoA is controlled, it would appear that the semantic system ceases to have a significant overall influence on the processing of famous people. The pre-eminence of AoA over semantic connectedness is considered in the light of current theories of AoA and how their influence might interact.

  12. Preliminary study of the EChO data sampling and processing

    NASA Astrophysics Data System (ADS)

    Farina, M.; Di Giorgio, A. M.; Focardi, M.; Pace, E.; Micela, G.; Galli, E.; Giusi, G.; Liu, S. J.; Pezzuto, S.

    2014-08-01

    The EChO Payload is an integrated spectrometer with six different channels covering the spectral range from the visible up to the thermal infrared. A common Instrument Control Unit (ICU) implements all the instrument control and health-monitoring functionality as well as all the onboard science data processing. To achieve an efficient design of the ICU onboard software, separate analyses of the unit requirements are needed for commanding and housekeeping collection as well as for data acquisition, sampling, and compression. In this work we present the results of the analysis carried out to optimize the EChO data acquisition and processing chain. The HgCdTe detectors used for the EChO mission allow non-destructive readout modes, in which the charge may be read without being removed after each read. These modes can reduce the equivalent readout noise, and the gain in signal-to-noise ratio can be computed using well-known relations based on fundamental principles. In particular, we considered a multiaccumulation approach based on non-destructive reading of detector samples taken at equal time intervals. All detectors are periodically reset after a certain number of samples have been acquired, and the length of the reset interval, the number of samples, and the sampling rate can be adapted to the brightness of the considered source. The estimation of the best set of parameters for signal-to-noise ratio optimization and of the best sampling technique was done by also taking into account the need to mitigate the expected radiation effects on the acquired data. Cosmic rays can indeed be one of the major sources of data loss for a space observatory, and the studies made for the JWST mission allowed us to evaluate the actual need for a dedicated deglitching procedure on board EChO.
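
    A multiaccumulation readout of the kind described here, N non-destructive samples at equal intervals followed by a reset, is commonly reduced by fitting a straight line to the ramp of accumulated charge; the fitted slope estimates the source flux. A minimal sketch (the flux, read noise, and cadence values are assumptions, not EChO parameters):

```python
import numpy as np

rng = np.random.default_rng(2)

flux = 50.0                    # e-/s, true signal rate (assumed)
read_noise = 15.0              # e- rms per non-destructive read (assumed)
n_samples = 32                 # reads per reset interval
dt = 0.5                       # seconds between reads

t = np.arange(n_samples) * dt
ramp = flux * t + rng.normal(0.0, read_noise, n_samples)  # up-the-ramp reads

slope, intercept = np.polyfit(t, ramp, 1)   # least-squares line fit
print(round(slope, 1))         # close to the true 50 e-/s
```

    Because all N reads constrain one slope, the effective read noise drops relative to a single destructive read, which is the signal-to-noise gain the abstract refers to; glitch rejection would discard reads that deviate from the fitted ramp.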

  13. Memory acquisition and retrieval impact different epigenetic processes that regulate gene expression

    PubMed Central

    2015-01-01

    Background A fundamental question in neuroscience is how memories are stored and retrieved in the brain. Long-term memory formation requires transcription, translation and epigenetic processes that control gene expression. Thus, characterizing genome-wide the transcriptional changes that occur after memory acquisition and retrieval is of broad interest and importance. Genome-wide technologies are commonly used to interrogate transcriptional changes in discovery-based approaches. Their ability to increase scientific insight beyond traditional candidate gene approaches, however, is usually hindered by batch effects and other sources of unwanted variation, which are particularly hard to control in the study of brain and behavior. Results We examined genome-wide gene expression after contextual conditioning in the mouse hippocampus, a brain region essential for learning and memory, at all the time-points in which inhibiting transcription has been shown to impair memory formation. We show that most of the variance in gene expression is not due to conditioning and that by removing unwanted variance through additional normalization we are able to provide novel biological insights. In particular, we show that genes downregulated by memory acquisition and retrieval impact different functions: chromatin assembly and RNA processing, respectively. Levels of histone 2A variant H2AB are reduced only following acquisition, a finding we confirmed using quantitative proteomics. On the other hand, splicing factor Rbfox1 and NMDA receptor-dependent microRNA miR-219 are only downregulated after retrieval, accompanied by an increase in protein levels of miR-219 target CAMKIIγ. Conclusions We provide a thorough characterization of coding and non-coding gene expression during long-term memory formation. We demonstrate that unwanted variance dominates the signal in transcriptional studies of learning and memory and introduce the removal of unwanted variance through normalization as a

  14. Report: CSB Needs to Improve Its Acquisition Approvals and Other Processes to Ensure Best Value for Taxpayers

    EPA Pesticide Factsheets

    Report #15-P-0245, July 31, 2015. CSB's acquisition process is at risk and may have ineffective operations without a strategy to implement controls. Further, CSB has limited evidence it contracted at the best value.

  15. Condom acquisition and preferences within a sample of sexually active gay and bisexual men in the southern United States.

    PubMed

    Rhodes, Scott D; Hergenrather, Kenneth C; Yee, Leland J; Wilkin, Aimee M; Clarke, Thomas L; Wooldredge, Rich; Brown, Monica; Davis, A Bernard

    2007-11-01

    Health departments, community-based organizations (CBOs), and AIDS service organizations (ASOs) in the United States and abroad distribute large quantities of free condoms to sexually active individuals; however, little is known about where individuals who use condoms actually acquire them. This community-based participatory research (CBPR) study was designed to identify factors associated with the use of free condoms during most recent anal intercourse among self-identifying gay and bisexual men who reported condom use. Data were collected using targeted intercept interviewing during North Carolina Pride Festival events in Fall 2006, using the North Carolina Condom Acquisition and Preferences Assessment (NC-CAPA). Of the 606 participants who completed the assessment, 285 met the inclusion criteria. Mean age of participants was 33 (+/-10.8) years. The sample was predominantly white (80%), 50% reported being single or not dating anyone special, and 38% reported the use of free condoms during most recent anal intercourse. In multivariable analysis, participants who reported using free condoms during most recent anal sex were more likely to report increased age; dating someone special or being partnered; and having multiple male sexual partners in the past 3 months. These participants were less likely to report ever having had a sexually transmitted disease. Despite being in the third decade of the HIV epidemic, little is known about condom acquisition among, and condom preferences of, gay and bisexual men who use condoms. Although more research is needed, our findings illustrate the importance of free condom distribution.

  16. Acquisition, development, and maintenance of online poker playing in a student sample.

    PubMed

    Wood, Richard T A; Griffiths, Mark D; Parke, Jonathan

    2007-06-01

    To date there has been very little empirical research into Internet gambling and none relating to the recent rise in popularity of online poker. Given that recent reports have claimed that students may be a vulnerable group, the aim of the current study was to establish basic information regarding Internet poker playing behavior among the student population, including various motivators for participation and predictors of problematic play. The study examined a self-selected sample of student online poker players using an online survey (n=422). Results showed that online poker playing was undertaken at least twice per week by a third of the participants. Almost one in five of the sample (18%) was defined as a problem gambler using the DSM-IV criteria. Findings demonstrated that problem gambling in this population was best predicted by negative mood states after playing, gender swapping whilst playing, and playing to escape from problems.

  17. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    Engh, G.J. van den; Stokdijk, W.

    1992-09-22

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate.
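
    The bus-controller scheme this patent describes, pop the oldest entry from each channel FIFO and check that the event IDs agree, can be sketched behaviorally in Python (an illustration of the data flow, not the hardware design; channel counts and values are made up):

```python
from collections import deque

N_CHANNELS = 4

# each channel FIFO holds (event_id, digitized_pulse_height) entries
fifos = [deque() for _ in range(N_CHANNELS)]

# trigger circuit: stamp the same event ID into every channel
for event_id in range(3):
    for ch, fifo in enumerate(fifos):
        fifo.append((event_id, 100 + 10 * ch + event_id))

# bus controller: move the oldest entry from each FIFO onto the "bus"
bus = []
while all(fifos):
    entries = [fifo.popleft() for fifo in fifos]
    ids = {eid for eid, _ in entries}
    assert len(ids) == 1, "error detection: mismatched event IDs"
    bus.append([value for _, value in entries])

print(len(bus), bus[0])   # 3 [100, 110, 120, 130]
```

    The per-channel FIFOs are what decouple fast, independent digitization from the slower shared bus, and the ID check is the error-detection step the claims describe.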

  18. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    van den Engh, Gerrit J.; Stokdijk, Willem

    1992-01-01

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate.

  19. ISOLOK VALVE ACCEPTANCE TESTING FOR DWPF SME SAMPLING PROCESS

    SciTech Connect

    Edwards, T.; Hera, K.; Coleman, C.; Jones, M.; Wiedenman, B.

    2011-12-05

    Evaluation of the Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. Of these opportunities, a focus area related to optimizing the equipment and efficiency of the sample turnaround time for the DWPF Analytical Laboratory was identified. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) evaluated the possibility of using an Isolok® sampling valve as an alternative to the Hydragard® valve for taking process samples. Previous viability testing was conducted with favorable results using the Isolok sampler and reported in SRNL-STI-2010-00749 (1). This task has the potential to improve operability, reduce maintenance time, and decrease CPC cycle time. This report summarizes the results from acceptance testing which was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 (2) and which was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNL-RP-2011-00145 (3). The Isolok tested is the same model which was tested, qualified, and installed in the Sludge Receipt Adjustment Tank (SRAT) sample system. RW-0333P QA requirements apply to this task. The objective of this task was to qualify the Isolok sampler for use in the DWPF Slurry Mix Evaporator (SME) sampling process. The Hydragard, which is the current baseline sampling method, was used for comparison with the Isolok sampling data. The Isolok sampler is an air-powered grab sampler used to 'pull' a sample volume from a process line. The operation of the sampler is shown in Figure 1: the image on the left shows the Isolok's spool extended into the process line, and the image on the right shows the sampler retracted and dispensing the liquid into the sampling container. To determine tank homogeneity, a Coliwasa sampler was used to grab samples at a high and a low location within the mixing tank. Data from the two locations

  20. Thinking About Multiword Constructions: Usage-Based Approaches to Acquisition and Processing.

    PubMed

    Ellis, Nick C; Ogden, Dave C

    2017-02-24

    Usage-based approaches to language hold that we learn multiword expressions as patterns of language from language usage, and that knowledge of these patterns underlies fluent language processing. This paper explores these claims by focusing upon verb-argument constructions (VACs) such as "V(erb) about n(oun phrase)." These are productive constructions that bind syntax, lexis, and semantics. It presents (a) analyses of usage patterns of English VACs in terms of their grammatical form, semantics, lexical constituency, and distribution patterns in large corpora; (b) patterns of VAC usage in child-directed speech and child language acquisition; and (c) investigations of VAC free-association and psycholinguistic studies of online processing. We conclude that VACs are highly patterned in usage, that this patterning drives language acquisition, and that language processing is sensitive to the forms of the syntagmatic construction and their distributional statistics, the contingency of their association with meaning, and spreading activation and prototypicality effects in semantic reference. Language users have rich implicit knowledge of the statistics of multiword sequences.

  1. Apparatus and process for collection of gas and vapor samples

    DOEpatents

    Jackson, Dennis G.; Peterson, Kurt D.; Riha, Brian D.

    2008-04-01

    A gas sampling apparatus and process is provided in which a standard crimping tool is modified by an attached collar. The collar permits operation of the crimping tool while also facilitating the introduction of a supply of gas to be introduced into a storage vial. The introduced gas supply is used to purge ambient air from a collection chamber and an interior of the sample vial. Upon completion of the purging operation, the vial is sealed using the crimping tool.

  2. The birds and the beans: a low-fidelity simulator for chorionic villus sampling skill acquisition.

    PubMed

    Wax, Joseph R; Cartin, Angelina; Pinette, Michael G

    2012-08-01

    Because no simulation models are described for chorionic villus sampling (CVS), we sought to design and construct a CVS training simulator. Using materials available from our labor floor and local supermarket, we built and demonstrated a practical model for learning transabdominal and transcervical CVS. The simulator can be used to teach single- or dual-operator transabdominal CVS and traditional transcervical CVS. Aspirated "villi" immediately inform the teacher and learner of successful procedures. No image degradation or sonographically visible tracks resulted from use, permitting more than one trainee to benefit from a model. This model for transabdominal and transcervical CVS provides realistic imaging, tactile sensations, and immediate feedback.

  3. Trade-offs in data acquisition and processing parameters for backscatter and scatterer size estimations.

    PubMed

    Liu, Wu; Zagzebski, James A

    2010-01-01

    By analyzing backscattered echo signal power spectra and thereby obtaining backscatter coefficient vs. frequency data, the size of subresolution scatterers contributing to echo signals can be estimated. Here we investigate trade-offs in data acquisition and processing parameters for reference phantom-based backscatter and scatterer size estimations. RF echo data from a tissue-mimicking test phantom were acquired using a clinical scanner equipped with linear array transducers. One array has a nominal frequency bandwidth of 5 to 13 MHz and the other 4 to 9 MHz. Comparison of spectral estimation methods showed that the Welch method provided spectra yielding more accurate and precise backscatter coefficient and scatterer size estimations than spectra computed by applying rectangular, Hanning, or Hamming windows, with a much reduced computational load relative to the multitaper method. For small echo signal data block sizes, moderate improvements in scatterer size estimations were obtained using a multitaper method, but this significantly increases the computational burden. It is critical to average power spectra from lateral A-lines to improve scatterer size estimation. Averaging approximately 10 independent A-lines laterally, with an axial window length 10 times the center frequency wavelength, optimized trade-offs between spatial resolution and the variance of scatterer size estimates. Applying the concept of a time-bandwidth product, this suggests using analysis blocks that contain at least 30 independent samples of the echo signal. The estimation accuracy and precision depend on the ka range, where k is the wave number and a is the effective scatterer size. This introduces a region-of-interest depth dependency to the accuracy and precision because of preferential attenuation of higher frequency sound waves in tissuelike media. With the 5 to 13 MHz transducer, ka ranged from 0.5 to 1.6 for scatterers in the test phantom, which is a favorable range, and the
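
    The Welch spectral estimate favored in this study, averaging windowed, overlapping periodograms, can be sketched directly in numpy (a generic illustration on a synthetic signal, not ultrasound RF data; segment length and overlap are typical default choices):

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """Welch estimate: average Hann-windowed periodograms, 50% overlap."""
    step = nperseg // 2
    win = np.hanning(nperseg)
    scale = fs * (win ** 2).sum()          # density normalization
    segments = [x[i:i + nperseg]
                for i in range(0, len(x) - nperseg + 1, step)]
    psds = [np.abs(np.fft.rfft(win * s)) ** 2 / scale for s in segments]
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, np.mean(psds, axis=0)

rng = np.random.default_rng(3)
fs = 1000.0
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 100.0 * t) + 0.5 * rng.standard_normal(t.size)

freqs, psd = welch_psd(x, fs)
peak = freqs[np.argmax(psd)]
print(peak)    # near 100 Hz
```

    Averaging the segment periodograms reduces the variance of the spectral estimate, which is exactly why averaging roughly 10 lateral A-lines stabilizes the backscatter and size estimates in the study above.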

  4. Acquisition of a High Resolution Field Emission Scanning Electron Microscope for the Analysis of Returned Samples

    NASA Technical Reports Server (NTRS)

    Nittler, Larry R.

    2003-01-01

    This grant furnished funds to purchase a state-of-the-art scanning electron microscope (SEM) to support our analytical facilities for extraterrestrial samples. After evaluating several instruments, we purchased a JEOL 6500F thermal field emission SEM with the following analytical accessories: EDAX energy-dispersive x-ray analysis system with fully automated control of instrument and sample stage; EDAX LEXS wavelength-dispersive x-ray spectrometer for high sensitivity light-element analysis; EDAX/TSL electron backscatter diffraction (EBSD) system with software for phase identification and crystal orientation mapping; Robinson backscatter electron detector; and an in situ micro-manipulator (Kleindiek). The total price was $550,000 (with $150,000 of the purchase supported by Carnegie institution matching funds). The microscope was delivered in October 2002, and most of the analytical accessories were installed by January 2003. With the exception of the wavelength spectrometer (which has been undergoing design changes) everything is working well and the SEM is in routine use in our laboratory.

  5. Sampling design for spatially distributed hydrogeologic and environmental processes

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1992-01-01

    A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for the physical sciences is its rationalization of correlations in the spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range of subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations is necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas.
Insight is gained by applying the proposed sampling procedure to realistic examples related
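
    The key property exploited above, that the estimation error variance depends only on the sampling geometry and the spatial covariance model and not on the measured values, can be illustrated with a small simple-kriging sketch (the exponential covariance model and the coordinates are assumptions chosen for illustration):

```python
import numpy as np

def exp_cov(h, sill=1.0, corr_range=10.0):
    """Exponential covariance model (an assumed spatial correlation)."""
    return sill * np.exp(-np.asarray(h) / corr_range)

def sk_variance(sample_xy, target_xy, sill=1.0, corr_range=10.0):
    """Simple-kriging estimation variance at a target point: it depends
    only on sample locations and the covariance, never on data values."""
    d = np.linalg.norm(sample_xy[:, None] - sample_xy[None], axis=-1)
    C = exp_cov(d, sill, corr_range)
    c0 = exp_cov(np.linalg.norm(sample_xy - target_xy, axis=-1),
                 sill, corr_range)
    return sill - c0 @ np.linalg.solve(C, c0)

target = np.array([5.0, 5.0])
near = np.array([[4.0, 5.0], [6.0, 5.0], [5.0, 4.0]])  # tight network
far = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # sparse network
print(sk_variance(near, target) < sk_variance(far, target))
```

    Comparing such variance indices across candidate configurations is exactly the kind of design decision the methodology supports without any field measurement.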

  6. Early Sexual Debut: A Risk Factor for STIs/HIV Acquisition Among a Nationally Representative Sample of Adults in Nepal.

    PubMed

    Shrestha, Roman; Karki, Pramila; Copenhaver, Michael

    2016-02-01

    While early sexual debut is highly prevalent in Nepal, its link to sexually transmitted infection (STI)/HIV risk factors has not been explored at a national level. The objective of this study was to assess the potential association between early sexual debut and risk factors for STIs/HIV acquisition, including sexual risk behaviors, sexual violence, and teenage pregnancy, among adults in Nepal. Data were taken from the nationally representative Nepal Demographic Health Survey (2011), which employed a two-stage complex design to collect data. A sample of 12,756 adults (ages 15-49 years) was included. Multivariate logistic regression models, adjusted for demographic characteristics, were used to assess the association between early sexual debut and STIs/HIV-related risk factors. The prevalence of early sexual debut in this sample was 39.2%, with a mean age of coital debut of 17.9 years. After adjusting for potential confounders, individuals with early sexual debut were significantly more likely to report a history of STIs (aOR 1.19; 95% CI 1.06-1.35) and had a significantly higher risk profile, including having multiple sex partners (aOR 2.14; 95% CI 1.86-2.47), inconsistent condom use (aOR 0.72; 95% CI 0.61-0.86), paying for sex (aOR 1.61; 95% CI 1.14-2.27), a history of sexual violence (aOR 1.99; 95% CI 1.63-2.43), and teenage pregnancy (aOR 12.87; 95% CI 11.62-14.26). Individuals with early sexual debut are more likely to engage in risk behaviors that place them at increased risk of STIs/HIV acquisition. STIs/HIV prevention strategies should aim at delaying sexual debut to decrease the disproportionate burden of adverse health outcomes, including STIs/HIV, among individuals in Nepal.

  7. An Integrated Data Acquisition / User Request/ Processing / Delivery System for Airborne Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Chapman, B.; Chu, A.; Tung, W.

    2003-12-01

    Airborne science data has historically played an important role in developing the scientific underpinnings for spaceborne missions. When the science community determines the need for new types of spaceborne measurements, airborne campaigns are often crucial in mitigating risk for these future missions. However, full exploitation of the acquired data may be difficult due to its experimental and transitory nature. Externally, the most problematic issue (particularly for those not involved in requesting the data acquisitions) may be the difficulty of searching for, requesting, and receiving the data, or even knowing the data exist. This can result in a rather small, insular community of users for these data sets. Internally, the difficulty for the project is in maintaining a robust processing and archival system during periods of changing mission priorities and evolving technologies. The NASA/JPL Airborne Synthetic Aperture Radar (AIRSAR) has acquired data for a large and varied community of scientists and engineers for 15 years. AIRSAR is presently supporting current NASA Earth Science Enterprise experiments, such as the Soil Moisture EXperiment (SMEX) and the Cold Land Processes experiment (CLPX), as well as experiments conducted as many as 10 years ago. During that time, its processing, data ordering, and data delivery system has undergone evolutionary change as the cost and capability of resources have improved. AIRSAR now has a fully integrated data acquisition/user request/processing/delivery system through which most components of the data fulfillment process communicate via shared information within a database. The integration of these functions has reduced errors and increased the throughput of processed data to customers.

  8. A multiple process solution to the logical problem of language acquisition*

    PubMed Central

    MACWHINNEY, BRIAN

    2006-01-01

    Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750

  9. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.

  10. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Methamphetamine profiling can yield information about sources of supply, trafficking routes, distribution patterns, and conspiracy links. The precursor and synthetic method used in clandestine manufacture can be inferred from analysis of the minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in their chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small; finding links between samples is therefore more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column, and four internal standards, and was designed to maximize the amount of impurities while minimizing the amount of methamphetamine. After GC-FID analysis, the raw data must be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. These modules collect the results into an Excel file and then correct the retention time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient between two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
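
    The similarity-scoring step can be sketched as follows; the impurity-peak response vectors are invented toy numbers, standing in for retention-time-aligned chromatogram peak areas.

```python
import numpy as np

# Two hypothetical impurity-peak response vectors (same case, peaks
# already aligned by retention time)
sample_a = np.array([0.12, 0.85, 0.33, 0.05, 0.41])
sample_b = np.array([0.11, 0.88, 0.30, 0.06, 0.43])

# Normalize total response to correct for preparation/instrument deviation
a = sample_a / sample_a.sum()
b = sample_b / sample_b.sum()

# Pearson correlation coefficient as the similarity score
r = np.corrcoef(a, b)[0, 1]
print(round(r, 4))
```

    A threshold such as r > 0.99 then flags pairs of samples as likely sharing a common origin, matching the criterion used in the study.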

  11. Mars Science Laboratory CHIMRA: A Device for Processing Powdered Martian Samples

    NASA Technical Reports Server (NTRS)

    Sunshine, Daniel

    2010-01-01

    The CHIMRA is an extraterrestrial sample acquisition and processing device for the Mars Science Laboratory that emphasizes robustness and adaptability through design configuration. This work reviews the guidelines utilized to invent the initial CHIMRA and the strategy employed in advancing the design; these principles will be discussed in relation to both the final CHIMRA design and similar future devices. The computational synthesis necessary to mature a boxed-in impact-generating mechanism will be presented alongside a detailed mechanism description. Results from the development testing required to advance the design for a highly-loaded, long-life and high-speed bearing application will be presented. Lessons learned during the assembly and testing of this subsystem as well as results and lessons from the sample-handling development test program will be reviewed.

  12. 27 CFR 555.184 - Statements of process and samples.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 3 2011-04-01 2010-04-01 true Statements of process and samples. 555.184 Section 555.184 Alcohol, Tobacco Products, and Firearms BUREAU OF ALCOHOL, TOBACCO... regard to any plastic explosive or to any detection agent that is to be introduced into a...

  13. 27 CFR 555.184 - Statements of process and samples.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 3 2010-04-01 2010-04-01 false Statements of process and samples. 555.184 Section 555.184 Alcohol, Tobacco Products, and Firearms BUREAU OF ALCOHOL, TOBACCO... regard to any plastic explosive or to any detection agent that is to be introduced into a...

  14. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods

    DTIC Science & Technology

    2014-07-01

    policy document entitled The National Strategy for Biosurveillance was released (White House, July 2012) as part of the National Security Strategy...concept of leveraging existing capabilities to “scan and discern the environment,” which implies the use of current technical biosurveillance ...testing of existing sample-processing technologies are expected to enable in silico evaluations of biosurveillance methodologies, equipment, and

  15. Challenging genosensors in food samples: The case of gluten determination in highly processed samples.

    PubMed

    Martín-Fernández, Begoña; de-los-Santos-Álvarez, Noemí; Martín-Clemente, Juan Pedro; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2016-01-01

    Electrochemical genosensors have undergone enormous development in the last decades, but only very few have achieved quantification of target content in highly processed food samples. The detection of allergens, and particularly gluten, is challenging because legislation establishes a threshold of 20 ppm for labeling as gluten-free, yet most genosensors express their results as DNA concentration or DNA copies. This paper describes the first attempt to correlate the genosensor response with the wheat content of real samples, even in the case of highly processed food. A sandwich-based format was used, comprising a capture probe immobilized onto a screen-printed gold electrode and a signaling probe functionalized with fluorescein isothiocyanate (FITC), both hybridizing with the target. The hybridization event was electrochemically monitored by adding an anti-FITC peroxidase (antiFITC-HRP) and its substrate, tetramethylbenzidine. Binary model mixtures, as a reference material, and real samples were analyzed. DNA from food was extracted and a fragment encoding the immunodominant peptide of α2-gliadin amplified by a tailored PCR. The sensor was able to selectively distinguish cereals toxic for celiac patients, such as different varieties of wheat, barley, rye and oats, from non-toxic plants. As little as 0.001% (10 mg/kg) of wheat flour in an inert matrix was reliably detected, which directly competes with the current method of choice for DNA detection, real-time PCR. A good correlation with the official immunoassay was found in highly processed food samples.

  16. Real-time multi-camera video acquisition and processing platform for ADAS

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio

    2016-04-01

    The paper presents the design of a real-time and low-cost embedded system for image acquisition and processing in Advanced Driver Assisted Systems (ADAS). The system adopts a multi-camera architecture to provide a panoramic view of the objects surrounding the vehicle. Fish-eye lenses are used to achieve a large Field of View (FOV). Since they introduce radial distortion of the images projected on the sensors, a real-time algorithm for their correction is also implemented in a pre-processor. An FPGA-based hardware implementation, re-using IP macrocells for several ADAS algorithms, allows for real-time processing of input streams from VGA automotive CMOS cameras.
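
    A minimal sketch of the polynomial radial re-mapping a pre-processor of this kind performs; the distortion model, coefficients, and image center are assumptions for illustration, not the paper's calibration.

```python
import numpy as np

def undistort_points(pts, k1, k2, center):
    """Correct radial distortion with an assumed polynomial model:
    r_corrected = r * (1 + k1*r^2 + k2*r^4), applied about the center."""
    d = pts - center
    r2 = np.sum(d**2, axis=1, keepdims=True)  # squared radius per point
    scale = 1 + k1 * r2 + k2 * r2**2
    return center + d * scale

center = np.array([320.0, 240.0])        # assumed optical center (VGA)
pts = np.array([[320.0, 240.0],          # center pixel: unchanged
                [420.0, 240.0]])         # off-center pixel: pushed out
out = undistort_points(pts, k1=1e-6, k2=0.0, center=center)
print(out)
```

    In an FPGA implementation, the same mapping is typically realized as a precomputed per-pixel lookup table rather than evaluated per frame.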

  17. A review of breast tomosynthesis. Part I. The image acquisition process

    SciTech Connect

    Sechopoulos, Ioannis

    2013-01-15

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have become possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to eventually replace mammography for breast cancer screening. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper reviews the research on issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion paper reviews all other aspects of breast tomosynthesis imaging, including the reconstruction process.

  18. Acquisition of material properties in production for sheet metal forming processes

    SciTech Connect

    Heingärtner, Jörg; Hora, Pavel; Neumann, Anja; Hortig, Dirk; Rencki, Yasar

    2013-12-16

    In past work, a measurement system for the in-line acquisition of material properties was developed at IVP. This system is based on the non-destructive eddy-current principle. Using this system, 100% control of the material properties of the processed material is possible. The system can be used for ferromagnetic materials such as standard steels as well as paramagnetic materials such as aluminum and stainless steel. Used as an in-line measurement system, it can be configured as a stand-alone system to control material properties and sort out inapplicable material, or as part of a control system for the forming process. In both cases, the acquired data can be used as input data for numerical simulations, e.g. stochastic simulations based on real-world data.

  19. The Influence of Working Memory and Phonological Processing on English Language Learner Children's Bilingual Reading and Language Acquisition

    ERIC Educational Resources Information Center

    Swanson, H. Lee; Orosco, Michael J.; Lussier, Cathy M.; Gerber, Michael M.; Guzman-Orth, Danielle A.

    2011-01-01

    In this study, we explored whether the contribution of working memory (WM) to children's (N = 471) 2nd language (L2) reading and language acquisition was best accounted for by processing efficiency at a phonological level and/or by executive processes independent of phonological processing. Elementary school children (Grades 1, 2, & 3) whose…

  20. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    SciTech Connect

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  1. Skills Acquisition in Plantain Flour Processing Enterprises: A Validation of Training Modules for Senior Secondary Schools

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi; Nlebem, Bernard S.

    2013-01-01

    This study sought to validate training modules that can help provide requisite skills for senior secondary school students in plantain flour processing enterprises for self-employment and to enable them to pass their examinations. The study covered Rivers State. A purposive sampling technique was used to select a sample size of 205. Two sets of structured…

  2. Optimizing hippocampal segmentation in infants utilizing MRI post-acquisition processing.

    PubMed

    Thompson, Deanne K; Ahmadzai, Zohra M; Wood, Stephen J; Inder, Terrie E; Warfield, Simon K; Doyle, Lex W; Egan, Gary F

    2012-04-01

    This study aims to determine the most reliable method for infant hippocampal segmentation by comparing magnetic resonance (MR) imaging post-acquisition processing techniques: contrast to noise ratio (CNR) enhancement, or reformatting to standard orientation. MR scans were performed with a 1.5 T GE scanner to obtain dual echo T2 and proton density (PD) images at term equivalent (38-42 weeks' gestational age). 15 hippocampi were manually traced four times on ten infant images by 2 independent raters on the original T2 image, as well as images processed by: a) combining T2 and PD images (T2-PD) to enhance CNR; then b) reformatting T2-PD images perpendicular to the long axis of the left hippocampus. CNRs and intraclass correlation coefficients (ICC) were calculated. T2-PD images had 17% higher CNR (15.2) than T2 images (12.6). Original T2 volumes' ICC was 0.87 for rater 1 and 0.84 for rater 2, whereas T2-PD images' ICC was 0.95 for rater 1 and 0.87 for rater 2. Reliability of hippocampal segmentation on T2-PD images was not improved by reformatting images (rater 1 ICC = 0.88, rater 2 ICC = 0.66). Post-acquisition processing can improve CNR and hence reliability of hippocampal segmentation in neonate MR scans when tissue contrast is poor. These findings may be applied to enhance boundary definition in infant segmentation for various brain structures or in any volumetric study where image contrast is sub-optimal, enabling hippocampal structure-function relationships to be explored.
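
    The contrast-to-noise ratio driving the comparison above can be computed with one common definition (mean intensity difference over pooled noise standard deviation); the intensity distributions below are synthetic stand-ins for the T2 and T2-PD image regions.

```python
import numpy as np

def cnr(region_a, region_b):
    """Contrast-to-noise ratio between two tissue regions: absolute
    mean difference divided by the pooled noise standard deviation."""
    noise = np.sqrt((region_a.std() ** 2 + region_b.std() ** 2) / 2)
    return abs(region_a.mean() - region_b.mean()) / noise

rng = np.random.default_rng(2)
hippocampus = rng.normal(100, 5, 1000)   # hypothetical voxel intensities
background = rng.normal(80, 5, 1000)     # hypothetical surrounding tissue

value = cnr(hippocampus, background)
print(round(value, 1))
```

    Combining two co-registered echoes (as with the T2-PD images in the study) raises the signal difference faster than the noise, which is why the combined images yielded the 17% CNR gain reported.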

  3. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    SciTech Connect

    Castillo, S; Castillo, R; Castillo, E; Pan, T; Ibbott, G; Balter, P; Hobbs, B; Dai, J; Guerrero, T

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over the available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly, by 27% and 28% for the first and second cohorts, respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross-scan average at the 0.05 significance level. Artifact presence for both the clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p<0.032 clinical; ρ=0.296, p<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase

  4. Bimeasures and Sampling Theorems for Weakly Harmonizable Processes.

    DTIC Science & Technology

    1982-09-27

    representation theorem and then such a measure has a unique extension of being a Radon measure by the standard theory of Bourbaki [1], one refers to each such...nontrivial, and this approach has other drawbacks. We therefore do not consider this set in the general format of the sampling theory of random processes...for a more general Cramer class. To carry out this analysis, it is necessary to use the properties of bimeasures. Some aspects of the bimeasure theory

  5. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
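
    The category-weighted sampling idea can be sketched in a toy setting with two categories instead of eight: over-sample the rare raining portion of a grid box, then reweight so the estimate of the box-averaged process rate stays unbiased. All numbers (rain fraction, rates, sample counts) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Grid-box subcolumns: a small fraction rains, where the process is fast
n_cols = 100_000
raining = rng.random(n_cols) < 0.05
rate = np.where(raining, 10.0, 0.1)   # hypothetical process rate
truth = rate.mean()                    # exact grid-box average

# Importance sampling: prescribe a high sampling density for the rare
# raining category, then reweight each draw by (category fraction /
# sampling density) to keep the estimator unbiased
n_samp = 200
p_rain = 0.5
pick_rain = rng.random(n_samp) < p_rain
idx_rain = rng.choice(np.flatnonzero(raining), n_samp)
idx_dry = rng.choice(np.flatnonzero(~raining), n_samp)
idx = np.where(pick_rain, idx_rain, idx_dry)
frac_rain = raining.mean()
weights = np.where(raining[idx], frac_rain / p_rain,
                   (1 - frac_rain) / (1 - p_rain))
estimate = np.mean(weights * rate[idx])
print(round(truth, 3), round(estimate, 3))
```

    With only 200 draws, the reweighted estimate tracks the exact average far better than plain Monte Carlo would, because most of the variance lives in the rare raining category that is now sampled heavily.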

  6. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  7. A feasibility study of a PET/MRI insert detector using strip-line and waveform sampling data acquisition

    NASA Astrophysics Data System (ADS)

    Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Wyrwicz, A. M.; Li, L.; Kao, C.-M.

    2015-06-01

    We are developing a time-of-flight Positron Emission Tomography (PET) detector by using silicon photo-multipliers (SiPM) on a strip-line and high speed waveform sampling data acquisition. In this design, multiple SiPMs are connected on a single strip-line and signal waveforms on the strip-line are sampled at two ends of the strip to reduce readout channels while fully exploiting the fast time response of SiPMs. In addition to the deposited energy and time information, the position of the hit SiPM along the strip-line is determined by the arrival time difference of the waveform. Due to the insensitivity of the SiPMs to magnetic fields and the compact front-end electronics, the detector approach is highly attractive for developing a PET insert system for a magnetic resonance imaging (MRI) scanner to provide simultaneous PET/MR imaging. To investigate the feasibility, experimental tests using prototype detector modules have been conducted inside a 9.4 T small animal MRI scanner (Bruker BioSpec 94/30 imaging spectrometer). On the prototype strip-line board, 16 SiPMs (5.2 mm pitch) are installed on two strip-lines and coupled to 2×8 LYSO scintillators (5.0×5.0×10.0 mm3 with 5.2 mm pitch). The outputs of the strip-line boards are connected to a Domino-Ring-Sampler (DRS4) evaluation board for waveform sampling. Preliminary experimental results show that the effect of interference on the MRI image due to the PET detector is negligible and that PET detector performance is comparable with the results measured outside the MRI scanner.
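
    The position decoding from the arrival-time difference at the two ends of a strip-line can be sketched as below; the propagation speed is an assumed value (roughly half the speed of light, typical for a microstrip), not a measured property of the prototype board.

```python
# Decode hit position along a strip-line from the waveform arrival-time
# difference at its two ends (all numbers are illustrative assumptions).
v = 0.15  # assumed signal propagation speed on the strip, mm/ps

def hit_position(t_left_ps, t_right_ps):
    """Position relative to the strip center (negative = toward the left
    end): the end nearer the hit sees the waveform earlier."""
    return 0.5 * v * (t_left_ps - t_right_ps)

# A hit two SiPM pitches (10.4 mm) left of center arrives earlier at the
# left end by 2 * 10.4 / v picoseconds
dt = 2 * 10.4 / v
print(round(hit_position(0.0, dt), 1))  # prints -10.4
```

    With 5.2 mm SiPM pitch, the decoded position only needs to be resolved to the nearest pitch to identify which SiPM fired, which relaxes the timing precision the waveform sampler must deliver.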

  8. A feasibility study of a PET/MRI insert detector using strip-line and waveform sampling data acquisition.

    PubMed

    Kim, H; Chen, C-T; Eclov, N; Ronzhin, A; Murat, P; Ramberg, E; Los, S; Wyrwicz, Alice M; Li, Limin; Kao, C-M

    2015-06-01

    We are developing a time-of-flight Positron Emission Tomography (PET) detector by using silicon photo-multipliers (SiPM) on a strip-line and high speed waveform sampling data acquisition. In this design, multiple SiPMs are connected on a single strip-line and signal waveforms on the strip-line are sampled at two ends of the strip to reduce readout channels while fully exploiting the fast time response of SiPMs. In addition to the deposited energy and time information, the position of the hit SiPM along the strip-line is determined by the arrival time difference of the waveform. Due to the insensitivity of the SiPMs to magnetic fields and the compact front-end electronics, the detector approach is highly attractive for developing a PET insert system for a magnetic resonance imaging (MRI) scanner to provide simultaneous PET/MR imaging. To investigate the feasibility, experimental tests using prototype detector modules have been conducted inside a 9.4 Tesla small animal MRI scanner (Bruker BioSpec 94/30 imaging spectrometer). On the prototype strip-line board, 16 SiPMs (5.2 mm pitch) are installed on two strip-lines and coupled to 2 × 8 LYSO scintillators (5.0 × 5.0 × 10.0 mm3 with 5.2 mm pitch). The outputs of the strip-line boards are connected to a Domino-Ring-Sampler (DRS4) evaluation board for waveform sampling. Preliminary experimental results show that the effect of interference on the MRI image due to the PET detector is negligible and that PET detector performance is comparable with the results measured outside the MRI scanner.

  9. A feasibility study of a PET/MRI insert detector using strip-line and waveform sampling data acquisition

    PubMed Central

    Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Wyrwicz, Alice M.; Li, Limin; Kao, C.-M.

    2014-01-01

    We are developing a time-of-flight Positron Emission Tomography (PET) detector by using silicon photo-multipliers (SiPM) on a strip-line and high speed waveform sampling data acquisition. In this design, multiple SiPMs are connected on a single strip-line and signal waveforms on the strip-line are sampled at two ends of the strip to reduce readout channels while fully exploiting the fast time response of SiPMs. In addition to the deposited energy and time information, the position of the hit SiPM along the strip-line is determined by the arrival time difference of the waveform. Due to the insensitivity of the SiPMs to magnetic fields and the compact front-end electronics, the detector approach is highly attractive for developing a PET insert system for a magnetic resonance imaging (MRI) scanner to provide simultaneous PET/MR imaging. To investigate the feasibility, experimental tests using prototype detector modules have been conducted inside a 9.4 Tesla small animal MRI scanner (Bruker BioSpec 94/30 imaging spectrometer). On the prototype strip-line board, 16 SiPMs (5.2 mm pitch) are installed on two strip-lines and coupled to 2 × 8 LYSO scintillators (5.0 × 5.0 × 10.0 mm3 with 5.2 mm pitch). The outputs of the strip-line boards are connected to a Domino-Ring-Sampler (DRS4) evaluation board for waveform sampling. Preliminary experimental results show that the effect of interference on the MRI image due to the PET detector is negligible and that PET detector performance is comparable with the results measured outside the MRI scanner. PMID:25937685

  10. Data acquisition and processing system of the electron cyclotron emission imaging system of the KSTAR tokamak.

    PubMed

    Kim, J B; Lee, W; Yun, G S; Park, H K; Domier, C W; Luhmann, N C

    2010-10-01

    An innovative electron cyclotron emission imaging (ECEI) diagnostic system for the Korean Superconducting Tokamak Advanced Research (KSTAR) device produces a large amount of data, so the design of its data acquisition and processing system must accommodate the large data production and flow. The system design is based on a layered structure scalable to future extensions that accommodate increasing data demands. A software architecture that allows web-based monitoring of the operation status, remote experiments, and data analysis is discussed. The operating software will help machine operators and users validate the acquired data promptly, prepare the next discharge, and enhance experiment performance and data analysis in a distributed environment.

  11. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from multi-channel plate (MCP) based time-of-flight detectors and ionization-based energy detectors such as silicon diodes and gas ionization chambers (GIC) is introduced. The system is based on commercial digitizers and custom software. It forms part of a ToF-ERDA spectrometer, which must be able to detect recoil atoms of many different species and energies. Compared to the currently used analogue electronics, the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software and digital pulse processing algorithms suited to this application are described in detail.
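As a flavor of the digital pulse processing such a digitizer-based system performs, a minimal timing algorithm interpolates the crossing of a constant fraction of the pulse peak. This is a generic digital-CFD-style sketch, not the authors' algorithm.

```python
import numpy as np

def interpolated_threshold_time(samples, dt_ns, frac=0.3):
    """Time at which a digitized pulse crosses a constant fraction of its peak.

    Find the first sample at or above frac * peak and linearly interpolate
    between it and the previous sample; using a fraction of the peak rather
    than a fixed threshold reduces amplitude-dependent timing walk.
    """
    samples = np.asarray(samples, dtype=float)
    threshold = frac * samples.max()
    i = int(np.nonzero(samples >= threshold)[0][0])
    if i == 0:
        return 0.0
    # Linear interpolation between samples i-1 and i.
    y0, y1 = samples[i - 1], samples[i]
    t_frac = (threshold - y0) / (y1 - y0)
    return (i - 1 + t_frac) * dt_ns
```

The same recorded waveform can be re-timed offline with different fractions or interpolation schemes, which is the after-the-measurement optimization the abstract highlights.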

  12. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, partly owing to outdated methodology and a poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. Regarding the second-largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. Frequency of suspended sediment sampling is very low along the river

  13. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates data acquisition and processing, and displays selected wind profile parameters such as Doppler shift, power distribution, and wind directions and velocities. The Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from wind measurements. A general overview of the system and the DAPS as well as the coherent Doppler lidar system is presented in this paper.
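The real-time removal of aircraft-induced Doppler shift amounts to projecting the INS/GPS velocity onto the beam direction and subtracting it from the measured line-of-sight speed. A hedged sketch: the factor 2/wavelength is the standard coherent-lidar Doppler relation, and the numbers are illustrative, not taken from DAWN.

```python
import numpy as np

def corrected_radial_wind(f_doppler_hz, wavelength_m, aircraft_vel_ms, beam_unit):
    """Remove the aircraft's own motion from a coherent-lidar Doppler shift.

    The measured shift is f = 2 * v_los / wavelength, where v_los contains
    both wind and platform motion; projecting the INS/GPS aircraft velocity
    onto the beam unit vector and subtracting leaves the radial wind.
    """
    v_los_total = 0.5 * wavelength_m * f_doppler_hz
    v_aircraft_radial = float(np.dot(aircraft_vel_ms, beam_unit))
    return v_los_total - v_aircraft_radial

# Illustrative numbers: 2.05-um lidar, aircraft at 200 m/s along the beam,
# true radial wind 10 m/s.
wavelength = 2.05e-6
f_shift = 2 * (200.0 + 10.0) / wavelength
wind = corrected_radial_wind(f_shift, wavelength,
                             np.array([200.0, 0.0, 0.0]),
                             np.array([1.0, 0.0, 0.0]))
```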

  14. How does the interaction between spelling and motor processes build up during writing acquisition?

    PubMed

    Kandel, Sonia; Perret, Cyril

    2015-03-01

    How do we recall a word's spelling? How do we produce the movements to form the letters of a word? Writing involves several processing levels. Surprisingly, researchers have focused either on spelling or on motor production. However, these processes interact and cannot be studied separately. Spelling processes cascade into movement production. For example, in French, producing the letters PAR in the orthographically irregular word PARFUM (perfume) delays motor production with respect to the same letters in the regular word PARDON (pardon). Orthographic regularity refers to the possibility of spelling a word correctly by applying the most frequent sound-letter conversion rules. The present study examined how the interaction between spelling and motor processing builds up during writing acquisition. French 8-10 year old children participated in the experiment. This is the age at which handwriting skills start to become automatic. The children wrote regular and irregular words that could be frequent or infrequent. They wrote on a digitizer so we could collect data on latency, movement duration and fluency. The results revealed that the interaction between spelling and motor processing was already present at age 8. It became more adult-like at ages 9 and 10. Before starting to write, processing irregular words took longer than processing regular words. This processing load spread into movement production: it increased writing duration and rendered the movements more dysfluent. Word frequency affected latencies and cascaded into production, modulating writing duration but not movement fluency. Writing infrequent words took longer than writing frequent words. The data suggest that orthographic regularity has a stronger impact on writing than word frequency; the two factors do not cascade to the same extent.

  15. Ultrasonic microdevices for integrated on-chip biological sample processing

    NASA Astrophysics Data System (ADS)

    Dougherty, George Michael

    Integrated lab-on-a-chip devices, also known as micro total analysis systems (mu-TAS), are expected to play a leading role in biological research and medicine in the 21st century, and on-chip sample processing is a key function of such devices. A new class of ultrasonic microfluidic sample processing devices is presented, based on a single common fundamental unit---a capacitive micromachined ultrasonic transducer---and fabricated using a single common process. Arrays of the transducers are integrated with fluidic microchannels, allowing devices with different functions to be realized simply by altering the physical arrangement and electrical drive signals of the array elements. The efficient, in-plane manipulation of particle-laden liquids is achieved by the use of phased, co-planar transducers, allowing the generation of in-plane, cavity-mode standing waves in the microchannels, and permitting the efficient manipulation of suspended particles such as cells by acoustic radiation forces. Fabricated prototype devices include several types of ultrasonic particle filters, flow-through particle fractionators, particle collimators for cell alignment, devices for the ultrasonic lysing of cells, ultrasonic pumps and ultrasonic mixers. As part of the development effort, an investigation of the thin film silicon material known as "permeable polysilicon" was performed, resulting in the discovery that the material's liquid-permeability properties are caused by nanoscale pores that form spontaneously within an unusual morphological growth regime. A new, one-step porous polysilicon process is presented that allows the quick and easy fabrication of porous polysilicon films for a wide range of applications. The process is used to fabricate the ultrasonic immersion transducers used in the device arrays, and allows the convenient fabrication of a wide variety of microstructures that would be difficult or impossible to fabricate by other means. In addition, a new simulation code is
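For the acoustic radiation forces mentioned above, the standard textbook (Yosioka-Kawasima/Gor'kov) expression for the primary force on a small particle in a 1-D standing wave conveys the operating principle of such filters and collimators. The expression and the material numbers below are textbook assumptions, not taken from this work.

```python
import math

def acoustophoretic_force(p0, volume, beta_f, beta_p, rho_f, rho_p, k, x):
    """Primary acoustic radiation force on a small particle in a 1-D standing wave.

    F(x) = (pi * p0^2 * V * beta_f / (2 * lambda)) * Phi * sin(2 k x), with
    contrast factor Phi = (5 rho_p - 2 rho_f)/(2 rho_p + rho_f) - beta_p/beta_f.
    Particles with positive Phi (most cells in water) are driven toward the
    pressure nodes, which is how standing-wave devices align and trap them.
    """
    phi = (5 * rho_p - 2 * rho_f) / (2 * rho_p + rho_f) - beta_p / beta_f
    wavelength = 2 * math.pi / k
    return (math.pi * p0 ** 2 * volume * beta_f / (2 * wavelength)) * phi * math.sin(2 * k * x)

# Cell-like 10-um particle in water, half-wave resonance across a microchannel
# (illustrative numbers: 100 kPa pressure amplitude, 1.5 mm acoustic wavelength).
k = 2 * math.pi / 1.5e-3
f_at_node = acoustophoretic_force(1e5, 5.2e-16, 4.5e-10, 2.5e-10, 1000.0, 1050.0, k, 0.0)
f_off_node = acoustophoretic_force(1e5, 5.2e-16, 4.5e-10, 2.5e-10, 1000.0, 1050.0, k, 1.5e-3 / 8)
```

The force vanishes at the node and is maximal an eighth of a wavelength away, so suspended particles collect in well-defined planes inside the channel.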

  16. X-Ray Computed Tomography: The First Step in Mars Sample Return Processing

    NASA Technical Reports Server (NTRS)

    Welzenbach, L. C.; Fries, M. D.; Grady, M. M.; Greenwood, R. C.; McCubbin, F. M.; Zeigler, R. A.; Smith, C. L.; Steele, A.

    2017-01-01

    The Mars 2020 rover mission will collect and cache samples from the martian surface for possible retrieval and subsequent return to Earth. If the samples are returned, that mission would likely present an opportunity to analyze returned Mars samples within a geologic context on Mars. In addition, it may provide definitive information about the existence of past or present life on Mars. Mars sample return presents unique challenges for the collection, containment, transport, curation and processing of samples [1]. Foremost in the processing of returned samples are the closely paired considerations of life detection and Planetary Protection. In order to achieve Mars Sample Return (MSR) science goals, reliable analyses will depend on overcoming some challenging signal/noise-related issues where sparse martian organic compounds must be reliably analyzed against the contamination background. While reliable analyses will depend on initial clean acquisition and robust documentation of all aspects of developing and managing the cache [2], there needs to be a reliable sample handling and analysis procedure that accounts for a variety of materials which may or may not contain evidence of past or present martian life. A recent report [3] suggests that a defined set of measurements should be made to effectively inform both science and Planetary Protection, when applied in the context of the two competing null hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. The defined measurements would include a phased approach that would be accepted by the community to preserve the bulk of the material, but provide unambiguous science data that can be used and interpreted by various disciplines. Foremost is the concern that the initial steps would ensure the pristine nature of the samples. Preliminary, non-invasive techniques such as computed X-ray tomography (XCT) have been suggested as the first method to interrogate and

  17. Durability patch and damage dosimeter: a portable battery-powered data acquisition computer and durability patch design process

    NASA Astrophysics Data System (ADS)

    Haugse, Eric D.; Johnson, Patrick E.; Smith, David L.; Rogers, Lynn C.

    2000-05-01

    Repairs of secondary structure can be accomplished by restoring structural integrity at the damaged area and increasing the structure's damping in the repair region. Increased damping leads to a reduction in resonant response and a repair that will survive for the life of the aircraft. In order to design a repair with effective damping properties, the in-service structural strains and temperatures must be known. A rugged, small and lightweight data acquisition unit called the Damage Dosimeter has been developed to accomplish this task with minimal impact to the aircraft system. Running autonomously on battery power, the Damage Dosimeter measures three channels of strain at sample rates as high as 15 kilo-samples per second and a single channel of temperature. It merges the functionality of both analog signal conditioning and a digital single-board computer on one 3.5 by 5 inch card. The Damage Dosimeter allows an engineer to easily instrument an in-service aircraft to assess the structural response characteristics necessary to properly select damping materials. This information, in conjunction with analysis and design procedures, can be used to design a repair with optimum effectiveness. This paper will present the motivation behind the development of the Damage Dosimeter along with an overview of its functional capabilities and design. In-service flight data and analysis results will be discussed for two applications. The paper will also describe how the Damage Dosimeter is used to enable the Durability Patch design process.

  18. Laser airborne remote sensing real-time acquisition, processing, and control system

    NASA Astrophysics Data System (ADS)

    Kelly, Brian T.; Pierson, Robert E.; Dropka, T. J.; Dowling, James A.; Lang, L. M.; Fox, Marsha J.

    1997-10-01

    The US Air Force Phillips Laboratory is evaluating the feasibility of long-standoff-range remote sensing of gaseous species present in trace amounts in the atmosphere. Extensive system integration in the laboratory and an airborne test are leading to remote sensing ground tests and airborne missions within the next year. This paper describes the design, external interfaces, and initial performance of the Laser Airborne Remote Sensing acquisition, processing, and control system to be deployed on the Phillips Laboratory NC-135 research aircraft for differential absorption lidar system performance tests. The dual-CPU VME-based real-time computer system synchronizes experiment timing and pulsed CO2 laser operation up to 30 Hz while controlling optical subsystem components such as a laser grating, receiver gain, mirror alignment, and laser shutters. This real-time system acquires high-rate detector signals from the outgoing and return laser pulses as well as low-rate health and status signals from the optical bench and the aircraft. Laser pulse and status data are processed and displayed in real time on one of four graphical user interfaces: one devoted to system control, one to remote mirror alignment, and two others for real-time data analysis and diagnostics. The dual-CPU design and multi-layered software decouple time-critical and non-critical tasks, allowing great flexibility in flight-time display and processing.

  19. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    DOE PAGES

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but is not repairable in case of malfunction (its modules are not manufactured anymore). The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  20. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview of the development of a data acquisition and processing system is presented for a pulsed, 2-micron coherent Doppler Lidar system located in NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system as well as the control software is reviewed and its requirements and unique features are discussed.

  1. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    SciTech Connect

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but is not repairable in case of malfunction (its modules are not manufactured anymore). The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  2. Experiment kits for processing biological samples inflight on SLS-2

    NASA Technical Reports Server (NTRS)

    Savage, P. D.; Hinds, W. E.; Jaquez, R.; Evans, J.; Dubrovin, L.

    1995-01-01

    This paper describes the development of an innovative, modular approach to packaging the instruments used to obtain and preserve the inflight rodent tissue and blood samples associated with hematology experiments on the Spacelab Life Sciences-2 (SLS-2) mission. The design approach organized the multitude of instruments into twelve 5- x 6- x 1-in. kits which were each used for a particular experiment. Each kit contained the syringes, vials, microscope slides, etc., necessary for processing and storing blood and tissue samples for one rat on a particular day. A total of 1245 components, packaged into 128 kits and stowed in 17 Zero® boxes, were required. Crewmembers found the design easy to use and laid out in a logical, simple configuration which minimized the chances for error during the complex procedures in flight. This paper also summarizes the inflight performance of the kits on SLS-2.

  3. Software Radio Processing of Sampled Bistatic Reflected GPS Data

    NASA Astrophysics Data System (ADS)

    Heckler, G.; Garrison, J. L.

    2001-12-01

    The software receiver approach to reflected GPS signal processing offers several advantages over delay-mapping receivers that employ real-time processing with hardware correlators. These include the ability to replay the same segment of data using different integration times or correlator spacings, and the flexibility of large arrays of correlators to map the complete glistening surface of all visible satellites. The disadvantage of software receivers is the need to record sampled data at higher than the Nyquist frequency, which generates a large volume of data; these large data sets consequently impose substantial computer processing requirements. A software receiver toolbox is being developed at Purdue University, in the MATLAB language, for the post-processing and visualization of reflected GPS waveforms. Application of this toolbox to airborne data, using the C/A code, will be demonstrated. Additionally, the interface of this post-processed data with retrieval algorithms developed for DMR data will be discussed. Use of the 272-processor IBM SP supercomputer for the reduction of larger data sets will also be described.
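The replay flexibility described, re-correlating the same recorded samples with arbitrary delay grids, can be sketched in a few lines. The abstract's toolbox is MATLAB; this is an equivalent illustration in Python, with a random stand-in for the 1023-chip C/A Gold code.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlate_delays(samples, replica, delays):
    """Correlation power versus code delay for one code period.

    Because the raw samples are recorded, the same data segment can be
    re-correlated with any delay grid or correlator spacing after the fact,
    the key flexibility over fixed hardware correlators.
    """
    return np.array([np.abs(np.dot(samples, np.roll(replica, d))) for d in delays])

# Stand-in +/-1 spreading code (a real C/A code is a 1023-chip Gold code).
code = rng.choice([-1.0, 1.0], size=256)
received = np.roll(code, 7)                  # signal delayed by 7 samples
power = correlate_delays(received, code, range(20))
```

The correlation peak recovers the 7-sample code delay; a delay-Doppler map of a reflected signal repeats this over a grid of Doppler-shifted replicas.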

  4. Adaptive sampling for learning gaussian processes using mobile sensor networks.

    PubMed

    Xu, Yunfei; Choi, Jongeun

    2011-01-01

    This paper presents a novel class of self-organizing sensing agents that adaptively learn an anisotropic, spatio-temporal Gaussian process using noisy measurements and move in order to improve the quality of the estimated covariance function. This approach is based on a class of anisotropic covariance functions of Gaussian processes introduced to model a broad range of spatio-temporal physical phenomena. The covariance function is assumed to be unknown a priori. Hence, it is estimated by the maximum a posteriori probability (MAP) estimator. The prediction of the field of interest is then obtained based on the MAP estimate of the covariance function. An optimal sampling strategy is proposed to minimize the information-theoretic cost function of the Fisher Information Matrix. Simulation results demonstrate the effectiveness and the adaptability of the proposed scheme.
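The two ingredients of the abstract, an anisotropic covariance with separate length scales per input dimension and a field prediction from the (MAP-estimated) covariance, can be sketched as follows. The squared-exponential form is a stand-in for the paper's covariance family, and all names and values are illustrative.

```python
import numpy as np

def aniso_cov(x1, x2, length_scales, sigma2=1.0):
    """Anisotropic squared-exponential covariance: one length scale per
    input dimension (e.g. x, y, t), so the process can vary faster along
    some axes than others."""
    d = (np.asarray(x1, float) - np.asarray(x2, float)) / np.asarray(length_scales, float)
    return sigma2 * np.exp(-0.5 * np.dot(d, d))

def gp_posterior_mean(X_train, y, X_test, length_scales, noise_var=1e-2):
    """Standard GP regression posterior mean, given the estimated covariance."""
    K = np.array([[aniso_cov(a, b, length_scales) for b in X_train] for a in X_train])
    K += noise_var * np.eye(len(X_train))
    k_star = np.array([[aniso_cov(t, b, length_scales) for b in X_train] for t in X_test])
    return k_star @ np.linalg.solve(K, np.asarray(y, float))

# Prediction near an observation is pulled toward its value.
m = gp_posterior_mean([[0.0], [1.0]], [1.0, 2.0], [[0.0]], [1.0])
```

In the paper the length scales themselves are unknown and estimated by MAP before this prediction step; here they are simply supplied.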

  5. Adaptive Sampling for Learning Gaussian Processes Using Mobile Sensor Networks

    PubMed Central

    Xu, Yunfei; Choi, Jongeun

    2011-01-01

    This paper presents a novel class of self-organizing sensing agents that adaptively learn an anisotropic, spatio-temporal Gaussian process using noisy measurements and move in order to improve the quality of the estimated covariance function. This approach is based on a class of anisotropic covariance functions of Gaussian processes introduced to model a broad range of spatio-temporal physical phenomena. The covariance function is assumed to be unknown a priori. Hence, it is estimated by the maximum a posteriori probability (MAP) estimator. The prediction of the field of interest is then obtained based on the MAP estimate of the covariance function. An optimal sampling strategy is proposed to minimize the information-theoretic cost function of the Fisher Information Matrix. Simulation results demonstrate the effectiveness and the adaptability of the proposed scheme. PMID:22163785

  6. Processing of syllable stress is functionally different from phoneme processing and does not profit from literacy acquisition

    PubMed Central

    Schild, Ulrike; Becker, Angelika B. C.; Friedrich, Claudia K.

    2014-01-01

    Speech is characterized by phonemes and prosody. Neurocognitive evidence supports the separate processing of each type of information. Therefore, one might suggest individual development of both pathways. In this study, we examine literacy acquisition in middle childhood. Children become aware of the phonemes in speech at that time and refine phoneme processing when they acquire an alphabetic writing system. We test whether an enhanced sensitivity to phonemes in middle childhood extends to other aspects of the speech signal, such as prosody. To investigate prosodic processing, we used stress priming. Spoken stressed and unstressed syllables (primes) preceded spoken German words with stress on the first syllable (targets). We orthogonally varied stress overlap and phoneme overlap between the primes and onsets of the targets. Lexical decisions and Event-Related Potentials (ERPs) for the targets were obtained for pre-reading preschoolers, reading pupils and adults. The behavioral and ERP results were largely comparable across all groups. The fastest responses were observed when the first syllable of the target word shared stress and phonemes with the preceding prime. ERP stress priming and ERP phoneme priming started 200 ms after the target word onset. Bilateral ERP stress priming was characterized by enhanced ERP amplitudes for stress overlap. Left-lateralized ERP phoneme priming replicates previously observed reduced ERP amplitudes for phoneme overlap. Groups differed in the strength of the behavioral phoneme priming and in the late ERP phoneme priming effect. The present results show that enhanced phonological processing in middle childhood is restricted to phonemes and does not extend to prosody. These results are indicative of two parallel processing systems for phonemes and prosody that might follow different developmental trajectories in middle childhood as a function of alphabetic literacy. PMID:24917838

  7. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion-barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards of overthrust exploration. However, the seismic data recorded in many overthrust areas are disappointingly poor. Challenges such as rough topography, complex subsurface structure, the presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results showing various swath geometries will be presented. We will also show recent examples of processing methods which have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  8. A computational model associating learning process, word attributes, and age of acquisition.

    PubMed

    Hidaka, Shohei

    2013-01-01

    We propose a new model-based approach linking word learning to the age of acquisition (AoA) of words; a new computational tool for understanding the relationships among word learning processes, psychological attributes, and word AoAs as measures of vocabulary growth. The computational model developed describes the distinct statistical relationships between three theoretical factors underpinning word learning and AoA distributions. Simply put, this model formulates how different learning processes, characterized by change in learning rate over time and/or by the number of exposures required to acquire a word, likely result in different AoA distributions depending on word type. We tested the model in three respects. The first analysis showed that the proposed model accounts for empirical AoA distributions better than a standard alternative. The second analysis demonstrated that the estimated learning parameters well predicted the psychological attributes, such as frequency and imageability, of words. The third analysis illustrated that the developmental trend predicted by our estimated learning parameters was consistent with relevant findings in the developmental literature on word learning in children. We further discuss the theoretical implications of our model-based approach.
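The abstract's framing, in which different learning rates and exposure requirements yield different AoA distributions, can be illustrated with a toy simulation. This is not the paper's actual model; all rates and thresholds below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_aoa(n_children, exposures_needed, rate_per_month, months=240):
    """Toy age-of-acquisition simulator: a word is acquired once cumulative
    exposures, arriving at a word-dependent Poisson rate, reach a required
    count. Higher exposure requirements or slower rates shift the AoA
    distribution later, which is the qualitative link the model formalizes."""
    counts = rng.poisson(rate_per_month, size=(n_children, months)).cumsum(axis=1)
    acquired = counts >= exposures_needed
    # First month in which the threshold is reached (argmax of a boolean row).
    return acquired.argmax(axis=1) + 1.0

easy = simulate_aoa(500, exposures_needed=20, rate_per_month=5.0)    # frequent word
hard = simulate_aoa(500, exposures_needed=200, rate_per_month=5.0)   # rare word
```

Comparing the two simulated AoA distributions shows the expected ordering: the word needing more exposures is, on average, acquired later.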

  9. Defense Acquisitions: How DOD Acquires Weapon Systems and Recent Efforts to Reform the Process

    DTIC Science & Technology

    2009-07-10

    developmental opportunities and establishing clear acquisitions career paths; (3) increasing the number of federal employees in critical skill areas; and (4... astrophysics and a host of other fields. In sum, a number of factors have contributed to defense acquisitions becoming a significant issue in the

  10. Sample processing obscures cancer-specific alterations in leukemic transcriptomes

    PubMed Central

    Dvinge, Heidi; Ries, Rhonda E.; Ilagan, Janine O.; Stirewalt, Derek L.; Meshinchi, Soheil

    2014-01-01

    Substantial effort is currently devoted to identifying cancer-associated alterations using genomics. Here, we show that standard blood collection procedures rapidly change the transcriptional and posttranscriptional landscapes of hematopoietic cells, resulting in biased activation of specific biological pathways; up-regulation of pseudogenes, antisense RNAs, and unannotated coding isoforms; and RNA surveillance inhibition. Affected genes include common mutational targets and thousands of other genes participating in processes such as chromatin modification, RNA splicing, T- and B-cell activation, and NF-κB signaling. The majority of published leukemic transcriptomes exhibit signals of this incubation-induced dysregulation, explaining up to 40% of differences in gene expression and alternative splicing between leukemias and reference normal transcriptomes. The effects of sample processing are particularly evident in pan-cancer analyses. We provide biomarkers that detect prolonged incubation of individual samples and show that keeping blood on ice markedly reduces changes to the transcriptome. In addition to highlighting the potentially confounding effects of technical artifacts in cancer genomics data, our study emphasizes the need to survey the diversity of normal as well as neoplastic cells when characterizing tumors. PMID:25385641

  11. A study of the influence of the data acquisition system sampling rate on the accuracy of measured acceleration loads for transport aircraft

    NASA Technical Reports Server (NTRS)

    Whitehead, Julia H.

    1992-01-01

    A research effort was initiated at the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC) to describe the relationship between the sampling rate and the accuracy of acceleration loads obtained from the data acquisition system of a transport aircraft. An accelerometer was sampled and digitized at a rate of 100 samples per second onboard a NASA Boeing 737 (B-737) flight research aircraft. Numerical techniques were used to reconstruct 2.5 hours of flight data into its original input waveform and then re-sample the waveform at rates of 4, 8, 16, and 32 samples per second. The peak-between-means counting technique and power spectral analysis were used to evaluate each sampling rate, using the 32 samples per second data as the baseline for comparison. This paper presents the results from these methods and includes, in Appendix A, the peak-between-means counting results used in a general fatigue analysis for each of the sampling rates.
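
    The effect of re-sampling on peak counts can be reproduced with a toy signal. The sketch below decimates a synthetic 100 samples/s waveform to the four lower rates and counts local maxima as a crude stand-in for peak-between-means counting; the waveform is hypothetical, not the B-737 flight data.

```python
import numpy as np

def peak_count(x):
    """Count interior local maxima: a crude stand-in for the
    peak-between-means counting used in the fatigue analysis."""
    return int(np.sum((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])))

# Synthetic load signal: a 1 Hz manoeuvre component plus a 10 Hz
# vibration ripple, generated at the recorder's 100 samples/s.
fs = 100
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.sin(2 * np.pi * 10.0 * t)

counts = {}
for rate in (4, 8, 16, 32):
    decimated = signal[:: fs // rate]  # re-sample by decimation
    counts[rate] = peak_count(decimated)
    print(f"{rate:2d} samples/s -> {counts[rate]} peaks detected")
```

    At 4 samples per second the 10 Hz ripple happens to be sampled exactly at its zero crossings and vanishes, so only the 1 Hz peaks survive, illustrating how low rates can understate the measured load spectrum.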

  12. Birds of a Feather: Moving Towards a Joint Acquisition Process to Support the Intelligence, Surveillance and Reconnaissance (ISR) Enterprise

    DTIC Science & Technology

    2009-04-01

    AU/ACSC/9694/2008-09 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY BIRDS OF A FEATHER: Moving Towards a Joint Acquisition Process to Support the...

  13. The Effect of Processing Instruction and Meaning-Based Output Instruction on the Acquisition of Japanese Honorific Expressions

    ERIC Educational Resources Information Center

    Fukuda, Makiko

    2009-01-01

    The present study investigates the relative effects of processing instruction (PI) and meaning-based output instruction (MOI) on the acquisition of Japanese honorific expressions. The PI and MOI treatments were designed to be identical except for the practice mode (input vs. output) and treatments were provided via computer-based materials.…

  14. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  15. The Comparative Effects of Processing Instruction and Dictogloss on the Acquisition of the English Passive by Speakers of Turkish

    ERIC Educational Resources Information Center

    Uludag, Onur; Vanpatten, Bill

    2012-01-01

    The current study presents the results of an experiment investigating the effects of processing instruction (PI) and dictogloss (DG) on the acquisition of the English passive voice. Sixty speakers of Turkish studying English at university level were assigned to three groups: one receiving PI, the other receiving DG and the third serving as a…

  16. Combining Contextual and Morphemic Cues Is Beneficial during Incidental Vocabulary Acquisition: Semantic Transparency in Novel Compound Word Processing

    ERIC Educational Resources Information Center

    Brusnighan, Stephen M.; Folk, Jocelyn R.

    2012-01-01

    In two studies, we investigated how skilled readers use contextual and morphemic information in the process of incidental vocabulary acquisition during reading. In Experiment 1, we monitored skilled readers' eye movements while they silently read sentence pairs containing novel and known English compound words that were either semantically…

  17. Developmental trends in auditory processing can provide early predictions of language acquisition in young infants.

    PubMed

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R; Shao, Jie; Lozoff, Betsy

    2013-03-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with both Auditory Brainstem Response (ABR) and language assessments. At 6 weeks and/or 9 months of age, the infants underwent ABR testing using both a standard hearing screening protocol with 30 dB clicks and a second protocol using click pairs separated by 8, 16, and 64-ms intervals presented at 80 dB. We evaluated the effects of interval duration on ABR latency and amplitude elicited by the second click. At 9 months, language development was assessed via parent report on the Chinese Communicative Development Inventory - Putonghua version (CCDI-P). Wave V latency z-scores of the 64-ms condition at 6 weeks showed strong direct relationships with Wave V latency in the same condition at 9 months. More importantly, shorter Wave V latencies at 9 months showed strong relationships with the CCDI-P composite consisting of phrases understood, gestures, and words produced. Likewise, infants who had greater decreases in Wave V latencies from 6 weeks to 9 months had higher CCDI-P composite scores. Females had higher language development scores and shorter Wave V latencies at both ages than males. Interestingly, when the ABR Wave V latencies at both ages were taken into account, the direct effects of gender on language disappeared. In conclusion, these results support the importance of low-level auditory processing capabilities for early language acquisition in a population of typically developing young infants. Moreover, the auditory brainstem response in this paradigm shows promise as an electrophysiological marker to predict individual differences in language development in young children.

  18. An underground tale: contribution of microbial activity to plant iron acquisition via ecological processes

    PubMed Central

    Jin, Chong Wei; Ye, Yi Quan; Zheng, Shao Jian

    2014-01-01

    Background Iron (Fe) deficiency in crops is a worldwide agricultural problem. Plants have evolved several strategies to enhance Fe acquisition, but increasing evidence has shown that the intrinsic plant-based strategies alone are insufficient to avoid Fe deficiency in Fe-limited soils. Soil micro-organisms also play a critical role in plant Fe acquisition; however, the mechanisms behind their promotion of Fe acquisition remain largely unknown. Scope This review focuses on the possible mechanisms underlying the promotion of plant Fe acquisition by soil micro-organisms. Conclusions Fe-deficiency-induced root exudates alter the microbial community in the rhizosphere by modifying the physicochemical properties of soil, and/or by their antimicrobial and/or growth-promoting effects. The altered microbial community may in turn benefit plant Fe acquisition via production of siderophores and protons, both of which improve Fe bioavailability in soil, and via hormone generation that triggers the enhancement of Fe uptake capacity in plants. In addition, symbiotic interactions between micro-organisms and host plants could also enhance plant Fe acquisition, possibly including: rhizobium nodulation enhancing plant Fe uptake capacity and mycorrhizal fungal infection enhancing root length and the nutrient acquisition area of the root system, as well as increasing the production of Fe3+ chelators and protons. PMID:24265348

  19. Meteoceanographic premises for structural design purposes in the Adriatic Sea: Acquisition and processing of data

    SciTech Connect

    Rampolli, M.; Biancardi, A.; Filippi, G. De

    1996-12-31

    In 1993 the leading international standards (ISO, API RP2A) for the design of offshore structures drastically changed the procedure for the definition of hydrodynamic forces. In particular, oil companies are required to have a detailed knowledge of the weather in the areas where they operate if they want to maintain the previous results; otherwise, more conservative hydrodynamic forces must be considered in the design phase. Such an increase, estimated at 20-30% of the total hydrodynamic force, means heavier platform structures in new projects and more critical elements to be inspected on existing platforms. In 1992, in order to have more reliable and safe transport to and from the platforms, Agip installed a meteo-marine sensor network in the Adriatic Sea on 13 of the over 80 producing platforms. Data collected are sent to shore via radio, and operators can use real-time data or a 12-hour wave forecast obtained from a statistical forecasting model. Taking advantage of these existing instruments, a project was undertaken in 1993 with the purpose of determining the extreme environmental parameters to be used by structural engineers. The network has been upgraded in order to obtain directional wave information and to permit short-term analysis. This paper describes the data acquisition system, the data processing, and the achieved results.

  20. Uav Photogrammetry with Oblique Images: First Analysis on Data Acquisition and Processing

    NASA Astrophysics Data System (ADS)

    Aicardi, I.; Chiabrando, F.; Grasso, N.; Lingua, A. M.; Noardo, F.; Spanò, A.

    2016-06-01

    In recent years, many studies have revealed the advantages of using airborne oblique images for obtaining improved 3D city models (e.g. including façades and building footprints). These data have usually been acquired by expensive airborne cameras installed on traditional aerial platforms. The purpose of this paper is to evaluate the possibility of acquiring and using oblique images for the 3D reconstruction of a historical building, obtained by a UAV (Unmanned Aerial Vehicle) carrying traditional COTS (Commercial Off-the-Shelf) digital cameras (more compact and lighter than the devices generally used), for the realization of a high-level-of-detail architectural survey. The critical issues of acquisition from a common UAV (flight planning strategies, ground control points, check point distribution and measurement, etc.) are described. Another important aspect considered was the possibility of using such systems as a low-cost method for obtaining complete information from an aerial point of view in emergency situations or, as in the present paper, in the cultural heritage application field. The data processing used an SfM-based approach for point cloud generation: different dense image-matching algorithms implemented in commercial and open-source software were tested. The achieved results are analysed, and the discrepancies from reference LiDAR data are computed for a final evaluation. The system was tested on the S. Maria Chapel, part of the Novalesa Abbey (Italy).

  1. Micro-MRI-based image acquisition and processing system for assessing the response to therapeutic intervention

    NASA Astrophysics Data System (ADS)

    Vasilić, B.; Ladinsky, G. A.; Saha, P. K.; Wehrli, F. W.

    2006-03-01

    Osteoporosis is the cause of over 1.5 million bone fractures annually. Most of these fractures occur in sites rich in trabecular bone, a complex network of bony struts and plates found throughout the skeleton. The three-dimensional structure of the trabecular bone network significantly determines mechanical strength and thus fracture resistance. Here we present a data acquisition and processing system that allows efficient noninvasive assessment of trabecular bone structure through a "virtual bone biopsy". High-resolution MR images are acquired, from which the trabecular bone network is extracted by estimating the partial bone occupancy of each voxel. A heuristic voxel subdivision increases the effective resolution of the bone volume fraction map and serves as a basis for subsequent analysis of topological and orientational parameters. Semi-automated registration and segmentation ensure selection of the same anatomical location in subjects imaged at different time points during treatment. It is shown, with excerpts from an ongoing clinical study of early post-menopausal women, that significant reduction in network connectivity occurs in the control group while structural integrity is maintained in the hormone replacement group. The system described should be suited for large-scale studies designed to evaluate the efficacy of therapeutic intervention in subjects with metabolic bone disease.
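
    The idea of partial bone occupancy per voxel can be sketched numerically. The toy example below (our simplification, not the paper's estimator) scales each voxel's intensity linearly between pure-marrow and pure-bone reference values to produce a bone volume fraction map; all intensities are invented.

```python
import numpy as np

def bone_volume_fraction(image, marrow_intensity, bone_intensity):
    """Estimate partial bone occupancy of each voxel by linearly scaling
    its intensity between pure-marrow and pure-bone reference values
    (a simplified stand-in for the estimation described above)."""
    bvf = (marrow_intensity - image) / (marrow_intensity - bone_intensity)
    return np.clip(bvf, 0.0, 1.0)

# Toy 4x4 "slice": bone appears dark, marrow bright (hypothetical values).
slice_ = np.array([[100,  80, 20, 100],
                   [ 60,  20, 20,  80],
                   [100,  60, 40, 100],
                   [100, 100, 80, 100]], dtype=float)

bvf_map = bone_volume_fraction(slice_, marrow_intensity=100, bone_intensity=20)
print(f"mean BVF of slice: {bvf_map.mean():.3f}")
```

    Downstream topological analysis would then operate on such a bone volume fraction map rather than on a hard binary segmentation.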

  2. Sensor Acquisition for Water Utilities: Survey, Down Selection Process, and Technology List

    SciTech Connect

    Alai, M; Glascoe, L; Love, A; Johnson, M; Einfeld, W

    2005-06-29

    The early detection of the biological and chemical contamination of water distribution systems is a necessary capability for securing the nation's water supply. Current and emerging early-detection technology capabilities and shortcomings need to be identified and assessed to provide government agencies and water utilities with an improved methodology for assessing the value of installing these technologies. The Department of Homeland Security (DHS) has tasked a multi-laboratory team to evaluate current and future needs to protect the nation's water distribution infrastructure by supporting an objective evaluation of current and new technologies. The LLNL deliverable from this Operational Technology Demonstration (OTD) was to assist in the development of a technology acquisition process for a water distribution early warning system. The technology survey includes a review of previous sensor surveys and current test programs and a compiled database of relevant technologies. In the survey paper we discuss previous efforts by governmental agencies, research organizations, and private companies. We provide a survey of previous sensor studies with regard to the use of Early Warning Systems (EWS) that includes earlier surveys, testing programs, and response studies. The list of sensor technologies was ultimately developed to assist in the recommendation of candidate technologies for laboratory and field testing. A set of recommendations for future sensor selection efforts has been appended to this document, as has a down-selection example for a hypothetical water utility.

  3. Professional identity acquisition process model in interprofessional education using structural equation modelling: 10-year initiative survey.

    PubMed

    Kururi, Nana; Tozato, Fusae; Lee, Bumsuk; Kazama, Hiroko; Katsuyama, Shiori; Takahashi, Maiko; Abe, Yumiko; Matsui, Hiroki; Tokita, Yoshiharu; Saitoh, Takayuki; Kanaizumi, Shiomi; Makino, Takatoshi; Shinozaki, Hiromitsu; Yamaji, Takehiko; Watanabe, Hideomi

    2016-01-01

    The mandatory interprofessional education (IPE) programme at Gunma University, Japan, was initiated in 1999. A questionnaire of 10 items to assess the students' understanding of the IPE training programme has been distributed since then; factor analysis of the responses revealed that it falls into four subscales, i.e. "professional identity", "structure and function of training facilities", "teamwork and collaboration", and "role and responsibilities", suggesting that these subscales may reflect the development of the IPE programme in combination with clinical training. The purpose of this study was to examine the professional identity acquisition process (PIAP) model in IPE using structural equation modelling (SEM). Overall, 1,581 respondents of a possible 1,809 students from the departments of nursing, laboratory sciences, physical therapy, and occupational therapy completed the questionnaire. The SEM technique was utilised to construct a PIAP model of the relationships among the four factors. The original PIAP model showed that "professional identity" was predicted by two factors, namely "role and responsibilities" and "teamwork and collaboration". These two factors were in turn predicted by the factor "structure and function of training facilities". The same structure was observed in nursing and physical therapy students' PIAP models, but not completely in those of laboratory sciences and occupational therapy students. A curriculum on expertise unique to each profession that runs in parallel with, rather than in isolation from, the learning of collaboration may be necessary to help students understand their professional identity.
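
    The hypothesized path structure (structure and function of facilities -> role/teamwork -> professional identity) can be illustrated with ordinary least squares on simulated factor scores, a simplified stand-in for full SEM. The path coefficients below are illustrative, not estimates from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1581  # matches the survey's respondent count

# Simulated factor scores under the hypothesized paths
# (all coefficients are invented for illustration).
structure = rng.normal(size=n)
role      = 0.6 * structure + rng.normal(scale=0.8, size=n)
teamwork  = 0.5 * structure + rng.normal(scale=0.8, size=n)
identity  = 0.4 * role + 0.3 * teamwork + rng.normal(scale=0.7, size=n)

def ols(y, *xs):
    """Least-squares path coefficients (a simplified stand-in for SEM)."""
    X = np.column_stack([np.ones_like(y), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]  # drop the intercept

print("structure -> role:        ", ols(role, structure))
print("structure -> teamwork:    ", ols(teamwork, structure))
print("role, teamwork -> identity:", ols(identity, role, teamwork))
```

    A real SEM additionally models latent variables and measurement error jointly; the regressions above only recover the directed paths when the generating structure is known.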

  4. A new mapping acquisition and processing system for simultaneous PIXE-RBS analysis with external beam

    NASA Astrophysics Data System (ADS)

    Pichon, L.; Beck, L.; Walter, Ph.; Moignard, B.; Guillou, T.

    2010-06-01

    The combination of ion beam analysis techniques is particularly fruitful for the study of cultural heritage objects. For several years, the AGLAE facility of the Louvre laboratory has been implementing these techniques with an external beam. The recent set-up makes it possible to carry out PIXE, PIGE and RBS simultaneously on the same analyzed spot with a particle beam of approximately 20 μm diameter. A new mapping system has been developed in order to provide elemental concentration maps from the PIXE and RBS spectra. This system combines the Genie2000 spectroscopy software with in-house software that creates maps by synchronizing the acquisition with the object position. Each pixel of the PIXE and RBS maps contains the spectrum normalised by the dose. After analysing each pixel of the PIXE maps (low- and high-energy X-ray spectra) with the Gupixwin peak-fitting software, quantitative elemental concentrations are obtained for the major and trace elements. This paper presents the quantitative elemental maps extracted from the PIXE spectra and the development of RBS data processing for light element distribution and thin layer characterization. Examples on rock painting and lustrous ceramics will be presented.
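
    The per-pixel dose normalisation behind such maps can be sketched as follows; the scan size, number of channels, and the "Fe" peak window are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data cube: 8x8 scan positions x 256 spectral channels, plus the
# accumulated charge (dose) per pixel. All values are synthetic.
ny, nx, nch = 8, 8, 256
doses = rng.uniform(0.5, 1.5, size=(ny, nx))
spectra = rng.poisson(5.0, size=(ny, nx, nch)).astype(float)
spectra *= doses[:, :, None]  # raw counts scale with delivered dose

def elemental_map(spectra, doses, lo, hi):
    """Integrate a peak window [lo, hi) of each dose-normalised
    per-pixel spectrum, as in the mapping system described above."""
    normalised = spectra / doses[:, :, None]
    return normalised[:, :, lo:hi].sum(axis=2)

fe_map = elemental_map(spectra, doses, lo=100, hi=110)  # hypothetical peak window
print(f"map shape {fe_map.shape}, mean counts/dose {fe_map.mean():.1f}")
```

    In the real system each pixel's fitted Gupixwin concentrations, rather than raw window sums, would populate the final elemental maps.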

  5. Sample processing for earth science studies at ANTARES

    NASA Astrophysics Data System (ADS)

    Child, D.; Elliott, G.; Mifsud, C.; Smith, A. M.; Fink, D.

    2000-10-01

    AMS studies in earth sciences at ANTARES (ANSTO) created a need for the processing of mineral and ice samples for 10Be, 26Al and 36Cl target preparation. Published procedures have been adapted to our requirements and improved upon where necessary. In particular, new methods to isolate Be with reproducible, high recoveries in the presence of excess Al and Ti were achieved. An existing elution scheme for a cation exchange column procedure was modified to incorporate a 0.25 M H2SO4 + 0.015% H2O2 washing step to elute the Ti peroxide complex formed. Problems with dust contamination in ice contributing to measured 10Be signals are also addressed, and a procedure was developed for its removal.

  6. SAMPLING DEVICE FOR pH MEASUREMENT IN PROCESS STREAMS

    DOEpatents

    Michelson, C.E.; Carson, W.N. Jr.

    1958-11-01

    A pH cell is presented for monitoring the hydrogen ion concentration of a fluid in a process stream. The cell is made of glass with a side entry arm just above a reservoir in which the ends of a glass electrode and a reference electrode are situated. The glass electrode contains the usual internal solution which is connected to a lead. The reference electrode is formed of saturated calomel having a salt bridge in its bottom portion fabricated of a porous glass to insure low electrolyte flow. A flush tube leads into the cell through which buffer and flush solutions are introduced. A ground wire twists about both electrode ends to insure constant electrical grounding of the sample. The electrode leads are electrically connected to a pH meter of any standard type.

  7. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Business Process Modeling BPMN Business Process Modeling Notation SoA Service-oriented Architecture UML Unified Modeling Language CSP...system developers. Supporting technologies include Business Process Modeling Notation ( BPMN ), Unified Modeling Language (UML), model-driven architecture

  8. Defense Acquisitions: How DOD Acquires Weapon Systems and Recent Efforts to Reform the Process

    DTIC Science & Technology

    2010-04-23

    and specialists from industry; (2) developing improved personnel developmental opportunities and establishing clear acquisitions career paths; (3...interests into new realms such as computers, communications, spaceflight, microelectronics, astrophysics and a host of other fields. In sum, a number of

  9. 78 FR 61113 - Acquisition Process: Task and Delivery Order Contracts, Bundling, Consolidation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... center representatives (PCRs). The proposed rule provided that SBA's PCR may review acquisitions...) identifies the negative impact on small businesses. The proposed rule also required SBA's PCR to work...

  10. Improved Seismic Acquisition System and Data Processing for the Italian National Seismic Network

    NASA Astrophysics Data System (ADS)

    Badiali, L.; Marcocci, C.; Mele, F.; Piscini, A.

    2001-12-01

    A new system for acquiring and processing digital signals has been developed in the last few years at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). The system makes extensive use of standard Internet protocols such as TCP and UDP, which serve as the transport highway inside the Italian network (and possibly, in the near future, outside it) to share or redirect data among processes. The Italian National Seismic Network has operated for about 18 years with vertical short-period seismometers transmitting through analog lines to the computer center in Rome. We are now concentrating our efforts on speeding the migration towards a fully digital network based on about 150 stations equipped with either broad-band or 5-second sensors connected to the data center partly through wired digital communication and partly through satellite digital communication. The overall process is layered through intranet and/or internet. Every layer gathers data in a simple format and provides data in a processed format, ready to be distributed to the next layer. The lowest level acquires seismic data (raw waveforms) coming from the remote stations. It handshakes, checks, and sends data over the LAN or WAN according to a distribution list of machines whose processes are waiting for it. At the next level there are the picking procedures, or "pickers", on a per-instrument basis, looking for phases. A picker spreads phases, again through the LAN or WAN and according to a distribution list, to one or more waiting locating machines tuned to generate a seismic event. The event locating procedure itself, the highest level in this stack, can exchange information with other similar procedures. Such a layered and distributed structure with nearby targets allows other seismic networks to join the processing and data collection of the same ongoing event, creating a virtual network larger than the original one.
At present we plan to cooperate with other
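
    The layered pick-and-locate data flow described above can be sketched with in-process queues standing in for the TCP/UDP links; the station name, threshold, and sample values are invented.

```python
import queue
import threading

# In-process queues stand in for the TCP/UDP links between the
# acquisition, picking, and locating layers described above.
raw_waveforms = queue.Queue()
picked_phases = queue.Queue()

def picker(station):
    """Per-instrument picker: turns raw samples into phase picks."""
    while True:
        sample = raw_waveforms.get()
        if sample is None:           # end-of-stream sentinel
            picked_phases.put(None)
            return
        if sample["amplitude"] > 3.0:  # trivial threshold "pick"
            picked_phases.put({"station": station, "t": sample["t"]})

def locator():
    """Event locator: collects picks distributed by the pickers."""
    picks = []
    while (pick := picked_phases.get()) is not None:
        picks.append(pick)
    return picks

t = threading.Thread(target=picker, args=("ROMA",))
t.start()
for i, amp in enumerate([0.2, 5.1, 0.4, 7.3, 1.0]):
    raw_waveforms.put({"t": i, "amplitude": amp})
raw_waveforms.put(None)
picks = locator()
t.join()
print(f"{len(picks)} phase picks forwarded to the locator")
```

    Because each layer only consumes and republishes simple messages, additional pickers or locators (or a neighbouring network's machines) can subscribe to the same distribution lists, which is what makes the "virtual network" possible.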

  11. Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.

    PubMed

    Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar

    2016-02-01

    In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real-time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [(18)F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼ 500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5 × higher for the LS- and ML-based CPEEA.
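
    The COG-based crystal position estimation can be sketched in a few lines: the event position is the photon-count-weighted mean of the pixel coordinates of a detector tile. The tile values below are synthetic.

```python
import numpy as np

def cog_position(tile):
    """Center-of-gravity estimate of a scintillation event's position
    from the photon counts of a detector tile's pixels."""
    total = tile.sum()
    ys, xs = np.indices(tile.shape)
    return (ys * tile).sum() / total, (xs * tile).sum() / total

# Toy 4x4 light distribution peaked near pixel (1, 2) (synthetic counts).
tile = np.array([[1,  2,  8, 2],
                 [2, 10, 40, 9],
                 [1,  5, 12, 4],
                 [0,  1,  2, 1]], dtype=float)

y, x = cog_position(tile)
print(f"estimated event position: ({y:.2f}, {x:.2f})")
```

    The LS- and ML-based estimators mentioned above replace this closed-form average with an iterative fit of the measured light distribution against a detector response model, trading throughput for sensitivity.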

  12. Wearable system for acquisition, processing and storage of the signal from amperometric glucose sensors.

    PubMed

    Fabietti, P G; Massi Benedetti, M; Bronzo, F; Reboldi, G P; Sarti, E; Brunetti, P

    1991-03-01

    A wearable device for the acquisition, processing and storage of the signal from needle-type glucose sensors has been designed and developed as part of a project aimed at developing a portable artificial pancreas. The device is essential to assess the operational characteristics of miniaturized sensors in vivo. It can be connected to sensors operating at a constant potential of 0.65 V, and generating currents in the order of 10^-9 A. It is screened and equipped with filters that permit data recording and processing even in the presence of electrical noise. It can operate with sensors with different characteristics (1-200 nA full scale). The device has been designed to be worn by patients, so its weight and size have been kept to a minimum (250 g; 8.5 x 14.5 x 3.5 cm). It is powered by rechargeable Ni/Cd batteries allowing continuous operation for 72 h. The electronics consists of an analog card with operational amplifiers, and a digital one with a microprocessor (Intel 80C196, MCS-96 class, with internal 16-bit CPU supporting programs written in either C or Assembler language), a 32 Kb EPROM, and an 8 Kb RAM where the data are stored. The microprocessor can run either at 5 or 10 MHz and features on-chip peripherals: an analog/digital (A/D) converter, a serial port (used to transfer data to a Personal Computer at the end of the 72 h), input-output (I/O) units at high speed, and two timers. The device is programmed and prepared to operate by means of a second hand-held unit equipped with an LCD display and a 16-key numeric pad.(ABSTRACT TRUNCATED AT 250 WORDS)

  13. Automated Force Volume Image Processing for Biological Samples

    PubMed Central

    Duan, Junbo; Duval, Jérôme F. L.; Brie, David; Francius, Grégory

    2011-01-01

    Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of a regression procedure to fit experiment to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image. PMID:21559483
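
    The Hertzian part of the approach-curve analysis can be sketched as a one-parameter fit: for a spherical indenter the contact law is F = a * delta**1.5 (with a = (4/3) * E* * sqrt(R) for reduced modulus E* and tip radius R), so fitting a by least squares in the regressor delta**1.5 recovers the mechanical prefactor. The curve below is synthetic, not data from the paper.

```python
import numpy as np

def fit_hertz(indentation, force):
    """Fit the Hertzian contact law F = a * delta**1.5 by linear
    least squares in the regressor x = delta**1.5 (a simplified
    version of the post-contact analysis described above)."""
    x = indentation ** 1.5
    return (x * force).sum() / (x * x).sum()

# Synthetic post-contact approach curve: a = 2.0 nN/nm^1.5 plus noise.
rng = np.random.default_rng(3)
delta = np.linspace(0, 50, 200)                     # indentation, nm
force = 2.0 * delta ** 1.5 + rng.normal(scale=5.0, size=delta.size)

a_hat = fit_hertz(delta, force)
print(f"fitted Hertz prefactor: {a_hat:.3f} nN/nm^1.5")
```

    The full algorithm first locates the contact point and any electrostatic regime via critical-point detection, then applies such a regression only within the indentation region.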

  14. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    SciTech Connect

    Coleman, C.J.; Goode, S.R.

    1996-05-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5-3 mL placed in the neck of 14 mL sample vials. The inserts have rims that conform to the rim of the vials so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert, so it is unnecessary to separate the insert from the sample for most analyses. The key advantage of using inserts to take DWPF samples, rather than filling sample vials, is that they provide a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent in subsampling the heterogeneous slurries that make up DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less, compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicates that the insert does not significantly affect sample composition.

  15. Second Language Acquisition and the Pidgin-Creole-Decreolization Cycle: A Comparison of Some Linguistic Processes. Working Papers in Linguistics, Vol. 8, No. 4, October-December, 1976.

    ERIC Educational Resources Information Center

    Huebner, Thomas G.

    Linguists of various theoretical backgrounds have likened second language (L2) acquisition to pidginization (Ferguson 1971, Richards 1971, Bickerton 1975a). This paper examines these two processes and suggests areas where a study of the process of second language acquisition in a natural setting might contribute insights to a general theory of…

  16. A real time dynamic data acquisition and processing system for velocity, density, and total temperature fluctuation measurements

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1991-01-01

    The real time Dynamic Data Acquisition and Processing System (DDAPS) is described which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in context of the wind tunnel environment. The DDAPS replaces both a recording mechanism and a separate data processing system. DDAPS receives input from hot wire anemometers. Amplifiers and filters condition the signals with computer controlled modules. The analog signals are simultaneously digitized and digitally recorded on disk. Automatic acquisition collects necessary calibration and environment data. Hot wire sensitivities are generated and applied to the hot wire data to compute fluctuations. The presentation of the raw and processed data is accomplished on demand. The interface to DDAPS is described along with the internal mechanisms of DDAPS. A summary of operations relevant to the use of the DDAPS is also provided.

  17. Fast acquisition of high resolution 4-D amide-amide NOESY with diagonal suppression, sparse sampling and FFT-CLEAN.

    PubMed

    Werner-Allen, Jon W; Coggins, Brian E; Zhou, Pei

    2010-05-01

    Amide-amide NOESY provides important distance constraints for calculating global folds of large proteins, especially integral membrane proteins with beta-barrel folds. Here, we describe a diagonal-suppressed 4-D NH-NH TROSY-NOESY-TROSY (ds-TNT) experiment for NMR studies of large proteins. The ds-TNT experiment employs a spin state selective transfer scheme that suppresses diagonal signals while providing TROSY optimization in all four dimensions. Active suppression of the strong diagonal peaks greatly reduces the dynamic range of observable signals, making this experiment particularly suitable for use with sparse sampling techniques. To demonstrate the utility of this method, we collected a high resolution 4-D ds-TNT spectrum of a 23kDa protein using randomized concentric shell sampling (RCSS), and we used FFT-CLEAN processing for further reduction of aliasing artifacts - the first application of these techniques to a NOESY experiment. A comparison of peak parameters in the high resolution 4-D dataset with those from a conventionally-sampled 3-D control spectrum shows an accurate reproduction of NOE crosspeaks in addition to a significant reduction in resonance overlap, which largely eliminates assignment ambiguity. Likewise, a comparison of 4-D peak intensities and volumes before and after application of the CLEAN procedure demonstrates that the reduction of aliasing artifacts by CLEAN does not systematically distort NMR signals.
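
    The CLEAN step referenced above can be sketched in one dimension. This is a generic CLEAN deconvolution on a synthetic sparsely sampled signal, not the paper's FFT-CLEAN implementation; the kernel normalization, gain, and stopping threshold are illustrative assumptions. Sparse sampling convolves the true spectrum with the point-spread function (PSF) of the sampling mask, and CLEAN iteratively removes scaled, shifted copies of that PSF to suppress the aliasing artifacts.

```python
import numpy as np

def clean_1d(dirty, psf, gain=0.1, max_iter=2000, threshold=0.01):
    """1-D CLEAN: repeatedly subtract a gain-scaled, shifted copy of the
    point-spread function at the strongest residual point, recording each
    removed amplitude as a CLEAN component."""
    residual = dirty.astype(complex).copy()
    components = np.zeros_like(residual)
    stop = threshold * np.abs(dirty).max()
    for _ in range(max_iter):
        k = int(np.abs(residual).argmax())
        if np.abs(residual[k]) < stop:
            break
        a = gain * residual[k]
        components[k] += a
        residual -= a * np.roll(psf, k)  # PSF main lobe sits at index 0
    return components, residual

# Sparse sampling: keep 64 of 256 time points; two tones at bins 40 and 97.
rng = np.random.default_rng(4)
n = 256
mask = np.zeros(n)
mask[rng.choice(n, 64, replace=False)] = 1.0
t = np.arange(n)
y = 1.0 * np.exp(2j * np.pi * 40 * t / n) + 0.6 * np.exp(2j * np.pi * 97 * t / n)

dirty = np.fft.fft(y * mask)         # true peaks convolved with the PSF
psf = np.fft.fft(mask) / mask.sum()  # normalized so psf[0] == 1

components, residual = clean_1d(dirty, psf)
top2 = np.argsort(np.abs(components))[-2:]
```

    The two strongest CLEAN components land on the true peak positions, while the residual (the aliasing artifacts) is reduced well below the original sidelobe level.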

  18. Proceedings of the XIIIth IAGA Workshop on Geomagnetic Observatory Instruments, Data Acquisition, and Processing

    USGS Publications Warehouse

    Love, Jeffrey J.

    2009-01-01

    The thirteenth biennial International Association of Geomagnetism and Aeronomy (IAGA) Workshop on Geomagnetic Observatory Instruments, Data Acquisition and Processing was held in the United States for the first time on June 9-18, 2008. Hosted by the U.S. Geological Survey's (USGS) Geomagnetism Program, the workshop's measurement session was held at the Boulder Observatory and the scientific session was held on the campus of the Colorado School of Mines in Golden, Colorado. More than 100 participants came from 36 countries and 6 continents. Preparation for the workshop began when the USGS Geomagnetism Program agreed, at the close of the twelfth workshop in Belsk Poland in 2006, to host the next workshop. Working under the leadership of Alan Berarducci, who served as the chairman of the local organizing committee, and Tim White, who served as co-chairman, preparations began in 2007. The Boulder Observatory was extensively renovated and additional observation piers were installed. Meeting space on the Colorado School of Mines campus was arranged, and considerable planning was devoted to managing the many large and small issues that accompany an international meeting. Without the devoted efforts of both Alan and Tim, other Geomagnetism Program staff, and our partners at the Colorado School of Mines, the workshop simply would not have occurred. We express our thanks to Jill McCarthy, the USGS Central Region Geologic Hazards Team Chief Scientist; Carol A. Finn, the Group Leader of the USGS Geomagnetism Program; the USGS International Office; and Melody Francisco of the Office of Special Programs and Continuing Education of the Colorado School of Mines. We also thank the student employees that the Geomagnetism Program has had over the years and leading up to the time of the workshop. For preparation of the proceedings, thanks go to Eddie and Tim. And, finally, we thank our sponsors, the USGS, IAGA, and the Colorado School of Mines.

  19. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    NASA Astrophysics Data System (ADS)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status in the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes based in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes - while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to shed more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time-zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models. These try to explain the processes at work in AGNs which result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and which has been optimised for simultaneous two-band photometry on our 16" OTA.

  20. Recent Results of the Investigation of a Microfluidic Sampling Chip and Sampling System for Hot Cell Aqueous Processing Streams

    SciTech Connect

    Julia Tripp; Jack Law; Tara Smith

    2013-10-01

    A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially sampling technologies were evaluated and microfluidics sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. In addition, the production of a less expensive, mass produced sampling chip was investigated to avoid chip reuse thus increasing sampling reproducibility/accuracy. The microfluidic-based robotic sampling system’s mechanical elements were tested to ensure analytical reproducibility and the optimum robotic handling of microfluidic sampling chips.

  1. Multispectral integral imaging acquisition and processing using a monochrome camera and a liquid crystal tunable filter.

    PubMed

    Latorre-Carmona, Pedro; Sánchez-Ortiga, Emilio; Xiao, Xiao; Pla, Filiberto; Martínez-Corral, Manuel; Navarro, Héctor; Saavedra, Genaro; Javidi, Bahram

    2012-11-05

    This paper presents an acquisition system and a procedure to capture 3D scenes in different spectral bands. The acquisition system is formed by a monochrome camera and a Liquid Crystal Tunable Filter (LCTF) that allows images to be acquired at different spectral bands in the [480, 680] nm wavelength interval. The Synthetic Aperture Integral Imaging acquisition technique is used to obtain the elemental images for each wavelength. These elemental images are used to computationally obtain the reconstruction planes of the 3D scene at different depth planes. The 3D profile of the acquired scene is also obtained using a minimization of the variance of the contribution of the elemental images at each image pixel. Experimental results show the viability of recovering the 3D multispectral information of the scene. Integration of 3D and multispectral information could have important benefits in different areas, including skin cancer detection, remote sensing and pattern recognition, among others.
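
    The computational reconstruction and variance-based depth estimation described above can be sketched as a shift-and-average over elemental images: each image is shifted in proportion to its camera offset divided by the reconstruction depth, and the depth at which the shifted images agree (minimum per-pixel variance) is the in-focus plane. The camera geometry, `scale` constant, and toy scene below are illustrative assumptions, not the paper's calibrated setup.

```python
import numpy as np

def reconstruct_depth_plane(elementals, positions, z, scale=100.0):
    """Shift-and-average reconstruction of one depth plane."""
    acc = np.zeros_like(elementals[0])
    for img, (px, py) in zip(elementals, positions):
        dx = int(round(px * scale / z))
        dy = int(round(py * scale / z))
        acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / len(elementals)

def plane_variance(elementals, positions, z, scale=100.0):
    """Mean per-pixel variance of the shifted elemental images;
    minimized when z matches the object's true depth."""
    shifted = [
        np.roll(np.roll(img, int(round(py * scale / z)), axis=0),
                int(round(px * scale / z)), axis=1)
        for img, (px, py) in zip(elementals, positions)
    ]
    return float(np.var(np.stack(shifted), axis=0).mean())

# Toy scene: a bright square at depth z0, viewed by three cameras on a line.
z0, scale = 5.0, 100.0
positions = [(-0.2, 0.0), (0.0, 0.0), (0.2, 0.0)]
elementals = []
for px, py in positions:
    img = np.zeros((32, 32))
    parallax = int(round(-px * scale / z0))  # apparent shift seen by camera
    img[12:20, 12 + parallax:20 + parallax] = 1.0
    elementals.append(img)

in_focus = reconstruct_depth_plane(elementals, positions, z=z0)
var_at_z0 = plane_variance(elementals, positions, z=z0)
var_off = plane_variance(elementals, positions, z=2.0)
```

    At the true depth the parallax shifts cancel exactly, so the variance drops to zero and the reconstruction is sharp; at other depths the square smears and the variance rises.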

  2. The process of spatial knowledge acquisition in a square and a circular virtual environment.

    PubMed

    Jansen-Osmann, Petra; Heil, Martin

    2008-07-15

    This study investigated the effect of the environmental structure (circular vs. square environment) on spatial knowledge acquisition in a desktop virtual situation in which self-determined movement was allowed, with a total of 120 participants: 7- to 8-year-old children, 11- to 12-year-old children, and adults. In all measurements of spatial knowledge acquisition, an overall developmental performance increase from younger children to adults was found. In contrast to that, the exploration and learning behavior did not differ between adults and children. Furthermore, the environmental structure influenced the number of trials needed to learn the two routes used and the distance walked to the determined landmarks. All these tasks were easier in a circular than in a square environment. This influence of the environmental structure was absent in the direction estimation task. The advantage of spatial knowledge acquisition in a circular environment in three of four tasks is discussed.

  3. The history of imitation in learning theory: the language acquisition process.

    PubMed

    Kymissis, E; Poulson, C L

    1990-09-01

    The concept of imitation has undergone different analyses in the hands of different learning theorists throughout the history of psychology. From Thorndike's connectionism to Pavlov's classical conditioning, Hull's monistic theory, Mowrer's two-factor theory, and Skinner's operant theory, there have been several divergent accounts of the conditions that produce imitation and the conditions under which imitation itself may facilitate language acquisition. In tracing the roots of the concept of imitation in the history of learning theory, the authors conclude that generalized imitation, as defined and analyzed by operant learning theorists, is a sufficiently robust formulation of learned imitation to facilitate a behavior-analytic account of first-language acquisition.

  4. A Study of the Barriers to Institutionalization of Total Quality Management (TQM) in the Department of Defense Acquisition Process

    DTIC Science & Technology

    1990-12-01

    functions. First, the opportunity to improve the quality of products is the greatest during the early stages of design and production. Attempting to add... quality of products. 2. Characterization by Very Familiar Respondents a. Drives the Acquisition System to Short-Term Thinking and Planning One... quality of products bought by DoD will increase when the number of suppliers is reduced. Best value contractors who practice both product and process

  5. UK Defence Acquisition Process for NEC: Transaction Governance within an Integrated Project Team

    DTIC Science & Technology

    2009-04-22

    Government of Margaret Thatcher. Of the five largest defence companies in 1979, four were state owned: British Aerospace, British Shipbuilders, Royal...NEC demands a more collaborative, through-life approach to defence acquisition. 15. SUBJECT TERMS 16. SECURITY CLASSIFICATION OF: 17. LIMITATION...Innovative Systems Engineering), exploring organisational aspects of Through Life Systems Management. Maytorena became a lecturer in Construction Project

  6. ISO 9001: 2008 Quality Assurance Assessment of Defense Acquisition University Processes

    DTIC Science & Technology

    2012-09-27

    ISO 9001:2008 Quality Management System Requirements. The...overall objective was to perform a quality review, using ISO 9001:2008 Quality Management System assessment techniques, of Defense Acquisition...57 Introduction Objectives Our overall objective was to perform a quality review, using ISO 9001:2008 Quality Management

  7. EXD HME MicroCT Data Acquisition, Processing and Data Request Overview

    SciTech Connect

    Seetho, Isaac M.; Brown, William D.; Martz, Jr., Harry E.

    2016-12-06

    This document is a short summary of the steps required for MicroCT evaluation of a specimen. This includes data acquisition through image analysis, for the EXD HME program. Expected outputs for each stage are provided. Data shall be shipped to LLNL as described herein.

  8. Directed Blogging with Community College ESL Students: Its Effects on Awareness of Language Acquisition Processes

    ERIC Educational Resources Information Center

    Johnson, Cathy

    2012-01-01

    English as a Second Language (ESL) students often have problems progressing in their acquisition of the language and frequently do not know how to solve this dilemma. Many of them think of their second language studies as just another school subject that they must pass in order to move on to the next level, so few of them realize the metacognitive…

  9. The Representation and Processing of Familiar Faces in Dyslexia: Differences in Age of Acquisition Effects

    ERIC Educational Resources Information Center

    Smith-Spark, James H.; Moore, Viv

    2009-01-01

    Two under-explored areas of developmental dyslexia research, face naming and age of acquisition (AoA), were investigated. Eighteen dyslexic and 18 non-dyslexic university students named the faces of 50 well-known celebrities, matched for facial distinctiveness and familiarity. Twenty-five of the famous people were learned early in life, while the…

  10. Optionality in Second Language Acquisition: A Generative, Processing-Oriented Account

    ERIC Educational Resources Information Center

    Truscott, John

    2006-01-01

    The simultaneous presence in a learner's grammar of two features that should be mutually exclusive (optionality) typifies second language acquisition. But generative approaches have no good means of accommodating the phenomenon. The paper proposes one approach, based on Truscott and Sharwood Smith's (2004) MOGUL framework. In this framework,…

  11. Input-Based Tasks and the Acquisition of Vocabulary and Grammar: A Process-Product Study

    ERIC Educational Resources Information Center

    Shintani, Natsuko

    2012-01-01

    The study reported in this article investigated the use of input-based tasks with young, beginner learners of English as a second language by examining both learning outcomes (i.e. acquisition) and the interactions that resulted from implementing the tasks. The participants were 15 learners, aged six, with no experience of second language (L2)…

  12. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.

  13. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

    Introduction Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely the operator's (acquisition) impact on the results obtained from image analysis and processing, is shown with a few examples. Material and method The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200,000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and MATLAB. Results For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient's back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects - error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18%

  14. Rapid parameter optimization of low signal-to-noise samples in NMR spectroscopy using rapid CPMG pulsing during acquisition: application to recycle delays.

    PubMed

    Farooq, Hashim; Courtier-Murias, Denis; Soong, Ronald; Masoom, Hussain; Maas, Werner; Fey, Michael; Kumar, Rajeev; Monette, Martine; Stronks, Henry; Simpson, Myrna J; Simpson, André J

    2013-03-01

    A method is presented that combines Carr-Purcell-Meiboom-Gill (CPMG) during acquisition with either selective or nonselective excitation to produce a considerable intensity enhancement and a simultaneous loss in chemical shift information. A range of parameters can theoretically be optimized very rapidly on the basis of the signal from the entire sample (hard excitation) or spectral subregion (soft excitation) and should prove useful for biological, environmental, and polymer samples that often exhibit highly dispersed and broad spectral profiles. To demonstrate the concept, we focus on the application of our method to T(1) determination, specifically for the slowest relaxing components in a sample, which ultimately determines the optimal recycle delay in quantitative NMR. The traditional inversion recovery (IR) pulse program is combined with a CPMG sequence during acquisition. The slowest relaxing components are selected with a shaped pulse, and then, low-power CPMG echoes are applied during acquisition with intervals shorter than chemical shift evolution (RCPMG) thus producing a single peak with an SNR commensurate with the sum of the signal integrals in the selected region. A traditional (13)C IR experiment is compared with the selective (13)C IR-RCPMG sequence and yields the same T(1) values for samples of lysozyme and riverine dissolved organic matter within error. For lysozyme, the RCPMG approach is ~70 times faster, and in the case of dissolved organic matter is over 600 times faster. This approach can be adapted for the optimization of a host of parameters where chemical shift information is not necessary, such as cross-polarization/mixing times and pulse lengths.
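
    The inversion-recovery model underlying the T(1) measurement above, M(tau) = M0*(1 - 2*exp(-tau/T1)), can be fit with a simple log-linear sketch on noise-free synthetic data. This is an illustration of the conventional IR fit that the RCPMG method accelerates, not the paper's acquisition scheme; real data would need nonlinear fitting and noise handling.

```python
import numpy as np

def fit_t1_inversion_recovery(tau, m):
    """Estimate T1 from inversion-recovery data M(tau) = M0*(1 - 2*exp(-tau/T1)).

    M0 is taken from the longest-delay point (assumed fully relaxed); T1 then
    follows from a log-linear least-squares fit of the recovery."""
    m0 = m[-1]
    y = (m0 - m) / (2.0 * m0)  # ideally equals exp(-tau/T1)
    keep = y > 1e-2            # drop points too close to full recovery
    slope = np.polyfit(tau[keep], np.log(y[keep]), 1)[0]
    return -1.0 / slope

# Synthetic, noise-free inversion-recovery curve with T1 = 1.5 s.
t1_true, m0 = 1.5, 100.0
tau = np.linspace(0.05, 12.0, 40)
m = m0 * (1.0 - 2.0 * np.exp(-tau / t1_true))

t1_est = fit_t1_inversion_recovery(tau, m)
```

    The longest T1 recovered this way is what sets the optimal recycle delay (commonly taken as about 5*T1) in quantitative NMR.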

  15. A Flash-ADC data acquisition system developed for a drift chamber array and a digital filter algorithm for signal processing

    NASA Astrophysics Data System (ADS)

    Yi, Han; Lü, Li-Ming; Zhang, Zhao; Cheng, Wen-Jing; Ji, Wei; Huang, Yan; Zhang, Yan; Li, Hong-Jie; Cui, Yin-Ping; Lin, Ming; Wang, Yi-Jie; Duan, Li-Min; Hu, Rong-Jiang; Xiao, Zhi-Gang

    2016-11-01

    A Flash-ADC data acquisition (DAQ) system has been developed for the drift chamber array designed for the External-Target-Experiment at the Cooling Storage Ring at the Heavy Ion Research Facility, Lanzhou. The simplified readout electronics system has been developed using the Flash-ADC modules and the whole waveform in the sampling window is obtained, with which the time and energy information can be deduced with an offline processing. A digital filter algorithm has been developed to discriminate the noise and the useful signal. With the digital filtering process, the signal to noise ratio (SNR) is increased and a better time and energy resolution can be obtained. Supported by National Basic Research Program of China (973) (2015CB856903 and 2014CB845405), partly by National Science Foundation of China (U1332207 and 11375094), and by Tsinghua University Initiative Scientific Research Program
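
    The offline waveform processing described above, filtering followed by time and energy extraction, can be sketched with a boxcar FIR filter and a leading-edge discriminator. The pulse shape, noise level, and 50% threshold below are illustrative assumptions, not the experiment's actual digital filter.

```python
import numpy as np

def moving_average(wave, n=8):
    """Boxcar FIR filter; attenuates white noise by roughly sqrt(n)."""
    return np.convolve(wave, np.ones(n) / n, mode="same")

def extract_time_energy(wave, n_baseline=50, frac=0.5):
    """Leading-edge timing and pulse height (energy) from one waveform."""
    pulse = wave - wave[:n_baseline].mean()      # subtract the baseline
    energy = pulse.max()
    t = int(np.argmax(pulse >= frac * energy))   # 50% threshold crossing
    return t, energy

# Synthetic detector pulse: fast rise, slow fall, plus white noise.
rng = np.random.default_rng(2)
samples = np.arange(1024)
t0, amp, tau_r, tau_f = 400, 50.0, 5.0, 40.0
x = np.clip(samples - t0, 0, None)
signal = np.where(samples >= t0,
                  amp * (1 - np.exp(-x / tau_r)) * np.exp(-x / tau_f), 0.0)
noisy = signal + rng.normal(0.0, 3.0, samples.size)

filtered = moving_average(noisy)
noise_raw = noisy[:200].std()      # noise level before filtering
noise_filt = filtered[:200].std()  # noise level after filtering
t_hit, e_hit = extract_time_energy(filtered)
```

    Smoothing the digitized waveform before extracting the threshold crossing and pulse height is what raises the SNR and improves the time and energy resolution, as the abstract describes.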

  16. Acquisition, Processing, and Archiving of High-Quality Core Data, North Carolina Outer Banks

    NASA Astrophysics Data System (ADS)

    Brooks, R. W.; Hoffman, C. W.; Farrell, K. M.

    2002-12-01

    Rotosonic drilling technology was used to recover approximately 350 m of core (10 cm diameter) at eight different locations on the Outer Banks of North Carolina as part of a coastal geology cooperative research program. A combination of vibration and rotation of the drill pipe and casing is used to advance the hole. Water is used to wash out the casing, but is not circulated and no cuttings are brought to the surface. This leaves the site relatively undisturbed, so working in municipal areas is not a problem. Drill costs averaged about $140/m. Coring runs are 3.3 m (10 ft) long, with each run recovering two 1.65 m (5 ft) long polycarbonate tubes containing the core sample. In the laboratory, tubes are cut lengthwise with a circular saw and then split by pulling piano wire through the sediment. One half-core is used for sampling; the other half is used to create a detailed visual log and digital image, and is retained as an archive sample. The drilling recovered high-quality lithologic samples in unconsolidated sediments with recovery rates of over 90 percent in most holes. This allowed for thorough, detailed description and stratigraphic analysis, and closely controlled sampling for age dating and geochemical studies. High-resolution (2048 x 1536 pixel) digital images (TIFF format) of the cores are taken in a controlled setting. Lighting, camera settings, and core positioning are carefully monitored to ensure consistency. A tape measure is included in the frame to provide depth reference information in each image. Approximately 36 cm of the core is imaged at a time (9 MB file). To construct composited core images, each 36 cm-long segment is digitally stitched together using a software program written specifically for piecing together panoramic photographs. This process yields a high-resolution (TIFF format; 36 MB) image showing the full 1.65 m (5 ft) core tube. The composite image is then saved in JPEG format to reduce the file size to just over 4.5 MB without

  17. Engineering the Business of Defense Acquisition: An Analysis of Program Office Processes

    DTIC Science & Technology

    2015-04-30

    Annual Acquisition Research Symposium Wednesday Sessions Volume I Engineering the Business of Defense Acquisition: An Analysis of Program Office...DATES COVERED 00-00-2015 to 00-00-2015 4. TITLE AND SUBTITLE Engineering the Business of Defense Acquisition: An Analysis of Program Office...PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) Naval Postgraduate School,Graduate School of Business & Public Policy,555 Dyer Rd,Monterey,CA,93943 8

  18. Determining the Value of Contractor Performance Assessment Reporting System (CPARS) Narratives for the Acquisition Process

    DTIC Science & Technology

    2014-05-15

    He was commissioned through the Officer Candidate School (OCS) program in Pensacola, FL. After OCS, he attended and graduated from the Navy Supply...acquisition team may release a draft RFP to solicit industry feedback on the feasibility of the requirements and to get a feel for industry interest...comment. This ensures the past performance information report card is finalized only after the contractor has the opportunity to provide feedback to the

  19. The health hazard assessment process in support of joint weapon system acquisitions.

    PubMed

    Kluchinsky, Timothy A; Jokel, Charles R; Cambre, John V; Goddard, Donald E; Batts, Robert W

    2013-01-01

    Since 1981, the Army's HHA Program has provided an invaluable service to combat developers and materiel program managers by providing recommendations designed to eliminate or control health hazards associated with materiel and weapon systems. The program has consistently strived to improve its services by providing more meaningful and efficient assistance to the acquisition community. In the uncertain fiscal times ahead, the Army's HHA Program will continue to provide valuable and cost-effective solutions to mitigate the health risks of weapons systems.

  20. Bottleneck Analysis on the DoD Pre-Milestone B Acquisition Processes

    DTIC Science & Technology

    2013-04-01

    Philippines Christopher Auger, Lars Baldus, Brian Yoshimoto, J. Robert Wirthlin, and John Colombi, The Air Force Institute of Technology Published April 1...Auger, Lars Baldus, Brian Yoshimoto, J. Robert Wirthlin, and John Colombi, The Air Force Institute of Technology Software Acquisition Patterns of...Gallup, and Douglas MacKinnon Naval Postgraduate School Capturing Creative Program Management Best Practices Brandon Keller and J. Robert Wirthlin

  1. Sampling from Determinantal Point Processes for Scalable Manifold Learning.

    PubMed

    Wachinger, Christian; Golland, Polina

    2015-01-01

    High computational costs of manifold learning prohibit its application for large datasets. A common strategy to overcome this problem is to perform dimensionality reduction on selected landmarks and to successively embed the entire dataset with the Nyström method. The two main challenges that arise are: (i) the landmarks selected in non-Euclidean geometries must result in a low reconstruction error, (ii) the graph constructed from sparsely sampled landmarks must approximate the manifold well. We propose to sample the landmarks from determinantal distributions on non-Euclidean spaces. Since current determinantal sampling algorithms have the same complexity as those for manifold learning, we present an efficient approximation with linear complexity. Further, we recover the local geometry after the sparsification by assigning each landmark a local covariance matrix, estimated from the original point set. The resulting neighborhood selection based on the Bhattacharyya distance improves the embedding of sparsely sampled manifolds. Our experiments show a significant performance improvement compared to state-of-the-art landmark selection techniques on synthetic and medical data.
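
    The determinantal sampling step can be illustrated with the standard spectral DPP sampler (Hough et al., as popularized by Kulesza and Taskar), which is the cubic-complexity baseline that the paper's linear-time approximation improves on. The RBF kernel and landmark count below are toy assumptions for illustration.

```python
import numpy as np

def sample_dpp(L, rng):
    """Exact sample from the L-ensemble DPP with PSD kernel L.

    Spectral algorithm: keep each eigenvector with probability
    lambda / (1 + lambda), then draw one item per kept eigenvector,
    projecting out the chosen coordinate after every draw."""
    lam, V = np.linalg.eigh(L)
    lam = np.clip(lam, 0.0, None)                 # guard tiny negative eigenvalues
    V = V[:, rng.random(lam.size) < lam / (1.0 + lam)]
    items = []
    while V.shape[1] > 0:
        p = (V ** 2).sum(axis=1)
        p /= p.sum()                              # P(i) = squared row norm / k
        i = int(rng.choice(p.size, p=p))
        items.append(i)
        j = int(np.argmax(np.abs(V[i]) > 1e-12))  # a column with V[i, j] != 0
        col = V[:, j].copy()
        V = V - np.outer(col / col[i], V[i])      # zero out row i of the basis
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)                # re-orthonormalize the basis
    return items

# Toy landmark selection: 40 points on a line, RBF similarity kernel.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 40)
L = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)

landmarks = sample_dpp(L, rng)
```

    Because the kernel penalizes similar (nearby) items jointly, the sampled landmarks tend to spread out over the point set, which is exactly the diversity property the paper exploits for Nyström landmark selection.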

  2. Sampling from Determinantal Point Processes for Scalable Manifold Learning

    PubMed Central

    Golland, Polina

    2015-01-01

    High computational costs of manifold learning prohibit its application for large datasets. A common strategy to overcome this problem is to perform dimensionality reduction on selected landmarks and to successively embed the entire dataset with the Nyström method. The two main challenges that arise are: (i) the landmarks selected in non-Euclidean geometries must result in a low reconstruction error, (ii) the graph constructed from sparsely sampled landmarks must approximate the manifold well. We propose to sample the landmarks from determinantal distributions on non-Euclidean spaces. Since current determinantal sampling algorithms have the same complexity as those for manifold learning, we present an efficient approximation with linear complexity. Further, we recover the local geometry after the sparsification by assigning each landmark a local covariance matrix, estimated from the original point set. The resulting neighborhood selection based on the Bhattacharyya distance improves the embedding of sparsely sampled manifolds. Our experiments show a significant performance improvement compared to state-of-the-art landmark selection techniques on synthetic and medical data. PMID:26221713

  3. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process-mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint, aiming to find the best possible heuristics while trading off the solution quality of the process-mapping heuristics against their execution time. The statistical selection method is extended to account for variations in the amount of time needed to evaluate heuristics on a problem instance. The performance improvement under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.

  4. Communication Barriers in Quality Process: Sakarya University Sample

    ERIC Educational Resources Information Center

    Yalcin, Mehmet Ali

    2012-01-01

    Communication has an important role in life, and especially in education. Nowadays, many people rely on technology for communication. When technology is used in education and other activities, some communication barriers may arise. The quality process also has an important role in higher education institutes. If a higher education…

  5. Fast nearly ML estimation of Doppler frequency in GNSS signal acquisition process.

    PubMed

    Tang, Xinhua; Falletti, Emanuela; Lo Presti, Letizia

    2013-04-29

    It is known that signal acquisition in the Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on a simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario with no noise or other nuisances, the frequency estimation error can theoretically be reduced to zero. In the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB), which is derived as a benchmark. For comparison, a least-squares (LS) method is proposed. It is shown that the proposed solution achieves the same performance as LS but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when the signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both the time and frequency domains.
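The paper's own CAF-based refinement equation is not reproduced in the abstract, so the sketch below uses standard three-point parabolic interpolation over the frequency cut of the CAF peak, an assumed stand-in that illustrates the same idea: refine a coarse grid estimate well below the bin spacing at negligible extra cost. The sinc-shaped CAF cut, grid spacing, and integration time are illustrative assumptions.

```python
import numpy as np

def parabolic_freq_refine(caf_col, f_bins):
    """Fit a parabola through the CAF magnitude at the peak frequency
    bin and its two neighbors; return the sub-bin frequency estimate.
    A generic stand-in for the paper's CAF-ratio refinement."""
    k = int(np.argmax(caf_col))
    if k == 0 or k == len(caf_col) - 1:
        return f_bins[k]                      # peak on grid edge: no refinement
    ym, y0, yp = caf_col[k - 1], caf_col[k], caf_col[k + 1]
    delta = 0.5 * (ym - yp) / (ym - 2 * y0 + yp)   # sub-bin offset in bins
    df = f_bins[1] - f_bins[0]
    return f_bins[k] + delta * df

# Simulated acquisition: the CAF frequency cut of a CW tone is |sinc|
# of the Doppler error times the coherent integration time.
true_fd = 1712.0                              # Hz, off-grid true Doppler
f_bins = np.arange(-5000, 5001, 500.0)        # 500 Hz search grid
Tcoh = 1e-3                                   # 1 ms coherent integration
caf = np.abs(np.sinc((f_bins - true_fd) * Tcoh))
coarse = f_bins[np.argmax(caf)]               # conventional grid estimate
fine = parabolic_freq_refine(caf, f_bins)
```

With a 500 Hz grid the coarse estimate can be off by up to 250 Hz; the interpolated estimate lands within a few hertz of the true Doppler in this noiseless case.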

  6. Mechanical Abrasion as a Low Cost Technique for Contamination-Free Sample Acquisition from a Category IVA Clean Platform

    NASA Technical Reports Server (NTRS)

    Dolgin, B.; Yarbrough, C.; Carson, J.; Troy, R.

    2000-01-01

    The proposed Mars Sample Transfer Chain Architecture provides Planetary Protection Officers with the clean samples that are required for the eventual release from confinement of the returned Martian samples. At the same time, no absolute cleanliness and sterility requirement is placed on any part of the Lander (including the deep drill), the Mars Ascent Vehicle (MAV), any part of the Orbiting Sample container (OS), the Rover mobility platform, any part of the Minicorer, the robotic arm (including instrument sensors), or most of the caching equipment on the Rover. The removal of strict requirements in excess of Category IVa cleanliness (Pathfinder clean) is expected to lead to significant cost savings. The proposed architecture assumes that cross-contamination renders all surfaces in the vicinity of the rover(s) and the lander(s) contaminated. Thus, no accessible surface of Martian rocks and soil is free of Earth contamination. As a result, only subsurface samples (either rock or soil) can and will be collected for eventual return to Earth. Uncontaminated samples can be collected from a Category IVa clean platform. Both subsurface soil and rock samples can be kept clean if they are collected by devices that are self-contained and clean and sterile inside only. The top layer of the sample is removed in a manner that does not contaminate the collection tools. A biobarrier (e.g., aluminum foil) covering the moving parts of these devices may be used as the only self-removing bio-blanket required. The samples never leave the collection tools. The lids are placed on these tools inside the collection device. These single-use tools, with the lid and the sample inside, are brought to Earth in the OS. The lids have to be designed to be impenetrable to Earth organisms; the latter is a well-established art.

  7. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    NASA Astrophysics Data System (ADS)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

    The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post-conflict areas. We present new and advanced electromagnetic induction (EMI) technologies, including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and the corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high-quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  8. A dedicated hardware architecture for data acquisition and processing in a time-of-flight emission tomography system (Super PETT)

    SciTech Connect

    Holmes, T.J.; Blaine, G.J.; Fiche, D.C.; Hitchens, R.E.; Snyder, D.L.

    1983-02-01

    The authors present the architecture, implementation and performance aspects of a dedicated processor for use in Super-PETT. The micro-coded machine accepts event data from the acquisition circuitry and constructs in real time any of three pre-image types, including time-of-flight arrays. Pre-images are later backloaded to perform high speed reconstructions. One such processor is assigned to each image slice of the Super-PETT. Event rates and processing times are given for various application modalities.

  9. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

    A pulsed 2-micron coherent Doppler lidar system from NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the Doppler Aerosol Wind Lidar (DAWN) Air project. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software that was employed during the GRIP campaign is introduced along with its unique features.

  10. Multilevel adaptive process control of acquisition and post-processing of computed radiographic images in picture archiving and communication system environment.

    PubMed

    Zhang, J; Huang, H K

    1998-01-01

    Computed radiography (CR) has become a widely used imaging modality replacing the conventional screen/film procedure in diagnostic radiology. After a latent image is captured in a CR imaging plate, there are seven key processes required before a CR image can be reliably archived and displayed in a picture archiving and communication system (PACS) environment. Human error, computational bottlenecks, software bugs, and CR system errors often crash the CR acquisition and post-processing computers, which delays the transmission of CR images for proper viewing at the workstation. In this paper, we present a control theory and a fault tolerance algorithm, as well as their implementation in the PACS environment to circumvent such problems. The software implementation of the control theory and the algorithm is based on an event-driven, multilevel adaptive processing structure. The automated software has been used to provide real-time monitoring and control of CR image acquisition and post-processing in the intensive care unit module of the PACS operation at the University of California, San Francisco. Results demonstrate that the multilevel adaptive process control structure improves CR post-processing time, increases the reliability of CR image delivery, minimizes user intervention, and speeds up the previously time-consuming quality assurance procedure.
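The controller described above is event-driven and multilevel; as a minimal illustration of the underlying fault-tolerance idea (retry a failed processing step before escalating, so one transient error does not stall the image chain), a sketch like the following applies. The step names, retry policy, and toy failure are hypothetical, not the paper's algorithm.

```python
def run_pipeline(steps, max_retries=3, on_fail=None):
    """Supervised multi-step processing: each named step receives the
    accumulated state and is retried up to max_retries times before the
    failure is escalated. Illustrative only; the paper's multilevel
    adaptive controller is far richer than this."""
    state = {}
    for name, step in steps:
        for attempt in range(1, max_retries + 1):
            try:
                state[name] = step(state)
                break
            except Exception as exc:
                if attempt == max_retries:
                    if on_fail:
                        on_fail(name, exc)     # e.g. page an operator
                    raise
    return state

# Toy steps standing in for acquisition / preprocessing / routing.
calls = {"n": 0}
def flaky_acquire(state):
    calls["n"] += 1
    if calls["n"] < 2:                         # fail once, then succeed
        raise IOError("modality busy")
    return "raw-image"

steps = [("acquire", flaky_acquire),
         ("preprocess", lambda s: s["acquire"] + ":lut"),
         ("route", lambda s: s["preprocess"] + ":pacs")]
result = run_pipeline(steps)
```

The first acquisition attempt fails and is transparently retried, so the downstream steps still run and the item reaches the archive without operator intervention.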

  11. EARLY SYNTACTIC ACQUISITION.

    ERIC Educational Resources Information Center

    KELLEY, K.L.

    This paper is a study of a child's earliest pretransformational language acquisition processes. A model is constructed based on the assumptions (1) that syntactic acquisition occurs through the testing of hypotheses reflecting the initial structure of the acquisition mechanism and the language data to which the child is exposed, and (2) that…

  12. A strong-motion network in Northern Italy (RAIS): data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Augliera, Paolo; Ezio, D'alema; Simone, Marzorati; Marco, Massa

    2010-05-01

    The need for a dense network in Northern Italy became apparent with the lack of available data after the occurrence of the 24 November 2004, Ml 5.2, Salò earthquake. Since 2006, many efforts have been made by INGV (Italian National Institute for Geophysics and Volcanology), department of Milano-Pavia (hereinafter INGV MI-PV), to improve the strong-motion monitoring of Northern Italy. At the end of 2007, RAIS (Strong-Motion Network in Northern Italy) included 19 stations equipped with Kinemetrics Episensor FBA ES-T sensors coupled with 5 20-bit Lennartz Mars88/MC and 14 24-bit Reftek 130-01 seismic recorders. In this step, we achieved the goal of reducing the average inter-station distance in the study area from about 40 km to 15 km. In this period, a GSM-modem connection between the INGV MI-PV acquisition center and the remote stations was used. Starting in 2008, in order to assure real-time recordings and to integrate RAIS data into the calculation of the Italian ground-shaking maps, the main activity was devoted to updating the data acquisition of the RAIS strong-motion network. Moreover, a phase has begun that will replace the original recorders with 24-bit GAIA2 systems (produced by the INGV-CNT laboratory, Rome). Today, 11 of the 22 stations are already equipped with GAIA2, and their original GSM-modem acquisition systems have been replaced with real-time connections based on TCP/IP or Wi-Fi links. All real-time stations store data in MiniSEED format. Management and data exchange are assured by the SEED-Link and Earthworm packages. Metadata dissemination is achieved through the website, where the computed strong-motion parameters, together with the amplification functions, are available for each recording station and each recorded event. The waveforms for earthquakes with local magnitude higher than 3.0 are now collected in the ITalian ACcelerometric Archive (http://itaca.mi.ingv.it).

  13. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... PHARMACEUTICALS Production and Process Controls § 211.110 Sampling and testing of in-process materials and drug... production process, e.g., at commencement or completion of significant phases or after storage for long... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Sampling and testing of in-process materials...

  14. The Digital Signal Processing Platform for the Low Frequency Aperture Array: Preliminary Results on the Data Acquisition Unit

    NASA Astrophysics Data System (ADS)

    Naldi, Giovanni; Mattana, Andrea; Pastore, Sandro; Alderighi, Monica; Zarb Adami, Kristian; Schillirò, Francesco; Aminaei, Amin; Baker, Jeremy; Belli, Carolina; Comoretto, Gianni; Chiarucci, Simone; Chiello, Riccardo; D’Angelo, Sergio; Dalle Mura, Gabriele; De Marco, Andrea; Halsall, Rob; Magro, Alessio; Monari, Jader; Roberts, Matt; Perini, Federico; Poloni, Marco; Pupillo, Giuseppe; Rusticelli, Simone; Schiaffino, Marco; Zaccaro, Emanuele

    A signal processing hardware platform has been developed for the Low Frequency Aperture Array component of the Square Kilometre Array (SKA). The processing board, called an Analog Digital Unit (ADU), is able to acquire and digitize broadband (up to 500MHz bandwidth) radio-frequency streams from 16 dual polarized antennas, channel the data streams and then combine them flexibly as part of a larger beamforming system. It is envisaged that there will be more than 8000 of these signal processing platforms in the first phase of the SKA, so particular attention has been devoted to ensure the design is low-cost and low-power. This paper describes the main features of the data acquisition unit of such a platform and presents preliminary results characterizing its performance.
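An ADU-style board splits each digitized wideband stream into sub-bands before beamforming. As a rough sketch of that channelization step: the real design would use an oversampled polyphase filter bank, while the unwindowed FFT bank, the 800 MS/s sampling rate, and the 16-channel split below are all assumptions for illustration.

```python
import numpy as np

def fft_channelize(x, n_chan):
    """Crude FFT-bank channelizer: reshape the stream into frames of
    n_chan consecutive samples and FFT each frame, yielding n_chan
    critically sampled sub-band streams. A polyphase filter bank adds
    a prototype filter in front of this FFT to control leakage."""
    n = (len(x) // n_chan) * n_chan
    frames = x[:n].reshape(-1, n_chan)
    return np.fft.fft(frames, axis=1).T      # shape: (n_chan, n_frames)

fs = 800e6                                   # assumed sampling rate, S/s
t = np.arange(8192) / fs
tone = np.cos(2 * np.pi * 100e6 * t)         # 100 MHz test tone
sub = fft_channelize(tone, n_chan=16)        # 50 MHz channel spacing
power = np.mean(np.abs(sub) ** 2, axis=1)
peak_chan = int(np.argmax(power))            # tone lands in channel 100/50 = 2
```

A real tone appears in a sub-band and its conjugate mirror; here the 100 MHz tone falls exactly on the 50 MHz channel grid, so its power concentrates in channels 2 and 14.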

  15. The influence of data shape acquisition process and geometric accuracy of the mandible for numerical simulation.

    PubMed

    Relvas, C; Ramos, A; Completo, A; Simões, J A

    2011-08-01

    Computer-aided technologies have enabled new 3D modelling capabilities and engineering analyses based on experimental and numerical simulation. They have enormous potential for product development, such as biomedical instrumentation and implants. However, due to the complex shapes of anatomical structures, the accuracy of these technologies plays a key role in adequate and accurate finite element analysis (FEA). The objective of this study was to determine the influence of geometry variability between two digital models of a human mandible. Two different shape acquisition techniques, CT scan and 3D laser scan, were assessed. A total of 130 points were controlled, and the deviations between the measured points of the physical and 3D virtual models were assessed. The results of the FEA study showed a relative difference of 20% for the maximum displacement and 10% for the maximum strain between the two geometries.

  16. Testing of the Prototype Mars Drill and Sample Acquisition System in the Mars Analog Site of the Antarctica's Dry Valleys

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Paulsen, G.; McKay, C.; Glass, B. J.; Marinova, M.; Davila, A. F.; Pollard, W. H.; Jackson, A.

    2011-12-01

    We report on the testing of a one-meter-class prototype Mars drill and cuttings sampling system, called the IceBreaker, in the Dry Valleys of Antarctica. The drill consists of a rotary-percussive drill head, a sampling auger with a bit at the end having an integrated temperature sensor, a Z-stage for advancing the auger into the ground, and a sampling station for moving the augered ice shavings or soil cuttings into a sample cup. In November/December of 2010, the IceBreaker drill was tested in University Valley (within the Beacon Valley region of the Antarctic Dry Valleys). University Valley is a good analog to the northern polar regions of Mars because a layer of dry soil lies on top of either ice-cemented ground or massive ice (depending on the location within the valley). That is exactly what the 2007 Phoenix mission discovered on Mars. The drill demonstrated drilling in ice-cemented ground and in massive ice at the 1-1-100-100 level; that is, the drill reached 1 meter in 1 hour with 100 Watts of power and 100 Newtons Weight on Bit. This corresponds to an average energy of 100 Whr. At the same time, the bit temperature measured by the bit thermocouple never rose more than 10 °C above the formation temperature and never exceeded freezing, which minimizes the chances of getting stuck and of altering the materials being sampled and analyzed. Samples in the form of cuttings were acquired at 10 cm intervals into sterile bags. These tests have shown that drilling on Mars in ice-cemented ground with limited power, energy, and Weight on Bit, and collecting samples at discrete depth intervals, is possible within the mass, power, and energy levels of a Phoenix-size lander and within the duration of a Phoenix-like mission.

  17. THE BLANCO COSMOLOGY SURVEY: DATA ACQUISITION, PROCESSING, CALIBRATION, QUALITY DIAGNOSTICS, AND DATA RELEASE

    SciTech Connect

    Desai, S.; Mohr, J. J.; Semler, D. R.; Liu, J.; Bazin, G.; Zenteno, A.; Armstrong, R.; Bertin, E.; Allam, S. S.; Buckley-Geer, E. J.; Lin, H.; Tucker, D.; Barkhouse, W. A.; Cooper, M. C.; Hansen, S. M.; High, F. W.; Lin, Y.-T.; Ngeow, C.-C.; Rest, A.; Song, J.

    2012-09-20

    The Blanco Cosmology Survey (BCS) is a 60-night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, −55°) and (23 hr, −55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out point-spread-function-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 mas. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from the Two Micron All Sky Survey, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematic floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier spread_model produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1 + z) = 0.054 with an outlier fraction η < 5% to z ~ 1. We highlight some selected science results to date and provide a full description of the released data products.
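The two photo-z quality figures quoted above, the normalized scatter δz/(1+z) and the outlier fraction η, can be computed as below. The 0.15 outlier threshold is a common convention assumed here rather than taken from the paper, and the simulated catalog is purely illustrative.

```python
import numpy as np

def photoz_metrics(z_spec, z_phot, outlier_cut=0.15):
    """Normalized residual scatter and outlier fraction for a photo-z
    catalog with spectroscopic truth. The outlier cut on |dz| is an
    assumed convention, not the BCS definition."""
    dz = (z_phot - z_spec) / (1.0 + z_spec)
    scatter = np.std(dz)
    eta = np.mean(np.abs(dz) > outlier_cut)
    return scatter, eta

# Toy catalog: photo-z errors Gaussian with sigma = 0.05 * (1 + z)
rng = np.random.default_rng(1)
z_spec = rng.uniform(0.1, 1.0, 2000)
z_phot = z_spec + 0.05 * (1 + z_spec) * rng.standard_normal(2000)
scatter, eta = photoz_metrics(z_spec, z_phot)
```

Normalizing by (1 + z) is what makes a single scatter number meaningful across the whole redshift range of the survey.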

  18. The Blanco Cosmology Survey: Data Acquisition, Processing, Calibration, Quality Diagnostics and Data Release

    SciTech Connect

    Desai, S.; Armstrong, R.; Mohr, J.J.; Semler, D.R.; Liu, J.; Bertin, E.; Allam, S.S.; Barkhouse, W.A.; Bazin, G.; Buckley-Geer, E.J.; Cooper, M.C.; /UC, Irvine /Lick Observ. /UC, Santa Cruz

    2012-04-01

    The Blanco Cosmology Survey (BCS) is a 60-night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, −55°) and (23 hr, −55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out PSF-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 milliarcsec. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from 2MASS, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematic floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1+z) = 0.054 with an outlier fraction η < 5% to z ~ 1. We highlight some selected science results to date and provide a full description of the released data products.

  19. Simulation-based Decision Support for Acquisition Policy and Process Design: The Effect of System and Enterprise Characteristics on Acquisition Outcomes

    DTIC Science & Technology

    2009-04-22

    Research Policy, 30(7), 1019-1039. Ford, D.N., & Dillard, J.T. (2008). Modeling the integration of open systems and evolutionary acquisition in DoD... manufacturing firm. Research Policy, 24(3), 419-440. Ulrich, K., & Tung, K. (1991). Fundamentals of product modularity. In Issues in Design/Manufacture

  20. Phonological process analysis from spontaneous speech: the influence of sample size.

    PubMed

    Crary, M A

    1983-03-01

    Phonological process analysis is becoming a popular technique for the evaluation of unintelligible children and adults. Spontaneous speech sampling procedures have been advocated as a representative sampling base for phonological process analysis; however, little research has been reported detailing the parameters of spontaneous samples in reference to this assessment technique. The purpose of the present study was to evaluate the influence of increasing sample size on phonological process analyses from spontaneous speech. Results clearly indicated that samples of 50 words provided descriptive information similar to samples of 100 words. Additional studies are called for to investigate other variables that might influence the results of spontaneous speech analysis.

  1. Microwave-assisted tissue processing for same-day EM-diagnosis of potential bioterrorism and clinical samples.

    PubMed

    Schroeder, Josef A; Gelderblom, Hans R; Hauroeder, Baerbel; Schmetz, Christel; Milios, Jim; Hofstaedter, Ferdinand

    2006-01-01

    Microwave technology facilitates a significant reduction in sample processing time from days to hours without any loss of ultrastructural detail. Microwave-assisted processing could, therefore, be of substantial benefit for the routine electron microscopic diagnostic workload. Due to its speed and robust performance, it could be applied wherever a rapid electron microscopy diagnosis is required, e.g., if bioterrorism or emerging agents are suspected. Combining microwave technology with digital image acquisition, same-day diagnosis based on ultrathin-section electron microscopy will become possible, with crucial or interesting findings consulted on or shared worldwide with experts using modern telemicroscopy tools via the Internet.

  2. A multi-threshold sampling method for TOF PET signal processing

    SciTech Connect

    Kim, Heejong; Kao, Chien-Min; Xie, Q.; Chen, Chin-Tu; Zhou, L.; Tang, F.; Frisch, Henry; Moses, William W.; Choong, Woon-Seng

    2009-02-02

    As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multi-threshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to 8 threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25 × 6.25 × 25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and an ~18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with an ~9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules, we obtain ~300 ps coincidence timing resolution, ~14% energy resolution at 511 keV, and ~5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition.
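A minimal sketch of what a discriminator-plus-TDC front end of this kind produces, and how a decay-time constant can be recovered from the samples: each threshold yields a leading-edge and a falling-edge crossing time, i.e. (amplitude, time) samples of the waveform, and a log-linear fit to the falling-edge samples recovers the exponential tail. The pulse model, threshold values, and fit are illustrative assumptions, not the prototype's actual processing chain.

```python
import numpy as np

def threshold_crossings(t, v, thresholds):
    """Emulate multi-threshold discriminators + TDC: for each threshold,
    return linearly interpolated leading- and falling-edge crossing times
    as (threshold, t_lead, t_fall) tuples."""
    samples = []
    for th in thresholds:
        idx = np.flatnonzero(np.diff((v >= th).astype(int)))
        if len(idx) < 2:
            continue                          # threshold above the pulse peak
        def interp(i):                        # linear interp between samples
            return t[i] + (th - v[i]) * (t[i+1] - t[i]) / (v[i+1] - v[i])
        samples.append((th, interp(idx[0]), interp(idx[-1])))
    return samples

# Toy scintillation pulse: fast rise, exponential decay (tau = 44 ns).
tau = 44.0
t = np.linspace(0, 300, 3001)                 # ns
v = (1 - np.exp(-t / 2.0)) * np.exp(-t / tau)
thresholds = [0.1, 0.2, 0.4, 0.6]             # assumed levels, below the peak
s = threshold_crossings(t, v, thresholds)

# On the tail v ~ exp(-t/tau), so log(threshold) is linear in the
# falling-edge time; the slope gives -1/tau.
ths = np.array([x[0] for x in s])
t_fall = np.array([x[2] for x in s])
slope = np.polyfit(t_fall, np.log(ths), 1)[0]
tau_est = -1.0 / slope
```

Four thresholds thus yield eight timestamps per event, enough to estimate arrival time, energy (via the pulse shape), and the decay constant without any waveform digitizer.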

  3. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence-inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  4. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence-inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  5. The Role of Unconscious Information Processing in the Acquisition and Learning of Instructional Messages

    ERIC Educational Resources Information Center

    Kuldas, Seffetullah; Bakar, Zainudin Abu; Ismail, Hairul Nizam

    2012-01-01

    This review investigates how the unconscious information processing can create satisfactory learning outcomes, and can be used to ameliorate the challenges of teaching students to regulate their learning processes. The search for the ideal model of human information processing as regards achievement of teaching and learning objectives is a…

  6. Tailoring the Acquisition Process in the U.S. Department of Defense

    DTIC Science & Technology

    2015-01-01

    Programs are not all the same, and policy allows (even encourages) PMs to customize regulatory-based reviews, processes, and information requirements to accommodate the unique characteristics of a program while still…

  7. Association of Campylobacter spp. levels between chicken grow-out environmental samples and processed carcasses.

    PubMed

    Schroeder, Matthew W; Eifert, Joseph D; Ponder, Monica A; Schmale, David G

    2014-03-01

    Campylobacter spp. have been isolated from live poultry, production environments, processing facilities, and raw poultry products. Environmental sampling in a poultry grow-out house, combined with carcass rinse sampling from the same flock, may reveal a relative relationship between pre- and postharvest Campylobacter contamination. Air samples, fecal/litter samples, and feed/drink line samples were collected from 4 commercial chicken grow-out houses in western Virginia between September 2011 and January 2012. Birds from each sampled house were the first flock slaughtered the following day and were then sampled by postchill carcass rinses. Campylobacter, from postenrichment samples, was detected in 27% (32/120) of house environmental samples and 37.5% (45/120) of carcass rinse samples. All environmental sample types from each house included at least one positive sample except the house 2 air samples. The sponge sample method was found to have a significantly higher (P < 0.05) proportion of Campylobacter-positive samples (45%) than the fecal/litter samples (20%) and air samples (15%) when sample types of all the houses were compared. The proportion positive for the fecal/litter samples postenrichment, for each flock, had the highest correlation (0.85) with the proportion of positive carcass rinse samples for each flock. Environmental samples from house 1 and associated carcass rinses accounted for the largest number of Campylobacter positives (29/60). The fewest Campylobacter positives, based on both house environmental (4/30) and carcass rinse samples (8/30), were detected from flock B. The results of this study suggest that environmental sampling in a poultry grow-out house, combined with carcass rinse sampling from the same flock, has the potential to provide an indication of Campylobacter contamination and transmission. Campylobacter qualitative levels from house and processing plant samples may enable the scheduled processing of flocks with lower

  8. Robotic Arm Manipulator Using Active Control for Sample Acquisition and Transfer, and Passive Mode for Surface Compliance

    NASA Technical Reports Server (NTRS)

    Liu, Jun; Underhill, Michael L.; Trease, Brian P.; Lindemann, Randel A.

    2010-01-01

    A robotic arm that consists of three joints with four degrees of freedom (DOF) has been developed. It can carry an end-effector to acquire and transfer samples by using active control and comply with surface topology in a passive mode during a brief surface contact. The three joints are arranged in such a way that one joint of two DOFs is located at the shoulder, one joint of one DOF is located at the elbow, and one joint of one DOF is located at the wrist. Operationally, three DOFs are moved in the same plane, and the remaining one on the shoulder is moved perpendicular to the other three for better compliance with ground surface and more flexibility of sample handling. Three out of four joints are backdriveable, making the mechanism less complex and more cost effective

  9. The Approach to Sample Acquisition and Its Impact on the Derived Human Fecal Microbiome and VOC Metabolome

    PubMed Central

    Couch, Robin D.; Navarro, Karl; Sikaroodi, Masoumeh; Gillevet, Pat; Forsyth, Christopher B.; Mutlu, Ece; Engen, Phillip A.; Keshavarzian, Ali

    2013-01-01

    Recent studies have illustrated the importance of the microbiota in maintaining a healthy state, as well as promoting disease states. The intestinal microbiota exerts its effects primarily through its metabolites, and metabolomics investigations have begun to evaluate the diagnostic and health implications of volatile organic compounds (VOCs) isolated from human feces, enabled by specialized sampling methods such as headspace solid-phase microextraction (hSPME). The approach to stool sample collection is an important consideration that could potentially introduce bias and affect the outcome of a fecal metagenomic and metabolomic investigation. To address this concern, a comparison of endoscopically collected (in vivo) and home collected (ex vivo) fecal samples was performed, revealing slight variability in the derived microbiomes. In contrast, the VOC metabolomes differ widely between the home collected and endoscopy collected samples. Additionally, as the VOC extraction profile is hyperbolic, with short extraction durations more vulnerable to variation than extractions continued to equilibrium, a second goal of our investigation was to ascertain if hSPME-based fecal metabolomics studies might be biased by the extraction duration employed. As anticipated, prolonged extraction (18 hours) results in the identification of considerably more metabolites than short (20 minute) extractions. A comparison of the metabolomes reveals several analytes deemed unique to a cohort with the 20 minute extraction, but found common to both cohorts when the VOC extraction was performed for 18 hours. Moreover, numerous analytes perceived to have significant fold change with a 20 minute extraction were found insignificant in fold change with the prolonged extraction, underscoring the potential for bias associated with a 20 minute hSPME. PMID:24260553

  10. Acquisition process of typing skill using hierarchical materials in the Japanese language.

    PubMed

    Ashitaka, Yuki; Shimada, Hiroyuki

    2014-08-01

    In the present study, using a new keyboard layout with only eight keys, we conducted typing training for unskilled typists. In this task, Japanese college students received training in typing words consisting of a pair of hiragana characters with four keystrokes, using the alphabetic input method, while keeping the association between the keys and typists' finger movements; the task was constructed so that chunking was readily available. We manipulated the association between the hiragana characters and alphabet letters (hierarchical materials: overlapped and nonoverlapped mappings). Our alphabet letter materials corresponded to the regular order within each hiragana word (within the four letters, the first and third referred to consonants, and the second and fourth referred to vowels). Only the interkeystroke intervals involved in the initiation of typing vowel letters showed an overlapping effect, which revealed that the effect was markedly large only during the early period of skill development (the effect for the overlapped mapping being larger than that for the nonoverlapped mapping), but that it had diminished by the time of late training. Conversely, the response time and the third interkeystroke interval, which are both involved in the latency of typing a consonant letter, did not reveal an overlapped effect, suggesting that chunking might be useful with hiragana characters rather than hiragana words. These results are discussed in terms of the fan effect and skill acquisition. Furthermore, we discuss whether there is a need for further research on unskilled and skilled Japanese typists.

  11. Learning a generative probabilistic grammar of experience: a process-level model of language acquisition.

    PubMed

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-03-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach.
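
    The directed weighted graph described above can be caricatured, at a vastly reduced scale, as a graph of word-to-word transitions whose edge weights are observed counts. The sketch below is an illustrative stand-in (bigram transitions only, no hierarchical patterns), not the authors' model:

```python
import random
from collections import defaultdict

def learn_graph(corpus):
    # Directed weighted graph: edge weight = observed transition count
    graph = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            graph[a][b] += 1
    return graph

def generate(graph, start, length=5, seed=0):
    # Random walk through the graph, weighted by transition counts
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = graph.get(out[-1])
        if not successors:
            break
        words, weights = zip(*successors.items())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

corpus = ["the dog chased the cat", "the cat saw the dog"]
g = learn_graph(corpus)
print(generate(g, "the"))
```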

  12. Lexical Processing and Organization in Bilingual First Language Acquisition: Guiding Future Research

    PubMed Central

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-01-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between two languages in the early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. PMID:26866430

  13. Lexical processing and organization in bilingual first language acquisition: Guiding future research.

    PubMed

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-06-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between 2 languages in early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. (PsycINFO Database Record

  14. QC of sampling processes- a first overview: from field to test portion.

    PubMed

    Esbensen, Kim H; Ramsey, Charles A

    2015-01-01

    Quality control (QC) is a systematic approach for estimating and minimizing significant error contributions to the measurement uncertainty from the full sampling and analysis process. Many types of QC measures can be implemented; the three dealt with here are primary sampling reproducibility, sample processing reproducibility, and contamination. Sampling processes can be subject to QC by applying a replication experiment, used either from the top by replication of the entire sampling/preparation/analysis process, or in a hierarchical fashion successively at each subsequent sampling stage. The analytical repeatability is necessarily always included in either alternative. The replication experiment results in a quality index, the Relative Sampling Variability, which is used to assess the total error associated with the full field-to-analysis pathway. Contamination can occur at essentially all locations in the sampling regimen in the food/feed realm, affecting sample containers, sampling tools, sample processing equipment, environmental conditions, and sampling personnel. QC events to determine contamination should always be included where appropriate, but contamination is of most concern for low-concentration and/or volatile analytes. It is also of key importance in the development of new sampling protocols, or of carried-over protocols intended for use on new types of materials/lots other than the ones for which they were originally developed. Here we establish a first practical framework for QC as applied to the sampling context.
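
    In its simplest reading, the Relative Sampling Variability index mentioned above is the relative standard deviation of replicate field-to-analysis results. A minimal sketch, using hypothetical replicate values:

```python
import statistics

def relative_sampling_variability(replicates):
    # RSV (%): relative standard deviation of replicate results from the
    # full field-to-analysis pathway. Simplified reading of the replication
    # experiment; values and any acceptance thresholds are illustrative.
    mean = statistics.mean(replicates)
    stdev = statistics.stdev(replicates)  # sample standard deviation
    return 100.0 * stdev / mean

# Ten replicate analyses of the same lot (hypothetical values, mg/kg)
results = [4.1, 3.8, 4.4, 3.9, 4.2, 4.0, 3.7, 4.3, 4.1, 3.9]
rsv = relative_sampling_variability(results)
print(f"RSV = {rsv:.1f}%")
```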

  15. An Exploratory Analysis of the U.S. System of Major Defense Acquisition Utilizing the CLIOS Process

    DTIC Science & Technology

    2009-09-01

    ...The United States' Defense Acquisition System...other top military spending countries. It will end with a review of the major defense acquisition literature. This literature review will focus on...Sector Survey on Cost Control, to examine the entire government, including defense acquisition, looking for ways to avoid wasteful public spending.

  16. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  17. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  18. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  19. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  20. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  1. Data acquisition with Masscomp

    SciTech Connect

    Collins, A.J.

    1988-12-31

    Applications and products for data acquisition and control are abundant. Systems and boards for Apple or IBM products collect, store, and manipulate data at rates up to the tens of thousands of samples per second. These systems may suit your application; if so, one of them is worth obtaining. However, if you need speeds in the hundreds of thousands of samples per second and you want to store, process, and display data in real time, data acquisition becomes much more complex, and the operating system code has to be sufficient to handle the load. Massachusetts Computer Corporation (Masscomp) has modified UNIX operating system code to allow real-time data acquisition and control. They call this operating system Real Time Unix, or RTU, and have built a family of computer systems around it with specialized hardware to handle the multiple processes and quick communications that a real-time operating system needs. This paper covers the basics of an application using a Masscomp 5520 computer: the KYLE Project Cold Tests in SRL. KYLE is a classified weapons program. The data flow from source to Masscomp, the generic features of Masscomp systems, and the specifics of the Masscomp computer related to this application are presented.

  2. Apollo experience report: Processing of lunar samples in a sterile nitrogen atmosphere

    NASA Technical Reports Server (NTRS)

    Mcpherson, T. M.

    1972-01-01

    A sterile nitrogen atmosphere processing cabinet line was installed in the Lunar Receiving Laboratory to process returned lunar samples with minimum organic contamination. Design and operation of the cabinet line were complicated by the requirement for biological sterilization and isolation, which necessitated extensive filtration, leak-checking, and system sterilization before use. Industrial techniques were applied to lunar sample processing to meet requirements for time-critical experiments while handling a large flow of samples.

  3. Developmental Trends in Auditory Processing Can Provide Early Predictions of Language Acquisition in Young Infants

    ERIC Educational Resources Information Center

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R.; Shao, Jie; Lozoff, Betsy

    2013-01-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with…

  4. Defense Health Care: Acquisition Process for TRICARE’s Third Generation of Managed Care Support Contracts

    DTIC Science & Technology

    2014-03-01

    Selection Team Roles and Responsibilities for TRICARE’s Contract Award Process 11 Figure 3: Timeline of Bid Protest Events for TRICARE’s North Region...14-195 TRICARE Managed Care Support Contracts Figure 2: Source Selection Team Roles and Responsibilities for TRICARE’s Contract Award Process

  5. Acquisition Reform: DOD Should Streamline Its Decision-Making Process for Weapon Systems to Reduce Inefficiencies

    DTIC Science & Technology

    2015-02-01

    a more streamlined milestone decision process for some classified programs. Commercial companies we examined—Boeing, Caterpillar, Cummins, Honda...two months to complete the milestone decision process. Commercial companies we examined—Boeing, Caterpillar, Cummins, Honda, and Motorola...Caterpillar Inc. (Caterpillar), a leading manufacturer of construction and mining equipment, diesel and natural gas engines, and industrial gas

  6. FTMP data acquisition environment

    NASA Technical Reports Server (NTRS)

    Padilla, Peter A.

    1988-01-01

    The Fault-Tolerant Multi-Processing (FTMP) test-bed data acquisition environment is described. The performance of two data acquisition devices available in the test environment are estimated and compared. These estimated data rates are used as measures of the devices' capabilities. A new data acquisition device was developed and added to the FTMP environment. This path increases the data rate available by approximately a factor of 8, to 379 KW/S, while simplifying the experiment development process.

  7. Defense Resource Management Studies: Introduction to Capability and Acquisition Planning Processes

    DTIC Science & Technology

    2010-08-01

    Evaluation Plans (BEPs) establish how the MoD-approved procurement action will proceed – dates for requesting and receiving bids – evaluation period...Proposed Bid and Evaluation Plan (BEP)...Intended Process and Products...Funding Level and Source; Circular of Requirements (COR) with Key Performance Parameters (KPPs); Bid and Evaluation Plan (BEP)

  8. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  9. Learning from Each Other: Comparative Analysis of the Acquisition Process of Lithuania and U.S.

    DTIC Science & Technology

    2006-06-01

    steps in the process are sometimes inconsistent. Interviewed personnel also complain that the market research and analysis of alternatives is often...and analyses of alternatives are conducted. In the U.S., market research and analysis of alternatives is a more in-depth and structured process...defined. Additionally, emphasis should be placed on two key steps: market research and analysis of alternatives. • Technical specifications: The most

  10. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  11. Mass spectrometry in plant metabolomics strategies: from analytical platforms to data acquisition and processing.

    PubMed

    Ernst, Madeleine; Silva, Denise Brentan; Silva, Ricardo Roberto; Vêncio, Ricardo Z N; Lopes, Norberto Peporine

    2014-06-01

    Covering: up to 2013. Plant metabolomics is a relatively recent research field that has gained increasing interest in the past few years. Up to the present day numerous review articles and guide books on the subject have been published. This review article focuses on the current applications and limitations of the modern mass spectrometry techniques, especially in combination with electrospray ionisation (ESI), an ionisation method which is most commonly applied in metabolomics studies. As a possible alternative to ESI, perspectives on matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) in metabolomics studies are introduced, a method which still is not widespread in the field. In metabolomics studies the results must always be interpreted in the context of the applied sampling procedures as well as data analysis. Different sampling strategies are introduced and the importance of data analysis is illustrated in the example of metabolic network modelling.

  12. Abdominal 4D Flow MR Imaging in a Breath Hold: Combination of Spiral Sampling and Dynamic Compressed Sensing for Highly Accelerated Acquisition

    PubMed Central

    Knight-Greenfield, Ashley; Jajamovich, Guido; Besa, Cecilia; Cui, Yong; Stalder, Aurélien; Markl, Michael; Taouli, Bachir

    2015-01-01

    Purpose To develop a highly accelerated phase-contrast cardiac-gated volume flow measurement (four-dimensional [4D] flow) magnetic resonance (MR) imaging technique based on spiral sampling and dynamic compressed sensing and to compare this technique with established phase-contrast imaging techniques for the quantification of blood flow in abdominal vessels. Materials and Methods This single-center prospective study was compliant with HIPAA and approved by the institutional review board. Ten subjects (nine men, one woman; mean age, 51 years; age range, 30–70 years) were enrolled. Seven patients had liver disease. Written informed consent was obtained from all participants. Two 4D flow acquisitions were performed in each subject, one with use of Cartesian sampling with respiratory tracking and the other with use of spiral sampling and a breath hold. Cartesian two-dimensional (2D) cine phase-contrast images were also acquired in the portal vein. Two observers independently assessed vessel conspicuity on phase-contrast three-dimensional angiograms. Quantitative flow parameters were measured by two independent observers in major abdominal vessels. Intertechnique concordance was quantified by using Bland-Altman and logistic regression analyses. Results There was moderate to substantial agreement in vessel conspicuity between 4D flow acquisitions in arteries and veins (κ = 0.71 and 0.61, respectively, for observer 1; κ = 0.71 and 0.44 for observer 2), whereas more artifacts were observed with spiral 4D flow (κ = 0.30 and 0.20). Quantitative measurements in abdominal vessels showed good equivalence between spiral and Cartesian 4D flow techniques (lower bound of the 95% confidence interval: 63%, 77%, 60%, and 64% for flow, area, average velocity, and peak velocity, respectively). For portal venous flow, spiral 4D flow was in better agreement with 2D cine phase-contrast flow (95% limits of agreement: −8.8 and 9.3 mL/sec, respectively) than was Cartesian 4D flow (95
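
    The 95% limits of agreement quoted above come from a Bland-Altman analysis: the mean difference between paired measurement methods plus or minus 1.96 times the standard deviation of the differences. A minimal sketch with hypothetical paired flow values, not the study's data:

```python
import statistics

def bland_altman_limits(a, b):
    # 95% limits of agreement between two paired measurement methods:
    # bias (mean difference) +/- 1.96 * SD of the paired differences
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired portal venous flow measurements (mL/sec)
spiral_4d_flow = [12.1, 9.8, 14.3, 11.0, 10.5, 13.2]
cine_2d_flow   = [11.7, 10.2, 13.9, 11.4, 10.1, 13.5]
lo, hi = bland_altman_limits(spiral_4d_flow, cine_2d_flow)
print(f"95% limits of agreement: {lo:.2f} to {hi:.2f} mL/sec")
```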

  13. Second-first language acquisition: analysis of expressive language skills in a sample of girls adopted from China.

    PubMed

    Tan, Tony Xing; Loker, Troy; Dedrick, Robert F; Marfo, Kofi

    2012-03-01

    In this study we investigated adopted Chinese girls' expressive English language outcomes in relation to their age at adoption, chronological age, length of exposure to English and developmental risk status at the time of adoption. Vocabulary and phrase utterance data on 318 girls were collected from the adoptive mothers using the Language Development Survey (LDS) (Achenbach & Rescorla, 2000). The girls, aged 18-35 months (M=26·2 months, SD=4·9 months), were adopted at ages ranging from 6·8 to 24 months (M=12·6 months, SD=3·1 months), and had been exposed to English for periods ranging from 1·6 to 27·6 months (M=13·7, SD=5·7). Findings suggest that vocabulary and mean length of phrase scores were negatively correlated with age at adoption but positively correlated with chronological age and length of exposure to English. Developmental risk status at the time of adoption was not correlated with language outcomes. The gap between their expressive language and that of same-age girls from the US normative sample was wider for children aged 18-23 months but was closed for children aged 30-35 months. About 16% of the children met the LDS criteria for delays in vocabulary and 17% met the LDS criteria for delays in mean length of phrase. Speech/language interventions were received by 33·3% of the children with delays in vocabulary and 25% with delays in phrase.

  14. How human resource organization can enhance space information acquisition and processing: the experience of the VENESAT-1 ground segment

    NASA Astrophysics Data System (ADS)

    Acevedo, Romina; Orihuela, Nuris; Blanco, Rafael; Varela, Francisco; Camacho, Enrique; Urbina, Marianela; Aponte, Luis Gabriel; Vallenilla, Leopoldo; Acuña, Liana; Becerra, Roberto; Tabare, Terepaima; Recaredo, Erica

    2009-12-01

    On October 29, 2008, the Bolivarian Republic of Venezuela launched its first telecommunications satellite, VENESAT-1 (Simón Bolívar Satellite), built in cooperation with the P.R. of China; it operates in C band (covering Central America, the Caribbean region, and most of South America), Ku band (Bolivia, Cuba, Dominican Republic, Haiti, Paraguay, Uruguay, Venezuela), and Ka band (Venezuela). The launch of VENESAT-1 represents the starting point for Venezuela as an active player in the field of space science and technology. In order to fulfill mission requirements and to guarantee the satellite's health, local professionals must provide continuous monitoring, orbit calculation, maneuver preparation and execution, data preparation and processing, as well as database management at the VENESAT-1 Ground Segment, which includes both a primary and a backup site. In summary, data processing and real-time data management are part of the daily activities performed by the personnel at the ground segment. Using published and unpublished information, this paper presents how human resource organization can enhance space information acquisition and processing, by analyzing the proposed organizational structure for the VENESAT-1 Ground Segment. We have found that the proposed units within the organizational structure reflect 3 key issues for mission management: Satellite Operations, Ground Operations, and Site Maintenance. The proposed organization is simple (3 hierarchical levels and 7 units), and communication channels seem efficient in terms of facilitating information acquisition, processing, storage, flow and exchange. Furthermore, the proposal includes a manual containing the full description of personnel responsibilities and profiles, which efficiently allocates the management and operation of key software for satellite operation such as the Real-time Data Transaction Software (RDTS), Data Management Software (DMS), and Carrier Spectrum Monitoring Software (CSM

  15. The effect of age of acquisition, socioeducational status, and proficiency on the neural processing of second language speech sounds.

    PubMed

    Archila-Suerte, Pilar; Zevin, Jason; Hernandez, Arturo E

    2015-02-01

    This study investigates the role of age of acquisition (AoA), socioeducational status (SES), and second language (L2) proficiency on the neural processing of L2 speech sounds. In a task of pre-attentive listening and passive viewing, Spanish-English bilinguals and a control group of English monolinguals listened to English syllables while watching a film of natural scenery. Eight regions of interest were selected from brain areas involved in speech perception and executive processes. The regions of interest were examined in 2 separate two-way ANOVA (AoA×SES; AoA×L2 proficiency). The results showed that AoA was the main variable affecting the neural response in L2 speech processing. Direct comparisons between AoA groups of equivalent SES and proficiency level enhanced the intensity and magnitude of the results. These results suggest that AoA, more than SES and proficiency level, determines which brain regions are recruited for the processing of second language speech sounds.

  16. On the influence of typicality and age of acquisition on semantic processing: Diverging evidence from behavioural and ERP responses.

    PubMed

    Räling, Romy; Holzgrefe-Lang, Julia; Schröder, Astrid; Wartenburger, Isabell

    2015-08-01

    Various behavioural studies show that semantic typicality (TYP) and age of acquisition (AOA) of a specific word influence processing time and accuracy during the performance of lexical-semantic tasks. This study examines the influence of TYP and AOA on semantic processing at behavioural (response times and accuracy data) and electrophysiological levels using an auditory category-member-verification task. Reaction time data reveal independent TYP and AOA effects, while in the accuracy data and the event-related potentials predominantly effects of TYP can be found. The present study thus confirms previous findings and extends evidence found in the visual modality to the auditory modality. A modality-independent influence on semantic word processing is manifested. However, with regard to the influence of AOA, the diverging results raise questions on the origin of AOA effects as well as on the interpretation of offline and online data. Hence, results will be discussed against the background of recent theories on N400 correlates in semantic processing. In addition, an argument in favour of a complementary use of research techniques will be made.

  17. Intonational Phrase Structure Processing at Different Stages of Syntax Acquisition: ERP Studies in 2-, 3-, and 6-Year-Old Children

    ERIC Educational Resources Information Center

    Mannel, Claudia; Friederici, Angela D.

    2011-01-01

    This study explored the electrophysiology underlying intonational phrase processing at different stages of syntax acquisition. Developmental studies suggest that children's syntactic skills advance significantly between 2 and 3 years of age. Here, children of three age groups were tested on phrase-level prosodic processing before and after this…

  18. TU-D-BRB-01: Dual-Energy CT: Techniques in Acquisition and Image Processing.

    PubMed

    Pelc, N

    2016-06-01

    Dual-energy CT technology is becoming increasingly available to the medical imaging community, and several models of CT simulators sold for use in radiation therapy departments now feature dual-energy capability. The images provided by dual-energy CT scanners add new information to the radiation treatment planning process; multiple spectral components can be used to separate and identify material composition as well as to generate virtual monoenergetic images. In turn, this information can be used to investigate pathologic processes, separate the properties of contrast agents from soft tissues, assess tissue response to therapy, and support other applications of therapeutic interest. Additionally, the decomposition of materials in images could integrate directly with, and improve the accuracy of, dose calculation algorithms. This symposium will explore methods of generating dual-energy CT images, spectral and image analysis algorithms, current and future applications of interest in oncologic imaging, and unique considerations when using dual-energy CT images in the radiation treatment planning process.
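    The material separation and virtual monoenergetic images mentioned above rest on basis-material decomposition: two attenuation measurements per voxel, two basis materials, one small linear system. A minimal sketch, using hypothetical (non-clinical) attenuation coefficients invented for illustration:

    ```python
    import numpy as np

    # Toy two-material decomposition. The coefficients below are invented for
    # illustration, NOT clinical values: each column gives the attenuation of a
    # pure basis material (water, bone) at the low- and high-kVp spectra.
    A = np.array([[0.20, 0.50],   # low-kVp:  water, bone
                  [0.18, 0.30]])  # high-kVp: water, bone

    def decompose(mu_low, mu_high):
        """Solve A @ [f_water, f_bone] = [mu_low, mu_high] for one voxel."""
        return np.linalg.solve(A, np.array([mu_low, mu_high]))

    # A voxel containing only water should return fractions close to [1, 0];
    # the fractions can then weight tabulated coefficients at any single energy
    # to synthesize a virtual monoenergetic value.
    f = decompose(0.20, 0.18)
    ```

    In practice the decomposition is done per pixel over whole image volumes and the basis coefficients come from calibration, but the per-voxel algebra is exactly this 2×2 solve.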

  19. Fast multi-dimensional NMR acquisition and processing using the sparse FFT.

    PubMed

    Hassanieh, Haitham; Mayzel, Maxim; Shi, Lixin; Katabi, Dina; Orekhov, Vladislav Yu

    2015-09-01

    Increasing the dimensionality of NMR experiments strongly enhances the spectral resolution and provides invaluable direct information about atomic interactions. However, the price tag is high: long measurement times and heavy requirements on the computation power and data storage. We introduce sparse fast Fourier transform as a new method of NMR signal collection and processing, which is capable of reconstructing high quality spectra of large size and dimensionality with short measurement times, faster computations than the fast Fourier transform, and minimal storage for processing and handling of sparse spectra. The new algorithm is described and demonstrated for a 4D BEST-HNCOCA spectrum.
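    The premise the sparse FFT exploits can be shown in a toy example: a signal built from a few resonances has a spectrum concentrated in a handful of Fourier coefficients, so almost nothing is lost by recovering only the largest ones. The sketch below uses an ordinary FFT to demonstrate the sparsity assumption; it is not the sublinear-time algorithm of the paper:

    ```python
    import numpy as np

    # NMR-like toy signal: three cosines at exact bin frequencies, so the
    # spectrum has exactly 6 nonzero bins (each real cosine -> 2 conjugate bins).
    n = 1024
    t = np.arange(n)
    signal = (np.cos(2 * np.pi * 50 * t / n)
              + 0.5 * np.cos(2 * np.pi * 120 * t / n)
              + 0.25 * np.cos(2 * np.pi * 300 * t / n))

    spectrum = np.fft.fft(signal)
    k = 6
    idx = np.argsort(np.abs(spectrum))[-k:]   # keep only the k largest bins
    sparse = np.zeros_like(spectrum)
    sparse[idx] = spectrum[idx]

    recon = np.fft.ifft(sparse).real
    rel_err = np.linalg.norm(recon - signal) / np.linalg.norm(signal)
    ```

    A sparse FFT computes those few dominant coefficients without ever touching most of the input, which is what allows short measurement times and minimal storage in high-dimensional experiments.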

  20. Human resource processes and the role of the human resources function during mergers and acquisitions in the electricity industry

    NASA Astrophysics Data System (ADS)

    Dass, Ted K.

    Mergers and acquisitions (M&A) have been a popular strategy for organizations to consolidate and grow for more than a century. However, research in this field indicates that M&A are more likely to fail than succeed, with failure rates estimated to be as high as 75%. People-related issues have been identified as important causes for the high failure rate, but these issues are largely neglected until after the deal is closed. One explanation for this neglect is the low involvement of human resource (HR) professionals and the HR function during the M&A process. The strategic HR management literature suggests that a larger role for HR professionals in the M&A process would enable organizations to identify potential problems early and devise appropriate solutions. However, empirical research from an HR perspective has been scarce in this area. This dissertation examines the role of the HR function and the HR processes followed in organizations during M&A. Employing a case-study research design, this study examines M&A undertaken by two large organizations in the electricity industry through the lens of a "process" perspective. Based on converging evidence, the case studies address three sets of related issues: (1) how do organizations undertake and manage M&A; (2) what is the extent of HR involvement in M&A and what role does it play in the M&A process; and (3) what factors explain HR involvement in the M&A process and, more generally, in the formulation of corporate goals and strategies. Results reveal the complexity of issues faced by organizations in undertaking M&A, the variety of roles played by HR professionals, and the importance of several key contextual factors---internal and external to the organization---that influence HR involvement in the M&A process. Further, several implications for practice and future research are explored.

  1. An architecture for real time data acquisition and online signal processing for high throughput tandem mass spectrometry

    SciTech Connect

    Shah, Anuj R.; Jaitly, Navdeep; Zuljevic, Nino; Monroe, Matthew E.; Liyu, Andrei V.; Polpitiya, Ashoka D.; Adkins, Joshua N.; Belov, Mikhail E.; Anderson, Gordon A.; Smith, Richard D.; Gorton, Ian

    2010-12-09

    Independent, greedy collection of data events using simple heuristics results in massive over-sampling of the prominent data features in large-scale studies over what should be achievable through “intelligent,” online acquisition of such data. As a result, data generated are more aptly described as a collection of a large number of small experiments rather than a true large-scale experiment. Nevertheless, achieving “intelligent,” online control requires tight interplay between state-of-the-art, data-intensive computing infrastructure developments and analytical algorithms. In this paper, we propose a Software Architecture for Mass spectrometry-based Proteomics coupled with Liquid chromatography Experiments (SAMPLE) to develop an “intelligent” online control and analysis system to significantly enhance the information content from each sensor (in this case, a mass spectrometer). Using online analysis of data events as they are collected and decision theory to optimize the collection of events during an experiment, we aim to maximize the information content generated during an experiment by the use of pre-existing knowledge to optimize the dynamic collection of events.
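    The contrast between greedy and "intelligent" acquisition can be caricatured in a few lines: instead of always fragmenting the most intense precursor, weight each candidate's intensity by a novelty factor so that already-characterized features lose priority. The function name and scoring rule below are illustrative assumptions, not the SAMPLE implementation:

    ```python
    # Hypothetical sketch of intensity-times-novelty precursor selection.
    # candidates: {m/z: survey-scan intensity}; times_sampled: {m/z: prior picks}.

    def pick_precursor(candidates, times_sampled):
        def score(mz):
            novelty = 1.0 / (1 + times_sampled.get(mz, 0))  # decays with re-sampling
            return candidates[mz] * novelty
        return max(candidates, key=score)

    peaks = {500.3: 1000.0, 622.8: 400.0}   # current survey-scan intensities
    history = {500.3: 4}                    # m/z 500.3 already sampled 4 times

    chosen = pick_precursor(peaks, history) # novelty tips the choice to 622.8
    ```

    A purely greedy controller (empty history) would keep returning 500.3; the novelty weight is the simplest stand-in for the decision-theoretic use of pre-existing knowledge described above.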

  2. Confusing similar words: ERP correlates of lexical-semantic processing in first language attrition and late second language acquisition.

    PubMed

    Kasparian, Kristina; Steinhauer, Karsten

    2016-12-01

    First language (L1) attrition is a socio-linguistic circumstance where second language (L2) learning coincides with changes in exposure and use of the native-L1. Attriters often report experiencing a decline in automaticity or proficiency in their L1 after a prolonged period in the L2 environment, while their L2 proficiency continues to strengthen. Investigating the neurocognitive correlates of attrition alongside those of late L2 acquisition addresses the question of whether the brain mechanisms underlying both L1 and L2 processing are strongly determined by proficiency, irrespective of whether the language was acquired from birth or in adulthood. Using event-related-potentials (ERPs), we examined lexical-semantic processing in Italian L1 attriters, compared to adult Italian L2 learners and to Italian monolingual native speakers. We contrasted the processing of classical lexical-semantic violations (Mismatch condition) with sentences that were equally semantically implausible but arguably trickier, as the target-noun was "swapped" with an orthographic neighbor that differed only in its final vowel and gender-marking morpheme (e.g., cappello (hat) vs. cappella (chapel)). Our aim was to determine whether sentences with such "confusable nouns" (Swap condition) would be processed as semantically correct by late L2 learners and L1 attriters, especially for those individuals with lower Italian proficiency scores. We found that lower-proficiency Italian speakers did not show significant N400 effects for Swap violations relative to correct sentences, regardless of whether Italian was the L1 or the L2. Crucially, N400 response profiles followed a continuum of "nativelikeness" predicted by Italian proficiency scores - high-proficiency attriters and high-proficiency Italian learners were indistinguishable from native controls, whereas attriters and L2 learners in the lower-proficiency range showed significantly reduced N400 effects for "Swap" errors. 
Importantly, attriters…

  3. Case Studies of the Air Force Aerospace Ground Equipment (AGE) acquisition Management Process

    DTIC Science & Technology

    1975-12-01

    the methods for evaluating alternatives. A previous LMI study (LMI Task 72-1 Rev.) was undertaken at the request of the Air Force during 1972. That…developed for the sample. The sets of characteristics chosen include various functional types, levels of use, and methods of procurement, including Air…methods of procurement of the 76 items selected.

  4. The Effect of Processing Instruction and Dictogloss Tasks on Acquisition of the English Passive Voice

    ERIC Educational Resources Information Center

    Qin, Jingjing

    2008-01-01

    This study was intended to compare processing instruction (VanPatten, 1993, 1996, 2000), an input-based focus on form technique, to dictogloss tasks, an output-oriented focus-on-form type of instruction to assess their effects in helping beginning-EFL (English as a Foreign Language) learners acquire the simple English passive voice. Two intact…

  5. Using Processing Instruction for the Acquisition of English Present Perfect of Filipinos

    ERIC Educational Resources Information Center

    Erfe, Jonathan P.; Lintao, Rachelle B.

    2012-01-01

    This is an experimental study on the relative effects of Van Patten's Processing Instruction (PI) (1996, 2002), a "psycholinguistically-motivated" intervention in teaching second-language (L2) grammar, on young-adult Filipino learners of English. A growing body of research on this methodological alternative, which establishes…

  6. Production and Processing Asymmetries in the Acquisition of Tense Morphology by Sequential Bilingual Children

    ERIC Educational Resources Information Center

    Chondrogianni, Vasiliki; Marinis, Theodoros

    2012-01-01

    This study investigates the production and online processing of English tense morphemes by sequential bilingual (L2) Turkish-speaking children with more than three years of exposure to English. Thirty-nine six- to nine-year-old L2 children and twenty-eight typically developing age-matched monolingual (L1) children were administered the production…

  7. The RFP Process: Effective Management of the Acquisition of Library Materials.

    ERIC Educational Resources Information Center

    Wilkinson, Frances C.; Thorson, Connie Capers

    Many librarians view procurement, with its myriad forms, procedures, and other organizational requirements, as a tedious or daunting challenge. This book simplifies the process, showing librarians how to successfully prepare a Request for Proposal (RFP) and make informed decisions when determining which vendors to use for purchasing library…

  8. Using Eye-Tracking to Investigate Topics in L2 Acquisition and L2 Processing

    ERIC Educational Resources Information Center

    Roberts, Leah; Siyanova-Chanturia, Anna

    2013-01-01

    Second language (L2) researchers are becoming more interested in both L2 learners' knowledge of the target language and how that knowledge is put to use during real-time language processing. Researchers are therefore beginning to see the importance of combining traditional L2 research methods with those that capture the moment-by-moment…

  9. Analyzing Preschoolers' Overgeneralizations of Object Labeling in the Process of Mother-Tongue Acquisition in Turkey

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2006-01-01

    Language, as is known, is acquired under certain conditions: rapid and sequential brain maturation and cognitive development, the need to exchange information and to control others' actions, and an exposure to appropriate speech input. This research aims at analyzing preschoolers' overgeneralizations of the object labeling process in different…

  10. Transitioning Science and Technology into Acquisition Programs: Assessing One Government Laboratorys Processes

    DTIC Science & Technology

    2015-12-01

    transition process between the Armament Research, Development and Engineering Center (ARDEC) and its partnered program offices in transitioning technology…following recommendations should be implemented by other research and development (R&D) organizations to foster proper technology transition: endorsement…and introduction of technology transition agreements. Research also indicated that in order for ARDEC to continue to improve its technology

  11. Modeling Space Launch Process Delays to Improve Space Vehicle Acquisition Timelines

    DTIC Science & Technology

    2013-06-01

    evolution of the space launch process over the past 15 years is the Space Launch Vehicle Broad Area Review (SLV BAR). The SLV BAR, led by Gen. Larry…to 9 in 51 launches, a 100% increase (US Air Force, 1999). The SLV BAR began a period of intense scrutiny related to launch vehicle mission

  12. Cognitive Processes and Learner Strategies in the Acquisition of Motor Skills

    DTIC Science & Technology

    1978-12-01

    Kelso & Frekany, 1978). Thus, it is imperative that explicit procedures be designed to control the allocation of attention to feedback cues in…dissertation, Teachers College, Columbia University, 1974. Nacson, J., Jaeger, M., & Gentile, A. M. Organizational processes in short-term memory. In I. D

  13. The Effectiveness of Processing Instruction in L2 Grammar Acquisition: A Narrative Review

    ERIC Educational Resources Information Center

    Dekeyser, Robert; Botana, Goretti Prieto

    2015-01-01

    The past two decades have seen ample debate about processing instruction (PI) and its various components. In this article, we first describe what PI consists of and then address three questions: about the role of explicit information (EI) in PI, the difference between PI and teaching that incorporates production-based (PB) practice, and various…

  14. NREL Develops Accelerated Sample Activation Process for Hydrogen Storage Materials (Fact Sheet)

    SciTech Connect

    Not Available

    2010-12-01

    This fact sheet describes NREL's accomplishments in developing a new sample activation process that reduces the time to prepare samples for measurement of hydrogen storage from several days to five minutes and provides more uniform samples. Work was performed by NREL's Chemical and Materials Science Center.

  15. Streamlined acquisition handbook

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA has always placed great emphasis on the acquisition process, recognizing it as among its most important activities. This handbook is intended to facilitate the application of streamlined acquisition procedures. The development of these procedures reflects the efforts of an action group composed of NASA Headquarters and center acquisition professionals. The intent is to accomplish real change in the acquisition process as a result of this effort. An important part of streamlining the acquisition process is a commitment by the people involved to accomplishing acquisition activities quickly and with high quality. Too often, work continues to be accomplished in 'the same old way' without considering available alternatives that would require no changes to regulations, approvals from Headquarters, or waivers of required practice. Similarly, we must be sensitive to schedule opportunities throughout the acquisition cycle, not just once the purchase request arrives at the procurement office. Techniques that have been identified as ways of reducing acquisition lead time while maintaining high quality in the acquisition process are presented.

  16. Minimisation of instrumental noise in the acquisition of FT-NIR spectra of bread wheat using experimental design and signal processing techniques.

    PubMed

    Foca, G; Ferrari, C; Sinelli, N; Mariotti, M; Lucisano, M; Caramanico, R; Ulrici, A

    2011-02-01

    Spectral resolution (R) and number of repeated scans (S) have a significant effect on the S/N ratio of Fourier transform-near infrared (FT-NIR) spectra, but the optimal values of these two parameters have to be determined empirically for a specific problem, considering both the nature of the analysed matrix and the particular instrumental setup. To this end, the instrumental noise of replicated FT-NIR spectra of wheat samples was modelled as a function of R and S by means of a Doehlert design. The noise levels under the different experimental conditions were estimated by analysing the variance signals derived from replicate measurements with two signal processing tools, Savitzky-Golay (SG) filtering and the fast wavelet transform (FWT), in order to separate the "pure" instrumental noise from other variability sources, which are essentially connected to sample inhomogeneity. Results confirmed that the R and S values leading to minimum instrumental noise can vary considerably depending on the type of food matrix analysed and on the instrumental setup, and helped in the selection of the optimal measuring conditions for the subsequent acquisition of a wide spectral dataset.
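    The basic move behind one of the two signal processing tools named above, Savitzky-Golay filtering, is to smooth the spectrum and treat the residual as an estimate of instrumental noise. A minimal single-spectrum sketch (the paper works on variance signals from replicates, and its optimized window/order settings are not stated here; the values below are arbitrary):

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    # Synthetic NIR-like band plus white noise of known standard deviation.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 2000)
    clean = np.exp(-((x - 0.5) ** 2) / 0.02)        # one smooth absorption band
    sigma = 0.01
    noisy = clean + rng.normal(0, sigma, x.size)

    # SG filter: local polynomial fit in a sliding window. The residual between
    # the raw and smoothed spectrum is dominated by the instrumental noise.
    smoothed = savgol_filter(noisy, window_length=51, polyorder=3)
    noise_estimate = np.std(noisy - smoothed)        # should be close to sigma
    ```

    The window must be wide enough to average out noise but narrower than the spectral features; too wide a window would bleed band shape into the residual and inflate the noise estimate.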

  17. Processing Temporal Constraints and Some Implications for the Investigation of Second Language Sentence Processing and Acquisition. Commentary on Baggio

    ERIC Educational Resources Information Center

    Roberts, Leah

    2008-01-01

    Baggio presents the results of an event-related potential (ERP) study in which he examines the processing consequences of reading tense violations such as *"Afgelopen zondag lakt Vincent de kozijnen van zijn landhuis" (*"Last Sunday Vincent paints the window-frames of his country house"). The violation is arguably caused by a mismatch between the…

  18. Success or Failure of Automated Data Processing Systems in Physicians' Offices after System Acquisition

    PubMed Central

    Dahm, Lisa L.

    1983-01-01

    Although many sources exist for gleaning information relative to acquiring a data processing system, less material is available on the subject of what the purchaser may expect and must do following the sale. The ingredients for successfully automating a medical practice include: a good plan for the conversion and on-going use of the automated system; proper training initially and plans for future training should the need arise; proper physical facilities; and a positive and cooperative attitude.

  19. Improving the Enterprise Requirements and Acquisition Model’s Developmental Test and Evaluation Process Fidelity

    DTIC Science & Technology

    2014-03-27

    simulation model and observe system level impacts, before implementing the policy in reality, could be useful to DoD leadership. ERAM is not viewed…consequences. Another method is to utilize modeling and simulation. "A simulation is an abstraction of an operation in a real-world process or system…over time" (Banks, 2005: 3). Coupled with the computational capabilities of computers, modeling and simulation enables system analysis difficult, if

  20. Incorporating Operator Workload Issues and Concerns into the System Acquisition Process: A Pamphlet for Army Managers

    DTIC Science & Technology

    1990-09-01

    If OWL is not considered early and continuously during the design, development, and evaluation of a system, the Army will not know if the system makes…information relevant to soldier performance and reliability into the system development process.…Research Laboratory of the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) supports the Army with research and development on

  1. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

    A real time collection system was developed for the Urbana coherent scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented with a minicomputer. The software used to collect the data is described as well as the processing software used to analyze the data. In addition a magnetic tape format for coherent scatter data exchange is given.

  2. Determining the Value of Contractor Performance Assessment Reporting System (CPARS) Narratives for the Acquisition Process

    DTIC Science & Technology

    2014-06-01

    sweep floors daily, mop floors with soap and water one time per week, wax floors one time per quarter, remove trash from garbage bins every Thursday…regarding the process aim to improve the CPARS report card product, which should lead to greater and more effective utilization of the CPARS system…2014, Part 2.101b). The basic definition is that the government has a need and fulfills that need by purchasing a product or service through a

  3. Assessment of language acquisition.

    PubMed

    de Villiers, Peter A; de Villiers, Jill G

    2010-03-01

    This review addresses questions of what should be assessed in language acquisition, and how to do it. The design of a language assessment is crucially connected to its purpose, whether for diagnosis, development of an intervention plan, or for research. Precise profiles of language strengths and weaknesses are required for clear definitions of the phenotypes of particular language and neurodevelopmental disorders. The benefits and costs of formal tests versus language sampling assessments are reviewed. Content validity, theoretically and empirically grounded in child language acquisition, is claimed to be centrally important for appropriate assessment. Without this grounding, links between phenomena can be missed, and interpretations of underlying difficulties can be compromised. Sensitivity and specificity of assessment instruments are often assessed using a gold standard of existing tests and diagnostic practices, but problems arise if that standard is biased against particular groups or dialects. The paper addresses the issues raised by the goal of unbiased assessment of children from diverse linguistic and cultural backgrounds, especially speakers of non-mainstream dialects or bilingual children. A variety of new approaches are discussed for language assessment, including dynamic assessment, experimental tools such as intermodal preferential looking, and training studies that assess generalization. Stress is placed on the need for measures of the process of acquisition rather than just levels of achievement. Copyright © 2010 John Wiley & Sons, Ltd.

  4. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon.

    PubMed

    Lieberman, Amy M; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I

    2015-07-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1), and in deaf adults who were late-learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sublexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received.

  5. The effects of an action's "age-of-acquisition" on action-sentence processing.

    PubMed

    Gilead, Michael; Liberman, Nira; Maril, Anat

    2016-11-01

    How does our brain allow us to comprehend abstract/symbolic descriptions of human action? Whereas past research suggested that processing action language relies on sensorimotor brain regions, recent work suggests that sensorimotor activation depends on participants' task goals, such that focusing on abstract (vs. concrete) aspects of an action activates "default mode network" (rather than sensorimotor) regions. Following a Piagetian framework, we hypothesized that for actions acquired at an age at which abstract/symbolic cognition is fully developed, participants should retrieve abstract-symbolic mental representations even when focusing on the concrete aspects of an action. In two studies, participants processed the concrete (i.e., "how") and abstract (i.e., "why") aspects of late-acquired and early-acquired actions. Consistent with previous research, focusing on the abstract (vs. concrete) aspects of an action resulted in greater activation in the "default mode network". Importantly, activation in these regions was higher when processing later-acquired (vs. earlier-acquired) actions, even when participants' goal was to focus on the concrete aspects of the action. We discuss the implications of the current findings for research on the involvement of concrete representations in abstract cognition.

  6. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  7. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    NASA Astrophysics Data System (ADS)

    Li, Y. T.; Wittenberg, L. J.

    1992-09-01

    In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3 the utilization of multiple-miners is recommended rather than increasing their size. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  8. Age of second language acquisition affects nonverbal conflict processing in children: an fMRI study

    PubMed Central

    Mohades, Seyede Ghazal; Struys, Esli; Van Schuerbeek, Peter; Baeken, Chris; Van De Craen, Piet; Luypaert, Robert

    2014-01-01

    Background In their daily communication, bilinguals switch between two languages, a process that involves the selection of a target language and minimization of interference from a nontarget language. Previous studies have uncovered the neural structure in bilinguals and the activation patterns associated with performing verbal conflict tasks. One question that remains, however, is whether this extra verbal switching affects brain function during nonverbal conflict tasks. Methods In this study, we used fMRI to investigate the impact of bilingualism in children performing two nonverbal tasks involving stimulus–stimulus and stimulus–response conflicts. Three groups of 8–11-year-old children – bilinguals from birth (2L1), second language learners (L2L), and a control group of monolinguals (1L1) – were scanned while performing a color Simon and a numerical Stroop task. Reaction times and accuracy were logged. Results Compared to monolingual controls, bilingual children showed a higher behavioral congruency effect in these tasks, matched by the recruitment of brain regions generally used in cognitive control, language processing, or the resolution of language conflict in bilinguals (caudate nucleus, posterior cingulate gyrus, STG, precuneus). Further, activation of these areas was higher in 2L1 than in L2L children. Conclusion The coupling of longer reaction times with the recruitment of extra language-related brain areas supports the hypothesis that, when dealing with language conflicts, the specialization of bilinguals hampers the way they process nonverbal conflicts, at least at early stages in life. PMID:25328840

  10. A computer based wireless system for online acquisition, monitoring and digital processing of ECG waveforms.

    PubMed

    Bansal, Dipali; Khan, Munna; Salhan, Ashok K

    2009-04-01

    Various ECG instruments have addressed a wide variety of clinical and technical issues. However, there is still scope for improvement, particularly in their susceptibility to noise, lack of universal connectivity, and reliance on off-line processing. A prototype system has been developed that addresses these limitations. It includes an analog system and an FM transceiver pair interfaced through the sound port of a computer. The data acquired in real time are viewed and filtered using MATLAB software. The ECG system described captures the bio-signal faithfully in real-time wireless mode with minimum noise and has universal connectivity.
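    The abstract describes filtering the acquired ECG in MATLAB but gives no detail; the sketch below is an illustrative, assumption-laden stand-in (not the authors' code) for the two filtering steps such a chain typically needs: removing slow baseline wander and smoothing high-frequency noise. Function names and the filter coefficients are illustrative choices.

```python
# Hedged sketch of a typical ECG cleaning chain (NOT the paper's MATLAB
# implementation): a single-pole IIR high-pass removes baseline wander,
# then a short moving average suppresses high-frequency noise.

def highpass(signal, alpha=0.995):
    """First-order IIR high-pass: y[n] = alpha*(y[n-1] + x[n] - x[n-1])."""
    out, y, x_prev = [], 0.0, signal[0]
    for x in signal:
        y = alpha * (y + x - x_prev)
        x_prev = x
        out.append(y)
    return out

def moving_average(signal, window=5):
    """Simple FIR low-pass: mean over a sliding window."""
    pad = [signal[0]] * (window - 1)
    padded = pad + list(signal)
    return [sum(padded[i:i + window]) / window for i in range(len(signal))]

def clean_ecg(raw):
    """Apply baseline removal, then smoothing."""
    return moving_average(highpass(raw))
```

    A constant (pure baseline) input is driven to zero by the high-pass stage, while the moving average leaves a steady level unchanged, which is the division of labour the sketch assumes.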

  11. Technical drilling data acquisition and processing with an integrated computer system

    SciTech Connect

    Chevallier, J.J.; Quetier, F.P.; Marshall, D.W.

    1986-04-01

    Sedco Forex has developed an integrated computer system to enhance the technical performance of the company at various operational levels and to increase the understanding and knowledge of the drill crews. This paper describes the system and how it is used for recording and processing drilling data at the rig site, for associated technical analyses, and for well design, planning, and drilling performance studies at the operational centers. Some capabilities related to the statistical analysis of the company's operational records are also described, and future development of rig computing systems for drilling applications and management tasks is discussed.

  12. Acquisition of high-resolution 3D data and processing using Artificial Intelligence

    NASA Astrophysics Data System (ADS)

    Meng, Hui; Sheng, J.; Yang, W.; Pu, Y.

    1996-11-01

    Holographic PIV (HPIV) is a promising 3D velocity field measurement technique providing the high spatial-temporal resolution needed for understanding complex and turbulent flows. An HPIV system combining in-line recording and off-axis viewing (IROV) holography with a Heuristic Morphology Particle Pairing (HMPP) method is being developed in this work. Unlike 2D PIV, HPIV instantaneously records a volume of particle images through holographic imaging. Its data processing involves special difficulties such as speckle noise, sparse pairs, and large data sets. The HMPP algorithm is an adaptive parallel processing scheme applying artificial intelligence search theory. Based on the similar morphology of a particle group at successive instants separated by a small interval, HMPP matches a group of particle images between double exposures and provides velocity vectors for individual particle pairs, yielding much higher spatial resolution than conventional correlation algorithms and lower measurement error caused by large velocity gradients. Taking advantage of IROV and HMPP, the system being developed appears highly promising as a practical HPIV configuration.
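    HMPP itself adds group-morphology heuristics and AI search that the abstract does not specify; as a point of reference, the baseline it improves on can be sketched as greedy nearest-neighbour pairing of particles between the two exposures, yielding one velocity vector per matched pair. This is an illustrative baseline only, not the HMPP algorithm.

```python
import math

# Illustrative baseline for particle pairing between double exposures
# (NOT the HMPP algorithm): match each particle in frame 1 to its nearest
# unclaimed neighbour in frame 2 within a search radius, and emit one
# velocity vector per matched pair.

def pair_particles(frame1, frame2, max_dist, dt=1.0):
    """frame1, frame2: lists of (x, y) positions; returns (vx, vy) pairs."""
    used = set()
    vectors = []
    for p in frame1:
        best, best_d = None, max_dist
        for j, q in enumerate(frame2):
            if j in used:
                continue
            d = math.dist(p, q)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            q = frame2[best]
            vectors.append(((q[0] - p[0]) / dt, (q[1] - p[1]) / dt))
    return vectors
```

    Greedy pairing fails when displacements approach the inter-particle spacing, which is exactly where the morphology-based group matching described in the abstract is claimed to help.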

  13. PCI data acquisition and signal processing hardware modules for long pulse operation

    SciTech Connect

    Sousa, J.; Batista, A.J.N.; Combo, A.; Pereira, R.; Correia, Miguel; Cruz, N.; Carvalho, P.; Correia, Carlos; Varandas, C.A.F.

    2004-10-01

    A set of PCI instrumentation modules was developed at the EURATOM/IST Association. The modules were engineered around a reconfigurable hardware core which reduces the development time of instruments for new applications, provides support for long-duration or even continuous operation, and is able to perform real-time digital signal processing. The core was engineered at low cost, and the modules incorporate a high number of channels, which contributes to reducing the total cost per channel. Field results meet expectations in terms of performance, both in data throughput and input characteristics. Currently implemented are a 2 MSPS, 14-bit, eight-channel galvanically isolated transient recorder; a 200 MSPS, 8-bit, four-channel pulse digitizer; an eight-channel time-to-digital converter with a resolution of 0.4 ns; and a reconfigurable, hardware-expandable board.

  14. IECON '87: Signal acquisition and processing; Proceedings of the 1987 International Conference on Industrial Electronics, Control, and Instrumentation, Cambridge, MA, Nov. 3, 4, 1987

    NASA Astrophysics Data System (ADS)

    Niederjohn, Russell J.

    1987-01-01

    Theoretical and applications aspects of signal processing are examined in reviews and reports. Topics discussed include speech processing methods, algorithms, and architectures; signal-processing applications in motor and power control; digital signal processing; signal acquisition and analysis; and processing algorithms and applications. Consideration is given to digital coding of speech algorithms, an algorithm for continuous-time processes in discrete-time measurement, quantization noise and filtering schemes for digital control systems, distributed data acquisition for biomechanics research, a microcomputer-based differential distance and velocity measurement system, velocity observations from discrete position encoders, a real-time hardware image preprocessor, and recognition of partially occluded objects by a knowledge-based system.

  15. The Acquisition of Cultural Competence: A Phenomenological Inquiry Highlighting the Processes, Challenges and Triumphs of Counselor Education Graduate Students

    ERIC Educational Resources Information Center

    Garner, Douglas L.

    2012-01-01

    Although research has effectively isolated and identified the key characteristics of a culturally competent counselor, there are few studies regarding the acquisition of these characteristics. To close the gap between theory and practice, studies are needed researching the emergence and acquisition of these characteristics. This study explores…

  16. Acquisition and Analysis of Dynamic Responses of a Historic Pedestrian Bridge using Video Image Processing

    NASA Astrophysics Data System (ADS)

    O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; O'Donnell, Deirdre; Wright, Robert; Pakrashi, Vikram

    2015-07-01

    Video based tracking is capable of analysing bridge vibrations that are characterised by large amplitudes and low frequencies. This paper presents the use of video images and associated image processing techniques to obtain the dynamic response of a pedestrian suspension bridge in Cork, Ireland. This historic structure is one of the four suspension bridges in Ireland and is notable for its dynamic nature. A video camera is mounted on the river-bank and the dynamic responses of the bridge have been measured from the video images. The dynamic response is assessed without the need for a reflector on the bridge and in the presence of various forms of luminous complexity in the video image scenes. Vertical deformations of the bridge were measured in this regard. The video image tracking for the measurement of dynamic responses of the bridge was based on correlating patches in time-lagged scenes in video images and utilising a zero mean normalised cross correlation (ZNCC) metric. The bridge was excited by designed pedestrian movement and by individual cyclists traversing the bridge. The time series data of dynamic displacement responses of the bridge were analysed to obtain the frequency domain response. Frequencies obtained from video analysis were checked against accelerometer data from the bridge obtained while carrying out the same set of experiments used for video image based recognition.
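    The ZNCC metric named in the abstract is a standard, well-defined similarity measure; a minimal sketch for two equally sized flattened patches follows. The function name is an illustrative choice.

```python
import math

# Zero-mean normalised cross-correlation (ZNCC), the patch-matching
# metric named in the paper: +1 for a perfect match, -1 for a perfect
# inversion, near 0 for uncorrelated patches.  ZNCC is invariant to
# affine brightness changes (gain and offset) between the two patches.

def zncc(patch_a, patch_b):
    """patch_a, patch_b: equal-length flat lists of pixel intensities."""
    n = len(patch_a)
    mean_a = sum(patch_a) / n
    mean_b = sum(patch_b) / n
    da = [a - mean_a for a in patch_a]
    db = [b - mean_b for b in patch_b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den if den else 0.0
```

    The brightness invariance is what makes ZNCC attractive under the "luminous complexity" the abstract mentions: a patch rescaled by ambient lighting still scores 1.0 against the original.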

  18. Acquisition and processing pitfall with clipped traces in surface-wave analysis

    NASA Astrophysics Data System (ADS)

    Gao, Lingli; Pan, Yudi

    2016-02-01

    Multichannel analysis of surface waves (MASW) is widely used in estimating near-surface shear (S)-wave velocity. In the MASW method, generating a reliable dispersion image in the frequency-velocity (f-v) domain is an important processing step. A locus along peaks of dispersion energy at different frequencies allows the dispersion curves to be constructed for inversion. When the offsets are short, the recorded seismic data may exceed the dynamic range of the geophones/seismograph, and as a result, peaks and/or troughs of traces will be squared off in recorded shot gathers. Dispersion images generated from raw shot gathers with clipped traces are contaminated by artifacts, which might be misidentified as Rayleigh-wave phase velocities or body-wave velocities and potentially lead to incorrect results. We simulated several synthetic models containing clipped traces and analyzed the amplitude spectra of unclipped and clipped waves. The results indicate that artifacts in the dispersion image depend on the level of clipping. A real-world example also shows how clipped traces affect the dispersion image. All the results suggest that clipped traces should be removed from the shot gathers before generating dispersion images, in order to pick accurate phase velocities and set reasonable initial inversion models.
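    The paper's recommendation to remove clipped traces before dispersion imaging implies a simple pre-processing check. The sketch below flags traces with runs of samples pinned at the recorder's full-scale value (the flat-topped peaks the abstract describes); the run-length threshold and function names are illustrative assumptions, not from the paper.

```python
# Hedged sketch of a clipping screen implied by the paper's
# recommendation: a trace is flagged as clipped when it contains a run
# of samples pinned at the recording system's full-scale value
# (flat-topped peaks or troughs), and flagged traces are dropped from
# the shot gather before dispersion imaging.

def is_clipped(trace, full_scale, min_run=3):
    """True if the trace has >= min_run consecutive full-scale samples."""
    run = 0
    for s in trace:
        if abs(s) >= full_scale:
            run += 1
            if run >= min_run:
                return True
        else:
            run = 0
    return False

def remove_clipped(gather, full_scale):
    """Keep only traces that pass the clipping screen."""
    return [t for t in gather if not is_clipped(t, full_scale)]
```

    Requiring a short run, rather than a single full-scale sample, avoids discarding traces that merely graze the converter's limit.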

  19. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step, multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. Throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that utilize extensive laboratory automation for sample preparation and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) was designed, implemented, and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOPs) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. The efforts described in this paper are intended to bridge the current technological gaps and enhance the potential application of DCA for dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards critically needed increases in throughput.

  20. Pre-PCR processing: strategies to generate PCR-compatible samples.

    PubMed

    Rådström, Peter; Knutsson, Rickard; Wolffs, Petra; Lövenklev, Maria; Löfström, Charlotta

    2004-02-01

    Polymerase chain reaction (PCR) is recognized as a rapid, sensitive, and specific molecular diagnostic tool for the analysis of nucleic acids. However, the sensitivity and kinetics of diagnostic PCR may be dramatically reduced when applied directly to biological samples, such as blood and feces, owing to PCR-inhibitory components. As a result, pre-PCR processing procedures have been developed to remove or reduce the effects of PCR inhibitors. Pre-PCR processing comprises all steps prior to the detection of PCR products, that is, sampling, sample preparation, and deoxyribonucleic acid (DNA) amplification. The aim of pre-PCR processing is to convert a complex biological sample with its target nucleic acids/cells into PCR-amplifiable samples by combining sample preparation and amplification conditions. Several different pre-PCR processing strategies are used: (1) optimization of the DNA amplification conditions by the use of alternative DNA polymerases and/or amplification facilitators, (2) optimization of the sample preparation method, (3) optimization of the sampling method, and (4) combinations of the different strategies. This review describes different pre-PCR processing strategies to circumvent PCR inhibition to allow accurate and precise DNA amplification.

  1. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  2. Improving DoD Energy Efficiency: Combining MMOWGLI Social-Media Brainstorming with Lexical Link Analysis (LLA) to Strengthen the Defense Acquisition Process

    DTIC Science & Technology

    2013-04-01

    …from MIT and co-founded Quantum Intelligence, Inc. She has been a principal investigator (PI) for six DoD Small Business Innovation Research (SBIR)… consciously and consistently. Following a series of deliberate experiments, long-term procedural improvements to the formal milestone acquisition process…

  3. Since When or How Often? Dissociating the Roles of Age of Acquisition (AoA) and Lexical Frequency in Early Visual Word Processing

    ERIC Educational Resources Information Center

    Adorni, Roberta; Manfredi, Mirella; Proverbio, Alice Mado

    2013-01-01

    The aim of the study was to investigate the effect of both word age of acquisition (AoA) and frequency of occurrence on the timing and topographical distribution of ERP components. The processing of early- versus late-acquired words was compared with that of high-frequency versus low-frequency words. Participants were asked to perform an…

  4. Method of evaluation of process of red blood cell sedimentation based on photometry of droplet samples.

    PubMed

    Aristov, Alexander; Nosova, Ekaterina

    2017-04-01

    The paper focuses on research aimed at creating and testing a new approach to evaluating the processes of aggregation and sedimentation of red blood cells for use in clinical laboratory diagnostics. The proposed method is based on photometric analysis of a blood sample formed as a sessile drop. The results of clinical evaluation of this method are given in the paper. Analysis of the processes occurring in the sessile-drop sample during blood cell sedimentation is described. The results of experimental studies evaluating the effect of the droplet sample's focusing properties on light transmittance are presented. It is shown that this method significantly reduces the sample volume and provides sufficiently high sensitivity to the studied processes.

  5. Modular tube/plate-based sample management: a business model optimized for scalable storage and processing.

    PubMed

    Fillers, W Steven

    2004-12-01

    Modular approaches to sample management allow staged implementation and progressive expansion of libraries within existing laboratory space. A completely integrated, inert-atmosphere system for the storage and processing of a variety of microplate and microtube formats is currently available as an integrated series of individual modules. Liquid handling for reformatting and replication into microplates, plus high-capacity cherry picking, can be performed within the inert environmental envelope to maximize compound integrity. Complete process automation provides on-demand access to samples and improved process control. Expansion of such a system provides a low-risk tactic for implementing a large-scale storage and processing system.

  6. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 6 2012-07-01 2012-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA...

  7. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 6 2014-07-01 2014-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA...

  8. 32 CFR 806.27 - Samples of Air Force FOIA processing documents.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 6 2013-07-01 2013-07-01 false Samples of Air Force FOIA processing documents. 806.27 Section 806.27 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR FORCE ADMINISTRATION AIR FORCE FREEDOM OF INFORMATION ACT PROGRAM § 806.27 Samples of Air Force FOIA...

  9. Integrated Processing of High Resolution Topographic Data for Soil Erosion Assessment Considering Data Acquisition Schemes and Surface Properties

    NASA Astrophysics Data System (ADS)

    Eltner, A.; Schneider, D.; Maas, H.-G.

    2016-06-01

    Soil erosion is a decisive earth surface process strongly influencing the fertility of arable land. Several options exist to detect soil erosion at the scale of large field plots (here 600 m²), each with different advantages and disadvantages depending on the applied method. In this study, the benefits of unmanned aerial vehicle (UAV) photogrammetry and terrestrial laser scanning (TLS) are exploited to quantify soil surface changes. Before data combination, the TLS data are co-registered to the DEMs generated with UAV photogrammetry. TLS data are used to detect global as well as local errors in the DEMs calculated from UAV images. Additionally, TLS data are considered for vegetation filtering. Complementarily, DEMs from UAV photogrammetry are utilised to detect systematic TLS errors and to further filter TLS point clouds with regard to unfavourable scan geometry (i.e. incidence angle and footprint) on gentle hillslopes. In addition, surface roughness is integrated as an important parameter to evaluate TLS point reliability, because the footprint, and thus the area of signal reflection, increases with distance to the scanning device. The developed fusion tool allows for the estimation of reliable data points from each data source, considering the data acquisition geometry and surface properties, and finally merges both data sets into a single soil surface model. Data fusion is performed for three different field campaigns at a Mediterranean field plot. Successive DEM evaluation reveals a continuous decrease of soil surface roughness, the reappearance of former wheel tracks, and local soil particle relocation patterns.
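    The geometric reliability screening described for the TLS points can be sketched from first principles: a point's laser footprint grows with range and with grazing incidence, so points whose footprint exceeds a tolerance can be discarded before fusion. The footprint formula, beam divergence value, and threshold below are stated assumptions, not the authors' exact implementation.

```python
import math

# Illustrative sketch (assumptions, not the paper's code) of TLS point
# screening by scan geometry: footprint major axis grows with range and
# with the secant of the incidence angle, so distant, grazing-incidence
# points on gentle hillslopes are the first to be rejected.

def footprint_diameter(range_m, incidence_rad, beam_divergence_rad=0.0003):
    """Approximate footprint major axis on the surface, in metres."""
    return range_m * beam_divergence_rad / math.cos(incidence_rad)

def reliable_points(points, max_footprint):
    """points: iterable of (x, y, z, range_m, incidence_rad) tuples."""
    return [p for p in points
            if footprint_diameter(p[3], p[4]) <= max_footprint]
```

    Tying `max_footprint` to local surface roughness, as the abstract suggests, would make the threshold adaptive rather than the fixed value shown here.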

  10. A new and practical method to obtain grain size measurements in sandy shores based on digital image acquisition and processing

    NASA Astrophysics Data System (ADS)

    Baptista, P.; Cunha, T. R.; Gama, C.; Bernardes, C.

    2012-12-01

    Modern methods for the automated evaluation of sediment size in sandy shores rely on digital image processing algorithms as an alternative to time-consuming traditional sieving methodologies. However, the requirements necessary to guarantee that the considered image processing algorithm has a good grain identification success rate impose the need for dedicated hardware setups to capture the sand surface images. Examples are specially designed camera housings that maintain a constant distance between the camera lens and the sand surface, tripods to fix the camera angle orthogonal to the sand surface, external illumination systems that guarantee the light level necessary for the image processing algorithms, and special lenses and focusing systems for close-proximity image capture. In some cases, controlled image-capturing conditions can make the fieldwork more laborious, which incurs significant costs for monitoring campaigns covering large areas. To circumvent this problem, a new automated image-processing algorithm is proposed that identifies sand grains in digital images acquired with a standard digital camera without any extra hardware attached to it. The accuracy and robustness of the proposed algorithm are evaluated in this work by means of a laboratory test on previously controlled grain samples; field tests where 64 samples (spread over a beach stretch of 65 km and with grain size ranging from 0.5 mm to 1.9 mm) were processed both by the proposed method and by sieving; and finally by manual point counts on all acquired images. The calculated root-mean-square (RMS) error between mean grain sizes obtained from the proposed image processing method and the sieve method (for the 64 samples) was 0.33 mm; for the image processing method versus manual point counts, with the same images, it was 0.12 mm. The achieved correlation coefficients (r) were 0.91 and 0.96, respectively.
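    The agreement statistics quoted in the abstract (RMS error and the correlation coefficient r between paired mean grain sizes) follow standard formulas; a minimal sketch computing both for two paired lists of grain sizes in millimetres follows. Any sample values used with it are illustrative, not the study's data.

```python
import math

# Standard agreement statistics of the kind reported in the abstract:
# root-mean-square error and Pearson's r between paired mean grain
# sizes (e.g. image-based vs sieve-based, in mm).

def rmse(a, b):
    """Root-mean-square error between paired measurements."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def pearson_r(a, b):
    """Pearson correlation coefficient between paired measurements."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den
```

    Reporting both is informative: a high r with a nonzero RMSE, as in the abstract, indicates the two methods rank samples consistently while differing by a systematic or scatter error in absolute size.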

  11. Simultaneous acquisition of 2D and 3D solid-state NMR experiments for sequential assignment of oriented membrane protein samples.

    PubMed

    Gopinath, T; Mote, Kaustubh R; Veglia, Gianluigi

    2015-05-01

    We present a new method called DAISY (Dual Acquisition orIented ssNMR spectroScopY) for the simultaneous acquisition of 2D and 3D oriented solid-state NMR experiments for membrane proteins reconstituted in mechanically or magnetically aligned lipid bilayers. DAISY utilizes dual acquisition of sine and cosine dipolar or chemical shift coherences and long-lived (15)N longitudinal polarization to obtain two multi-dimensional spectra simultaneously. In these new experiments, the first acquisition gives the polarization inversion spin exchange at the magic angle (PISEMA) or heteronuclear correlation (HETCOR) spectra, and the second acquisition gives PISEMA-mixing or HETCOR-mixing spectra, where the mixing element enables inter-residue correlations through (15)N-(15)N homonuclear polarization transfer. The analysis of the two 2D spectra (first and second acquisitions) enables one to distinguish (15)N-(15)N inter-residue correlations for sequential assignment of membrane proteins. DAISY can be implemented in 3D experiments that include the polarization inversion spin exchange at magic angle via I spin coherence (PISEMAI) sequence, as we show for the simultaneous acquisition of 3D PISEMAI-HETCOR and 3D PISEMAI-HETCOR-mixing experiments.

  12. 48 CFR 873.105 - Acquisition planning.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Acquisition planning. 873.105 Section 873.105 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS DEPARTMENT... planning. (a) Acquisition planning is an indispensable component of the total acquisition process. (b)...

  13. Department of Defense and Service Requirements for Human Factors R&D in the Military System Acquisition Process

    DTIC Science & Technology

    1980-07-01

    …alternative programs, including anticipated performance information. - A summary of the acquisition strategy. - Short- and long-term business planning information… Short- and long-term business planning effectively supports the acquisition strategy. - Producibility and areas of production risks have been adequately… and is being executed in the conduct of program management. - Short- and long-term business planning supports the strategy. Contract types are…

  14. Unexpected toxicity to aquatic organisms of some aqueous bisphenol A samples treated by advanced oxidation processes.

    PubMed

    Tišler, Tatjana; Erjavec, Boštjan; Kaplan, Renata; Şenilă, Marin; Pintar, Albin

    2015-01-01

    In this study, photocatalytic and catalytic wet-air oxidation (CWAO) processes were used to examine the removal efficiency of bisphenol A (BPA) from aqueous samples over several titanate nanotube-based catalysts. Unexpected toxicity to some test species was determined for BPA samples treated by means of the CWAO process. In addition, the CWAO effluent was recycled five- or 10-fold in order to increase the number of interactions between the liquid phase and catalyst. Consequently, inductively coupled plasma mass spectrometry (ICP-MS) analysis indicated higher concentrations of some toxic metals, such as chromium, nickel, molybdenum, silver, and zinc, in the recycled samples in comparison to both the single-pass sample and the photocatalytically treated solution. The highest toxicity of five- and 10-fold recycled solutions in the CWAO process was observed in water fleas, which could be correlated with the high concentrations of chromium, nickel, and silver detected in the tested samples. The obtained results clearly demonstrate that aqueous samples treated by advanced oxidation processes should always be analyzed using (i) chemical analyses to assess removal of BPA and total organic carbon from treated aqueous samples, as well as (ii) a battery of aquatic organisms from different taxonomic groups to determine possible toxicity.

  15. Capillary absorption spectrometer and process for isotopic analysis of small samples

    SciTech Connect

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2016-03-29

    A capillary absorption spectrometer and process are described that provide highly sensitive and accurate stable-isotope absorption measurements of analytes in a sample gas, including isotopologues of carbon and oxygen obtained from gas and biological samples. The approach further provides isotopic images of microbial communities that allow tracking of nutrients at the single-cell level. It targets naturally occurring variations in carbon and oxygen isotopes, which avoids the need for expensive isotopically labeled mixtures and allows study of samples taken from the field without modification. The method also permits in vivo sampling, enabling real-time ambient studies of microbial communities.

  16. Three-Step Validation of Exercise Behavior Processes of Change in an Adolescent Sample

    ERIC Educational Resources Information Center

    Rhodes, Ryan E.; Berry, Tanya; Naylor, Patti-Jean; Higgins, S. Joan Wharf

    2004-01-01

    Though the processes of change are conceived as the core constructs of the transtheoretical model (TTM), few researchers have examined their construct validity in the physical activity domain. Further, only 1 study was designed to investigate the processes of change in an adolescent sample. The purpose of this study was to examine the exercise…

  17. Effect of histologic processing on dimensions of skin samples obtained from cat cadavers.

    PubMed

    Jeyakumar, Sakthila; Smith, Annette N; Schleis, Stephanie E; Cattley, Russell C; Tillson, D Michael; Henderson, Ralph A

    2015-11-01

    OBJECTIVE To determine changes in dimensions of feline skin samples as a result of histologic processing and to identify factors that contributed to changes in dimensions of skin samples after sample collection. SAMPLE Cadavers of 12 clinically normal cats. PROCEDURES Skin samples were obtained bilaterally from 3 locations (neck, thorax, and tibia) of each cadaver; half of the thoracic samples included underlying muscle. Length, width, and depth were measured at 5 time points (before excision, after excision, after application of ink to mark tissue margins, after fixation in neutral-buffered 10% formalin for 36 hours, and after completion of histologic processing and staining with H&E stain). Measurements obtained after sample collection were compared with measurements obtained before excision. RESULTS At the final time point, tissue samples had decreased in length (mean decrease, 32.40%) and width (mean decrease, 34.21%) and increased in depth (mean increase, 54.95%). Tissue from the tibia had the most shrinkage in length and width and that from the neck had the least shrinkage. Inclusion of underlying muscle on thoracic skin samples did not affect the degree of change in dimensions. CONCLUSIONS AND CLINICAL RELEVANCE In this study, each step during processing from excision to formalin fixation and histologic processing induced changes in tissue dimensions, which were manifested principally as shrinkage in length and width and an increase in depth. Most of the changes occurred during histologic processing. Inclusion of muscle did not affect thoracic skin shrinkage. Shrinkage should be a consideration when interpreting surgical margins in clinical cases.
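    The percentage changes reported in the abstract follow the usual relative-change formula against the pre-excision measurement. The helpers below reproduce that arithmetic; the numeric values in any example are illustrative, not the study's raw data.

```python
# Relative-change arithmetic of the kind reported in the abstract
# (e.g. mean length decrease 32.40%, depth increase 54.95%), computed
# against the pre-excision measurement.

def percent_change(before, after):
    """Positive = increase, negative = shrinkage, in percent."""
    return (after - before) / before * 100.0

def apply_shrinkage(length_mm, shrink_pct):
    """Dimension remaining after a given percent shrinkage."""
    return length_mm * (1.0 - shrink_pct / 100.0)
```

    Read in reverse, `apply_shrinkage` is the correction relevant to the clinical point: a surgical margin measured on the processed slide understates the margin excised in situ.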

  18. The Importance of Sample Processing in Analysis of Asbestos Content in Rocks and Soils

    NASA Astrophysics Data System (ADS)

    Neumann, R. D.; Wright, J.

    2012-12-01

    Analysis of asbestos content in rocks and soils using Air Resources Board (ARB) Test Method 435 (M435) involves the processing of samples for subsequent analysis by polarized light microscopy (PLM). The use of different equipment and procedures by commercial laboratories to pulverize rock and soil samples could result in different particle size distributions. It has long been theorized that asbestos-containing samples can be over-pulverized to the point where the particle dimensions of the asbestos no longer meet the required 3:1 length-to-width aspect ratio or the particles become so small that they no longer can be tested for optical characteristics using PLM, where maximum PLM magnification is typically 400X. Recent work has shed some light on this issue. ARB staff conducted an interlaboratory study to investigate variability in preparation and analytical procedures used by laboratories performing M435 analysis. With regard to sample processing, ARB staff found that different pulverization equipment and processing procedures produced powders that have varying particle size distributions. PLM analysis of the finest powders produced by one laboratory showed all but one of the 12 samples were non-detect or below the PLM reporting limit; in contrast, of the other 36 coarser samples from the same field sample processed by three other laboratories, 21 samples were above the reporting limit. The set of 12 exceptionally fine powder samples produced by the same laboratory was re-analyzed by transmission electron microscopy (TEM) and results showed that these samples contained asbestos above the TEM reporting limit. However, the use of TEM as a stand-alone analytical procedure, usually performed at magnifications between 3,000 to 20,000X, also has its drawbacks because of the miniscule mass of sample that this method examines. The small amount of powder analyzed by TEM may not be representative of the field sample. The actual mass of the sample powder analyzed by…

  19. Down sampled signal processing for a B Factory bunch-by-bunch feedback system

    SciTech Connect

    Hindi, H.; Hosseini, W.; Briggs, D.; Fox, J.; Hutton, A.

    1992-03-01

    A bunch-by-bunch feedback scheme is studied for damping coupled-bunch synchrotron oscillations in the proposed PEP II B Factory. The quasi-linear feedback system design incorporates a phase detector to provide a quantized measure of bunch phase, digital signal processing to compute an error correction signal, and a kicker system to correct the energy of the bunches. A farm of digital processors, operating in parallel, is proposed to compute correction signals for the 1658 bunches of the B Factory. This paper studies the use of down sampled processing to reduce the computational complexity of the feedback system. We present simulation results showing the effect of down sampling on beam dynamics. Results show that down sampled processing can reduce the scale of the processing task by a factor of 10.
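The down-sampling idea in this record can be sketched in a few lines. This is an illustrative sketch only, not the PEP II implementation: the synthetic oscillation, toy filter, and gain below are invented for the example.

```python
import numpy as np

# Illustrative sketch of down-sampled feedback processing (not the actual
# PEP II design): the synchrotron oscillation is slow compared to the turn
# rate, so the error signal can be computed on every Nth turn and held,
# cutting the per-turn processing load by roughly a factor of N.
N = 10                      # down-sampling factor (hypothetical)
turns = 1000
phase = np.sin(2 * np.pi * 0.01 * np.arange(turns))  # synthetic bunch phase

downsampled = phase[::N]    # process 1/N of the samples
gain = 0.1
correction = -gain * np.gradient(downsampled)        # toy damping filter

# zero-order hold: apply each computed correction for N consecutive turns
kick = np.repeat(correction, N)[:turns]
```

The computational saving comes directly from `downsampled` having one tenth the length of `phase`, at the cost of a coarser (held) correction signal.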

  20. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY'S (DWPF) PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 7A QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Edwards, T.; Jones, M.; Wiedenman, B.

    2011-03-14

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO{sub 3} acid dissolution (i.e., DWPF Cold Chem Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF), and DWPF Cold Chem (CC) method digestions of Sludge Batch 7a (SB7a) SRAT Receipt and SB7a SRAT Product samples. The SB7a SRAT Receipt and SB7a SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB7a Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 6 (SB6), to form the SB7a Blend composition.

  1. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally, acquisitions initiatives of the NASA STI Program were presented.

  2. Technical Note: Sampling and processing of mesocosm sediment trap material for quantitative biogeochemical analysis

    NASA Astrophysics Data System (ADS)

    Boxhammer, T.; Bach, L. T.; Czerny, J.; Riebesell, U.

    2015-11-01

    Sediment traps are the most common tool to investigate vertical particle flux in the marine realm. However, the spatial decoupling between particle formation and collection often handicaps reconciliation of these two processes, even within the euphotic zone. Pelagic mesocosms have the advantage of being closed systems and are therefore ideally suited to study how processes in natural plankton communities influence particle formation and settling in the ocean's surface. We therefore developed a protocol for efficient sample recovery and processing of quantitatively collected pelagic mesocosm sediment trap samples. Sedimented material was recovered by pumping it under gentle vacuum through a silicon tube to the sea surface. The particulate matter of these samples was subsequently concentrated by passive settling, centrifugation, or flocculation with ferric chloride, and we discuss the advantages of each approach. After concentration, samples were freeze-dried and ground with an easy-to-adapt procedure using standard lab equipment. Grain size of the finely ground samples ranges from fine to coarse silt (2-63 μm), which guarantees the homogeneity needed for representative subsampling, a widespread problem in sediment trap research. Subsamples of the ground material were perfectly suitable for a variety of biogeochemical measurements, and even at very low particle fluxes we were able to gain detailed insight into various parameters characterizing the sinking particles. The methods and recommendations described here are a key improvement for sediment trap applications in mesocosms, as they facilitate processing of large amounts of samples and allow for high-quality biogeochemical flux data.

  3. DigiFract: A software and data model implementation for flexible acquisition and processing of fracture data from outcrops

    NASA Astrophysics Data System (ADS)

    Hardebol, N. J.; Bertotti, G.

    2013-04-01

    This paper presents the development and use of our new DigiFract software designed for acquiring fracture data from outcrops more efficiently and more completely than done with other methods. Fracture surveys often aim at measuring spatial information (such as spacing) directly in the field. Instead, DigiFract focuses on collecting geometries and attributes and derives spatial information through subsequent analyses. Our primary development goal was to support field acquisition in a systematic digital format optimized for a varied range of (spatial) analyses. DigiFract is developed using the programming interface of the Quantum Geographic Information System (GIS), with versatile functionality for spatial raster and vector data handling. Among other features, this includes spatial referencing of outcrop photos, and tools for digitizing geometries and assigning attribute information through a graphical user interface. While a GIS typically operates in map-view, DigiFract collects features on a surface of arbitrary orientation in 3D space. This surface is overlain with an outcrop photo and serves as a reference frame for digitizing geologic features. Data is managed through a data model and stored in shapefiles or in a spatial database system. Fracture attributes, such as spacing or length, are intrinsic information of the digitized geometry and become explicit through follow-up data processing. Orientation statistics, scan-line or scan-window analyses can be performed from the graphical user interface or can be obtained through flexible Python scripts that directly access the fractdatamodel and analysisLib core modules of DigiFract. This workflow has been applied in various studies and enabled a faster collection of larger and more accurate fracture datasets. The studies delivered a better characterization of fractured reservoir analogues in terms of fracture orientation and intensity distributions. Furthermore, the data organisation and analyses provided more
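The scan-line analysis mentioned above can be illustrated concretely. This is a sketch with invented numbers; it does not use DigiFract's actual API:

```python
# Hypothetical scan-line analysis: given fracture intersection positions
# measured along a scan line, derive spacing statistics and the linear
# fracture intensity P10 (fractures per unit length of scan line).
positions = [0.4, 1.1, 1.9, 2.6, 3.8, 4.5]   # metres along the scan line (invented)
line_length = 5.0                             # metres

spacings = [b - a for a, b in zip(positions, positions[1:])]
mean_spacing = sum(spacings) / len(spacings)  # 0.82 m
p10 = len(positions) / line_length            # 1.2 fractures per metre
```

Spacing and intensity here are derived from digitized geometry after the fact, which is exactly the workflow shift the record describes: collect geometries in the field, compute spatial statistics later.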

  4. Studies on the Electro-Polishing process with Nb sample plates at KEK

    SciTech Connect

    Saeki, Takayuki; Funahashi, Y.; Hayano, Hitoshi; Kato, Seigo; Nishiwaki, Michiru; Sawabe, Motoaki; Ueno, Kenji; Watanabe, K.; Clemens, William A.; Geng, Rongli; Manus, Robert L.; Tyagi, Puneet

    2009-11-01

    In this article, two subjects are described. The first is the production of stains on the surface of Nb sample plates in the electro-polishing (EP) process, and the second is the development of defects/pits on the surface of a Nb sample plate during the EP process. Recently, some 9-cell cavities were treated with new EP acid at KEK, and the performance of these cavities was limited by heavy field emission. On the inside surface of these cavities, brown stains were observed. We made an effort to reproduce the brown stains on Nb sample plates with a laboratory EP setup, varying the concentration of niobium in the EP acid. We found that the brown stains would appear only when processed with new EP acid. For the second subject, we made artificial pits on the surface of a Nb sample plate and observed the development of the pits after each step of a 30 um EP process, in which 120 um was removed in total. This article describes this series of EP tests with Nb sample plates at KEK.

  5. Extended Characterization of Chemical Processes in Hot Cells Using Environmental Swipe Samples

    SciTech Connect

    Olsen, Khris B.; Mitroshkov, Alexandre V.; Thomas, M-L; Lepel, Elwood A.; Brunson, Ronald R.; Ladd-Lively, Jennifer

    2012-09-15

    Environmental sampling is used extensively by the International Atomic Energy Agency (IAEA) for verification of information from State declarations or a facility’s design regarding nuclear activities occurring within the country or a specific facility. Environmental sampling of hot cells within a facility under safeguards is conducted using 10.2 cm x 10.2 cm cotton swipe material or cellulose swipes. Traditional target analytes used by the IAEA to verify operations within a facility include a select list of gamma-emitting radionuclides and total and isotopic U and Pu. Analysis of environmental swipe samples collected within a hot-cell facility where chemical processing occurs may also provide information regarding specific chemicals used in fuel processing. However, using swipe material to elucidate what specific chemical processes were or are being used within a hot cell has not been previously evaluated. Staff from Pacific Northwest National Laboratory (PNNL) and Oak Ridge National Laboratory (ORNL) teamed to evaluate the potential use of environmental swipe samples as collection media for volatile and semivolatile organic compounds. This evaluation was initiated with sample collection during a series of Coupled End-to-End (CETE) reprocessing runs at ORNL. The study included measurement of gamma-emitting radionuclides, total and isotopic U and Pu, and volatile and semivolatile organic compounds. These results allowed us to elucidate which chemical processes were used in the hot cells during reprocessing of power reactor fuel and to identify other legacy chemicals, predating the CETE process, used in hot-cell operations.

  6. Random Sampling Process Leads to Overestimation of β-Diversity of Microbial Communities

    PubMed Central

    Zhou, Jizhong; Jiang, Yi-Huei; Deng, Ye; Shi, Zhou; Zhou, Benjamin Yamin; Xue, Kai; Wu, Liyou; He, Zhili; Yang, Yunfeng

    2013-01-01

    The site-to-site variability in species composition, known as β-diversity, is crucial to understanding spatiotemporal patterns of species diversity and the mechanisms controlling community composition and structure. However, quantifying β-diversity in microbial ecology using sequencing-based technologies is a great challenge because of a high number of sequencing errors, bias, and poor reproducibility and quantification. Herein, based on general sampling theory, a mathematical framework is first developed for simulating the effects of random sampling processes on quantifying β-diversity when the community size is known or unknown. Also, using an analogous ball example under Poisson sampling with limited sampling efforts, the developed mathematical framework can exactly predict the low reproducibility among technically replicate samples from the same community of a certain species abundance distribution, which provides explicit evidence of random sampling processes as the main factor causing high percentages of technical variation. In addition, the predicted values under Poisson random sampling were highly consistent with the observed low percentages of operational taxonomic unit (OTU) overlap (<30% and <20% for two and three tags, respectively, based on both Jaccard and Bray-Curtis dissimilarity indices), further supporting the hypothesis that the poor reproducibility among technical replicates is due to the artifacts associated with random sampling processes. Finally, a mathematical framework was developed for predicting the sampling efforts needed to achieve a desired overlap among replicate samples. Our modeling simulations predict that several orders of magnitude more sequencing effort is needed to achieve the desired high technical reproducibility. These results suggest that great caution needs to be taken in quantifying and interpreting β-diversity for microbial community analysis using next-generation sequencing technologies. PMID:23760464
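The random-sampling effect the authors describe can be reproduced with a toy simulation. The community model, read depth, and abundance distribution below are invented for illustration, not taken from the paper:

```python
import random

random.seed(1)

# Toy simulation: two technical replicates are drawn from the *same*
# community by random sampling of reads; the Jaccard overlap of detected
# OTUs still falls well below 1, purely because of the sampling process.
n_otus = 5000
abundances = [random.lognormvariate(0, 2) for _ in range(n_otus)]  # skewed community
total = sum(abundances)
weights = [a / total for a in abundances]

def detected_otus(reads):
    """Draw `reads` sequence reads and return the set of OTUs observed."""
    return set(random.choices(range(n_otus), weights=weights, k=reads))

rep1 = detected_otus(20000)
rep2 = detected_otus(20000)
jaccard = len(rep1 & rep2) / len(rep1 | rep2)  # < 1 despite identical source
```

Because rare OTUs are detected stochastically, the two replicate sets differ even though they come from one fixed community, which is the paper's central point about apparent β-diversity between technical replicates.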

  7. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  8. Occurrence of Arcobacter in Iranian poultry and slaughterhouse samples implicates contamination by processing equipment and procedures.

    PubMed

    Khoshbakht, R; Tabatabaei, M; Shirzad Aski, H; Seifi, S

    2014-01-01

    1. The occurrence of Arcobacter spp. and three pathogenic species of Arcobacter from Iranian poultry carcasses was investigated at different steps of broiler processing to determine critical control points for reducing carcass contamination. 2. Samples were collected from (a) cloaca immediately before processing, (b) different points during processing and (c) different stations in a processing plant of a slaughterhouse in southern Iran. 3. After enrichment steps in Arcobacter selective broth, DNA of the samples was extracted and three significant pathogenic species of Arcobacter were identified based on polymerase chain reaction (PCR) detection of 16S rRNA and species-specific PCR. 4. Out of a total of 540 samples, 244 (45%) were positive for Arcobacter spp. Arcobacter butzleri was more frequently detected (73% ± 13.9%) than A. cryaerophilus (9% ± 13.9%) and A. skirrowii (4.1%). In addition, co-colonisation (A. butzleri and A. cryaerophilus) occurred in 13.9% of the positive samples. 5. The results indicate a high prevalence of Arcobacter in the investigated slaughterhouse and broiler carcasses and that Arcobacter is not part of the normal flora of broilers. Evidence for the presence of Arcobacter in the environment and water of processing plants suggests that these are sources of contamination of poultry carcasses. In addition, contamination of the poultry carcasses can spread between poultry meats in different parts and processes of the slaughterhouse (pre-scalding to after evisceration).

  9. Developing a simple method to process bone samples prior to DNA isolation.

    PubMed

    Li, Richard; Chapman, Sandra; Thompson, Mary; Schwartz, Michal

    2009-03-01

    Bone tissue is often used for recovering DNA samples for the purpose of human identification. However, the initial cleaning and sampling of the bone specimen is a labor-intensive and time-consuming step, which must be completed prior to isolating DNA. Thus, it is difficult to adapt the current method for automation. To address this issue, we have developed a simple processing method using a trypsin treatment prior to DNA isolation. The use of the trypsin-based procedure potentially reduces the amount of labor required by a physical method such as sanding. By incubating samples with the trypsin solution, the soft tissue and outer surface of the bone fragment samples are removed. The processed bone fragment or a portion of the fragment can then be used for DNA isolation.

  10. Managing Radical Change in Acquisition

    DTIC Science & Technology

    1998-01-01

    some process innovation, acquisition continues to plague the Defense System and constrain battlefield mobility, information, and speed. Following the... Mark Nissen is Assistant Professor of Information Systems and Acquisition Management at the Naval Postgraduate School (NPS) in Monterey, CA. He

  11. [Preparation of samples for proficiency testing of pesticide residue analysis in processed foods].

    PubMed

    Okihashi, Masahiro; Osakada, Masakazu; Uchida, Kotaro; Nagayoshi, Haruna; Yamaguchi, Takahiro; Kakimoto, Kensaku; Nakayama, Yukiko; Obana, Hirotaka

    2010-01-01

    To conduct proficiency testing for the analysis of pesticide residues in processed foods, fortified samples of retort curry and pancake were examined. In the case of retort curry, heating and mixing were necessary at the time of preparation to provide a homogeneous analytical sample. A mixture of 4 carbamates and 11 organophosphorus pesticides was spiked, and 14 of them showed consistent results in the samples. In the case of pancake, 10 kinds of pesticides were added to the pastry. The prepared pastry was then cooked. The relative concentrations of most of the pesticides in the pancake were not affected, and all the pesticides showed consistent results in the samples. These results showed that the two tested samples were suitable for proficiency testing.

  12. Impact of Processing Method on Recovery of Bacteria from Wipes Used in Biological Surface Sampling

    PubMed Central

    Olson, Nathan D.; Filliben, James J.; Morrow, Jayne B.

    2012-01-01

    Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense. PMID:22706055

  13. How can wireless, mobile data acquisition be used for taking part of the lab to the sample, and how can it join the internet of things?

    NASA Astrophysics Data System (ADS)

    Trzcinski, Peter; Karanassios, Vassili

    2016-05-01

    During the last several years, the world has moved from wired communications (e.g., wired ethernet, wired telephone) to wireless communications (e.g., cell phones, smart phones, tablets). However, data acquisition has lagged behind and, for the most part, data in laboratory settings are still acquired using wired communications (or even plug-in boards). In this paper, approaches that can be used for wireless data acquisition are briefly discussed using a conceptual model of a future, mobile, portable micro-instrument as an example. In addition, past, present and near-future generations of communications are discussed; processors, operating systems and benchmarks are reviewed; networks that may be used for data acquisition in the field are examined; and the possibility of connecting sensor or micro-instrument networks to the internet of things is postulated.

  14. A unified method to process biosolids samples for the recovery of bacterial, viral, and helminths pathogens.

    PubMed

    Alum, Absar; Rock, Channah; Abbaszadegan, Morteza

    2014-01-01

    For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in residual biosolids. The current EPA methods for the detection of these groups of pathogens in biosolids include discrete steps. Therefore, a separate sample is processed independently to quantify the number of each group of pathogens in biosolids. The aim of the study was to develop a unified method for simultaneous processing of a single biosolids sample to recover bacterial, viral, and helminth pathogens. At the first stage of developing a simultaneous method, nine eluents were compared for their efficiency in recovering viruses from a 100 g spiked biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments were performed to optimize the performance of the glycine-based eluent under various procedural factors, such as solids-to-eluent ratio, stir time, and centrifugation conditions. Finally, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked into duplicate samples of biosolids collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method compared to 34% and 68% by the EPA method, respectively. The unified sample processing method significantly reduces the time required for processing biosolids samples for different groups of pathogens; it is less impacted by the intrinsic variability of samples, while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.

  15. The Symbolic World of the Bilingual Child: Digressions on Language Acquisition, Culture and the Process of Thinking

    ERIC Educational Resources Information Center

    Nowak-Fabrykowski, Krystyna; Shkandrij, Miroslav

    2004-01-01

    In this paper we explore the relationship between language acquisition, and the construction of a symbolic world. According to Bowers (1989) language is a collection of patterns regulating social life. This conception is close to that of Symbolic Interactionists (Charon, 1989) who see society as made up of interacting individuals who are symbol…

  16. The Effect of Age of Second Language Acquisition on the Representation and Processing of Second Language Words

    ERIC Educational Resources Information Center

    Silverberg, Stu; Samuel, Arthur G.

    2004-01-01

    In this study, the effects of second language (i.e., L2) proficiency and age of second language acquisition are assessed. Three types of bilinguals are compared: Early L2 learners, Late highly proficient L2 learners, and Late less proficient L2 learners. A lexical decision priming paradigm is used in which the critical trials consist of first…

  17. The Influence of Type and Token Frequency on the Acquisition of Affixation Patterns: Implications for Language Processing

    ERIC Educational Resources Information Center

    Endress, Ansgar D.; Hauser, Marc D.

    2011-01-01

    Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…

  18. Faulting processes in active faults - Evidences from TCDP and SAFOD drill core samples

    SciTech Connect

    Janssen, C.; Wirth, R.; Wenk, H. -R.; Morales, L.; Naumann, R.; Kienast, M.; Song, S. -R.; Dresen, G.

    2014-08-20

    The microstructures, mineralogy and chemistry of representative samples collected from the cores of the San Andreas Fault drill hole (SAFOD) and the Taiwan Chelungpu-Fault Drilling project (TCDP) have been studied using optical microscopy, TEM, SEM, XRD and XRF analyses. SAFOD samples provide a transect across undeformed host rock, the fault damage zone and currently active deforming zones of the San Andreas Fault. TCDP samples are retrieved from the principal slip zone (PSZ) and from the surrounding damage zone of the Chelungpu Fault. Substantial differences exist in the clay mineralogy of SAFOD and TCDP fault gouge samples. Amorphous material has been observed in SAFOD as well as TCDP samples. In line with previous publications, we propose that melt, observed in TCDP black gouge samples, was produced by seismic slip (melt origin), whereas amorphous material in SAFOD samples was formed by comminution of grains (crush origin) rather than by melting. Dauphiné twins in quartz grains of SAFOD and TCDP samples may indicate high seismic stress. The differences in the crystallographic preferred orientation of calcite between SAFOD and TCDP samples are significant. Microstructures resulting from dissolution–precipitation processes were observed in both faults but are more frequently found in SAFOD samples than in TCDP fault rocks. As already described for many other fault zones, clay-gouge fabrics are quite weak in SAFOD and TCDP samples. Clay-clast aggregates (CCAs), proposed to indicate frictional heating and thermal pressurization, occur in material taken from the PSZ of the Chelungpu Fault, as well as within and outside of the SAFOD deforming zones, indicating that these microstructures were formed over a wide range of slip rates.

  19. Aqueous Processing of Atmospheric Organic Particles in Cloud Water Collected via Aircraft Sampling.

    PubMed

    Boone, Eric J; Laskin, Alexander; Laskin, Julia; Wirth, Christopher; Shepson, Paul B; Stirm, Brian H; Pratt, Kerri A

    2015-07-21

    Cloudwater and below-cloud atmospheric particle samples were collected onboard a research aircraft during the Southern Oxidant and Aerosol Study (SOAS) over a forested region of Alabama in June 2013. The organic molecular composition of the samples was studied to gain insights into the aqueous-phase processing of organic compounds within cloud droplets. High resolution mass spectrometry (HRMS) with nanospray desorption electrospray ionization (nano-DESI) and direct infusion electrospray ionization (ESI) were utilized to compare the organic composition of the particle and cloudwater samples, respectively. Isoprene and monoterpene-derived organosulfates and oligomers were identified in both the particles and cloudwater, showing the significant influence of biogenic volatile organic compound oxidation above the forested region. While the average O:C ratios of the organic compounds were similar between the atmospheric particle and cloudwater samples, the chemical composition of these samples was quite different. Specifically, hydrolysis of organosulfates and formation of nitrogen-containing compounds were observed for the cloudwater when compared to the atmospheric particle samples, demonstrating that cloud processing changes the composition of organic aerosol.

  20. Aqueous Processing of Atmospheric Organic Particles in Cloud Water Collected via Aircraft Sampling

    SciTech Connect

    Boone, Eric J.; Laskin, Alexander; Laskin, Julia; Wirth, Christopher; Shepson, Paul B.; Stirm, Brian H.; Pratt, Kerri A.

    2015-07-21

    Cloud water and below-cloud atmospheric particle samples were collected onboard a research aircraft during the Southern Oxidant and Aerosol Study (SOAS) over a forested region of Alabama in June 2013. The organic molecular composition of the samples was studied to gain insights into the aqueous-phase processing of organic compounds within cloud droplets. High resolution mass spectrometry with nanospray desorption electrospray ionization and direct infusion electrospray ionization were utilized to compare the organic composition of the particle and cloud water samples, respectively. Isoprene and monoterpene-derived organosulfates and oligomers were identified in both the particles and cloud water, showing the significant influence of biogenic volatile organic compound oxidation above the forested region. While the average O:C ratios of the organic compounds were similar between the atmospheric particle and cloud water samples, the chemical composition of these samples was quite different. Specifically, hydrolysis of organosulfates and formation of nitrogen-containing compounds were observed for the cloud water when compared to the atmospheric particle samples, demonstrating that cloud processing changes the composition of organic aerosol.
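The O:C comparison mentioned in both records can be computed directly from assigned molecular formulas. A minimal sketch follows; the formulas and the simple element parser are illustrative inventions, not the study's actual assignments:

```python
import re

# Illustrative mean O:C calculation from assigned molecular formulas.
# The formulas below are invented examples of the compound classes named
# in the abstract (an organosulfate, a nitrogen-containing compound, ...).
formulas = ["C5H10O4", "C10H16O5S", "C5H11NO4"]

def element_count(element, formula):
    """Count one single-letter element (C, H, N, O, S) in a simple formula."""
    m = re.search(rf"{element}(\d*)", formula)
    return int(m.group(1) or 1) if m else 0

oc_ratios = [element_count("O", f) / element_count("C", f) for f in formulas]
mean_oc = sum(oc_ratios) / len(oc_ratios)  # (0.8 + 0.5 + 0.8) / 3 = 0.7
```

As the abstract notes, a similar mean O:C can coexist with quite different molecular compositions, which is why the per-compound comparison (organosulfate hydrolysis, nitrogen-containing products) was the more informative diagnostic.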

  1. 21 CFR 211.110 - Sampling and testing of in-process materials and drug products.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 Food and Drugs (2010-04-01): Sampling and testing of in-process materials and drug products. Section 211.110, Food and Drugs, FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR...

  2. Sampling and Hydrogeology of the Vadose Zone Beneath the 300 Area Process Ponds

    SciTech Connect

    Bjornstad, Bruce N.

    2004-08-31

    Four open pits were dug with a backhoe into the vadose zone beneath the former 300 Area Process Ponds in April 2003. Samples were collected about every 2 feet for physical, chemical, and/or microbiological characterization. This report presents a stratigraphic and geohydrologic summary of the four excavations.

  3. Pippi — Painless parsing, post-processing and plotting of posterior and likelihood samples

    NASA Astrophysics Data System (ADS)

    Scott, Pat

    2012-11-01

    Interpreting samples from likelihood or posterior probability density functions is rarely as straightforward as it seems it should be. Producing publication-quality graphics of these distributions is often similarly painful. In this short note I describe pippi, a simple, publicly available package for parsing and post-processing such samples, as well as generating high-quality PDF graphics of the results. Pippi is easily and extensively configurable and customisable, both in its options for parsing and post-processing samples, and in the visual aspects of the figures it produces. I illustrate some of these using an existing supersymmetric global fit, performed in the context of a gamma-ray search for dark matter. Pippi can be downloaded and followed at http://github.com/patscott/pippi.
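The kind of post-processing pippi automates can be sketched with plain NumPy. This is a generic illustration of binning weighted posterior samples into a 1D marginal, not pippi's own code:

```python
import numpy as np

# Generic sketch of posterior-sample post-processing: bin weighted chain
# samples of one parameter into a normalised 1D marginal distribution.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 10_000)   # stand-in for one parameter column
weights = np.ones_like(samples)          # multiplicity / posterior weights

marginal, edges = np.histogram(samples, bins=40, weights=weights, density=True)
# `marginal` integrates to 1 over the binned range and is ready to plot.
```

A profile-likelihood variant would instead take the maximum likelihood within each bin rather than summing weights; tools like pippi typically support both views of the same sample file.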

  4. Proposal for field sampling of plants and processing in the lab for environmental metabolic fingerprinting

    PubMed Central

    2010-01-01

Background Samples for plant metabolic fingerprinting are generally prepared by quenching metabolism, grinding the plant material and extracting the metabolites in solvents. Further concentration and derivatisation steps follow, depending on the nature of the sample and the available analytical platform. Several methods, such as collection in liquid nitrogen, are not applicable to plant material sampled in the field. Therefore, a protocol was established for sample pre-treatment, grinding, extraction and storage that can be used for field-collected plant material that is further processed in the laboratory. Ribwort plantain (Plantago lanceolata L., Plantaginaceae) was used as the model plant. The quality criteria for method suitability were high reproducibility, extraction efficiency and handling comfort of each subsequent processing step. Results The highest reproducibility was achieved by sampling fresh plant material in a solvent mixture of methanol:dichloromethane (2:1), crushing the tissue with a hand-held disperser and storing the material until further processing. In the laboratory the material was extracted threefold at different pH. The resulting extracts were separated with water (2:1:1 methanol:dichloromethane:water) and the aqueous phases used for analysis by LC-MS, because the focus was on polar metabolites. Chromatograms were compared by calculating a similarity value Ξ. Advantages and disadvantages of different sample pre-treatment methods, use of solvents and solvent mixtures, influence of pH, extraction frequency and duration, and storage temperature are discussed with regard to the quality criteria. Conclusions The proposed extraction protocol leads to highly reproducible metabolic fingerprints and allows optimal handling of field-collected plant material and further processing in the laboratory, which is demonstrated for an exemplary field data-set. Calculation of Ξ values is a useful tool to judge similarities between
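The abstract does not give the definition of Ξ; as a stand-in, a generic similarity measure between binned chromatograms (cosine similarity here, plainly not the paper's formula) illustrates the kind of replicate-vs-replicate comparison involved:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two binned chromatogram intensity vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rep1 = [0.0, 1.2, 5.4, 3.3, 0.8, 0.1]   # intensities per retention-time bin
rep2 = [0.1, 1.1, 5.1, 3.5, 0.7, 0.2]   # technical replicate, slight variation
other = [2.0, 0.2, 0.4, 0.3, 4.8, 1.9]  # different extraction conditions

print(cosine_similarity(rep1, rep2) > cosine_similarity(rep1, other))  # True
```

Replicates of the same protocol score near 1, while extracts obtained under different conditions score much lower, which is the property a reproducibility criterion needs.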

  5. Solar-thermal complex sample processing for nucleic acid based diagnostics in limited resource settings

    PubMed Central

    Gumus, Abdurrahman; Ahsan, Syed; Dogan, Belgin; Jiang, Li; Snodgrass, Ryan; Gardner, Andrea; Lu, Zhengda; Simpson, Kenneth; Erickson, David

    2016-01-01

The use of point-of-care (POC) devices in limited resource settings, where access to commonly used infrastructure such as water and electricity can be restricted, represents simultaneously one of the best application fits for POC systems and one of the most challenging places to deploy them. Among the many challenges involved in these systems, the preparation and processing of complex samples like stool, vomit, and biopsies are particularly difficult due to the high number and varied nature of mechanical and chemical interferents present in the sample. Previously we have demonstrated the ability to use solar-thermal energy to perform PCR-based nucleic acid amplifications. In this work we demonstrate how the technique, using similar infrastructure, can also be used to build a solar-thermal sample processing system for extracting and isolating Vibrio cholerae nucleic acids from fecal samples. The use of opto-thermal energy enables sunlight to drive thermal lysing reactions in large volumes without the need for external electrical power. Using the system we demonstrate the ability to reach a 95°C threshold in less than 5 minutes and maintain the sample temperature stable to within ±2°C after the ramp up. The system provides linear results between 10⁴ and 10⁸ CFU/mL when the released nucleic acids are quantified by traditional means. Additionally, we couple the sample processing unit with our previously demonstrated solar-thermal PCR and tablet-based detection system to demonstrate very low power sample-in-answer-out detection. PMID:27231636

  6. Sampling frequency affects the processing of Actigraph raw acceleration data to activity counts.

    PubMed

    Brønd, Jan Christian; Arvidsson, Daniel

    2016-02-01

    ActiGraph acceleration data are processed through several steps (including band-pass filtering to attenuate unwanted signal frequencies) to generate the activity counts commonly used in physical activity research. We performed three experiments to investigate the effect of sampling frequency on the generation of activity counts. Ideal acceleration signals were produced in the MATLAB software. Thereafter, ActiGraph GT3X+ monitors were spun in a mechanical setup. Finally, 20 subjects performed walking and running wearing GT3X+ monitors. Acceleration data from all experiments were collected with different sampling frequencies, and activity counts were generated with the ActiLife software. With the default 30-Hz (or 60-Hz, 90-Hz) sampling frequency, the generation of activity counts was performed as intended with 50% attenuation of acceleration signals with a frequency of 2.5 Hz by the signal frequency band-pass filter. Frequencies above 5 Hz were eliminated totally. However, with other sampling frequencies, acceleration signals above 5 Hz escaped the band-pass filter to a varied degree and contributed to additional activity counts. Similar results were found for the spinning of the GT3X+ monitors, although the amount of activity counts generated was less, indicating that raw data stored in the GT3X+ monitor is processed. Between 600 and 1,600 more counts per minute were generated with the sampling frequencies 40 and 100 Hz compared with 30 Hz during running. Sampling frequency affects the processing of ActiGraph acceleration data to activity counts. Researchers need to be aware of this error when selecting sampling frequencies other than the default 30 Hz.
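Why in-band content survives while higher-frequency content should contribute nothing at the default setting can be sketched with a crude numpy band-pass. ActiLife's proprietary filter is approximated here by a hard FFT mask passing roughly 0.25-5 Hz (the gradual 50% roll-off at 2.5 Hz is not modelled); all parameters are illustrative:

```python
import numpy as np

def bandpass_fft(signal, fs, f_lo=0.25, f_hi=5.0):
    """Crude FFT band-pass: zero every frequency bin outside [f_lo, f_hi] Hz."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))

fs = 30.0                            # default ActiGraph sampling frequency (Hz)
t = np.arange(0, 60, 1.0 / fs)       # one minute of data
walk = np.sin(2 * np.pi * 2.0 * t)   # 2 Hz component, inside the pass band
noise = np.sin(2 * np.pi * 7.0 * t)  # 7 Hz component, should be rejected

counts_walk = np.abs(bandpass_fft(walk, fs)).sum()
counts_noise = np.abs(bandpass_fft(noise, fs)).sum()
print(counts_walk > 100 * counts_noise)  # the 7 Hz signal contributes ~nothing
```

The study's finding is that at sampling frequencies other than 30/60/90 Hz, content above 5 Hz leaks past the real filter and inflates the counts, which this idealized mask does not reproduce by construction.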

  7. Solar-thermal complex sample processing for nucleic acid based diagnostics in limited resource settings.

    PubMed

    Gumus, Abdurrahman; Ahsan, Syed; Dogan, Belgin; Jiang, Li; Snodgrass, Ryan; Gardner, Andrea; Lu, Zhengda; Simpson, Kenneth; Erickson, David

    2016-05-01

The use of point-of-care (POC) devices in limited resource settings, where access to commonly used infrastructure such as water and electricity can be restricted, represents simultaneously one of the best application fits for POC systems and one of the most challenging places to deploy them. Among the many challenges involved in these systems, the preparation and processing of complex samples like stool, vomit, and biopsies are particularly difficult due to the high number and varied nature of mechanical and chemical interferents present in the sample. Previously we have demonstrated the ability to use solar-thermal energy to perform PCR-based nucleic acid amplifications. In this work we demonstrate how the technique, using similar infrastructure, can also be used to build a solar-thermal sample processing system for extracting and isolating Vibrio cholerae nucleic acids from fecal samples. The use of opto-thermal energy enables sunlight to drive thermal lysing reactions in large volumes without the need for external electrical power. Using the system we demonstrate the ability to reach a 95°C threshold in less than 5 minutes and maintain the sample temperature stable to within ±2°C after the ramp up. The system provides linear results between 10⁴ and 10⁸ CFU/mL when the released nucleic acids are quantified by traditional means. Additionally, we couple the sample processing unit with our previously demonstrated solar-thermal PCR and tablet-based detection system to demonstrate very low power sample-in-answer-out detection.

  8. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    DOE PAGES

    Moran, James; Alexander, Thomas; Aalseth, Craig; ...

    2017-01-26

Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. Here, we present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. We also identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, correlates to as little as 0.00133 Bq of total T activity. Furthermore, this enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment.
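As a cross-check, the quoted quantification limit can be reproduced from the definition of the tritium unit (1 T atom per 10^18 H atoms) and the decay law, activity = λN with λ = ln 2 / t½. The half-life (~12.32 y) and physical constants below are standard values, not taken from the record:

```python
import math

AVOGADRO = 6.022e23
M_WATER = 18.015                            # g/mol
HALF_LIFE_S = 12.32 * 365.25 * 24 * 3600    # tritium half-life, ~12.32 years

def tu_to_bq(tu, water_mass_g):
    """Activity (Bq) of a water sample at a given tritium unit (TU) level.

    1 TU = one tritium atom per 1e18 hydrogen atoms."""
    h_atoms = 2 * (water_mass_g / M_WATER) * AVOGADRO  # two H atoms per H2O
    t_atoms = tu * 1e-18 * h_atoms
    decay_const = math.log(2) / HALF_LIFE_S            # lambda = ln 2 / t_half
    return t_atoms * decay_const

activity = tu_to_bq(92.2, 0.120)
print(round(activity, 5))  # ~0.00132 Bq, matching the stated 0.00133 Bq
```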

  9. Pesticide-sampling equipment, sample-collection and processing procedures, and water-quality data at Chicod Creek, North Carolina, 1992

    USGS Publications Warehouse

    Manning, T.K.; Smith, K.E.; Wood, C.D.; Williams, J.B.

    1994-01-01

Water-quality samples were collected from Chicod Creek in the Coastal Plain Province of North Carolina during the summer of 1992 as part of the U.S. Geological Survey's National Water-Quality Assessment Program. Chicod Creek is in the Albemarle-Pamlico drainage area, one of four study units designated to test equipment and procedures for collecting and processing samples for the solid-phase extraction of selected pesticides. The equipment and procedures were used to isolate 47 pesticides, including organonitrogen, carbamate, organochlorine, organophosphate, and other compounds, targeted for analysis by gas chromatography/mass spectrometry. Sample-collection and processing equipment, equipment-cleaning and set-up procedures, methods for collecting, splitting, and solid-phase extraction of samples, and water-quality data resulting from the field test are presented in this report. Most problems encountered during this intensive sampling exercise were operational difficulties related to the equipment used to process samples.

  10. A comparison between data processing techniques for FTS based on high frequency interferogram sampling

    NASA Astrophysics Data System (ADS)

    Panzeri, R.; Saggin, S.; Scaccabarozzi, D.; Tarabini, M.

    2016-10-01

This paper compares different data processing techniques for FTS with the aim of assessing the feasibility of a spectrometer based on standard DAC boards, without dedicated hardware for sampling and speed control of the moving mirrors. Fourier transform spectrometers rely on sampling the interferogram at constant steps of the optical path difference (OPD) to evaluate spectra through a standard discrete Fourier transform. Constant-OPD sampling is traditionally achieved with dedicated hardware, but sampling methods based on common analog-to-digital converters with large dynamic range and high sampling frequency have recently become viable when combined with specific data processing techniques. These methods are less sensitive to disturbances, in particular mechanical vibrations, and should also be less sensitive to OPD speed errors. In this work the performances of three algorithms were compared in terms of robustness against mechanical vibrations and OPD speed errors: two taken from the literature, based on phase demodulation of a reference interferogram, and a method based on direct phase computation of the reference interferogram. All methods provided almost correct spectra with vibration amplitudes up to 10% of the average OPD speed and speed drifts within the scan up to 20% of the average, as long as the disturbance frequency was lower than the nominal frequency of the reference signal. The method based on the arccosine function also keeps working with disturbance frequencies larger than that of the reference channel, the common limit of the other two.
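The common idea behind these algorithms can be sketched numpy-only: recover the OPD at each ADC sample from the phase of a monochromatic reference interferogram, resample the science channel on a constant-OPD grid, then take the DFT. The wavelengths, rates, and speed-error model below are illustrative assumptions, and the demodulator shown is a generic FFT Hilbert transform rather than any of the paper's three algorithms:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via an FFT-based Hilbert transform (numpy-only)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[1:n // 2] = 2.0
        h[n // 2] = 1.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(np.fft.fft(x) * h)

lam = 632.8e-9                  # assumed He-Ne reference wavelength (m)
fs = 20000.0                    # assumed ADC sampling frequency (Hz)
t = np.arange(0, 0.5, 1.0 / fs)
# mirror with a 5% sinusoidal OPD speed error -> samples non-uniform in OPD
opd_true = 2e-3 * t * (1 + 0.05 * np.sin(2 * np.pi * 2 * t))
ref = np.cos(2 * np.pi * opd_true / lam)    # reference interferogram
sigma = 2e5                                 # science line wavenumber (1/m)
sci = np.cos(2 * np.pi * sigma * opd_true)  # science interferogram

# demodulate the reference channel to recover the OPD at every ADC sample
phase = np.unwrap(np.angle(analytic_signal(ref)))
opd_rec = phase * lam / (2 * np.pi)

# resample the science channel at constant OPD steps, then take the DFT
opd_u = np.linspace(opd_rec[0], opd_rec[-1], len(opd_rec))
sci_u = np.interp(opd_u, opd_rec, sci)
spectrum = np.abs(np.fft.rfft(sci_u))
wn = np.fft.rfftfreq(len(sci_u), d=opd_u[1] - opd_u[0])
peak = wn[np.argmax(spectrum)]
print(abs(peak - sigma) / sigma < 0.02)  # line recovered despite speed error
```

Without the resampling step, the speed error smears the line; after resampling, the peak lands at the correct wavenumber to within the resolution set by the total OPD travel.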

  11. The effect of humidity on samples of microcrystalline cellulose taken from the extrusion/marumerization process.

    PubMed

    Mayville, F C; Atassi, F; Wigent, R J; Schwartz, J B

    1999-01-01

The purpose of this work was to examine the sorption and desorption of water by various samples of microcrystalline cellulose, MCC (Avicel PH-101), taken from the extrusion/marumerization process, and to provide data that may explain how water affects the MCC polymer matrix during the formation of beads. Two isopiestic (humidity) studies were conducted: the first used samples exposed directly to controlled humidity conditions, whereas the second used samples that were freeze-dried before being exposed to controlled humidity conditions. Water sorption and desorption were determined gravimetrically. When both sets of samples were initially exposed to low-humidity conditions, they reached equilibrium by desorbing water. When these samples were initially exposed to high-humidity conditions, the high-moisture-content samples desorbed water, whereas the low-moisture-content and the freeze-dried samples sorbed water to reach equilibrium. When the first set of samples was exposed first to high- and then to low-humidity conditions, they reached the same water content achieved by being equilibrated directly at the low-humidity condition. However, samples that were exposed first to low- and then to high-humidity conditions had equilibrium water contents lower than those achieved by being equilibrated directly at the high-humidity condition. The original MCC systems exhibit a hysteretic effect above 85% relative humidity, whereas the freeze-dried systems show a broader hysteretic effect starting at 20% relative humidity. The results suggest that the internal structure of the MCC polymer fibers must change with the sorption and desorption of water, supporting the autohesion theory.

  12. Technical note: Sampling and processing of mesocosm sediment trap material for quantitative biogeochemical analysis

    NASA Astrophysics Data System (ADS)

    Boxhammer, Tim; Bach, Lennart T.; Czerny, Jan; Riebesell, Ulf

    2016-05-01

    Sediment traps are the most common tool to investigate vertical particle flux in the marine realm. However, the spatial and temporal decoupling between particle formation in the surface ocean and particle collection in sediment traps at depth often handicaps reconciliation of production and sedimentation even within the euphotic zone. Pelagic mesocosms are restricted to the surface ocean, but have the advantage of being closed systems and are therefore ideally suited to studying how processes in natural plankton communities influence particle formation and settling in the ocean's surface. We therefore developed a protocol for efficient sample recovery and processing of quantitatively collected pelagic mesocosm sediment trap samples for biogeochemical analysis. Sedimented material was recovered by pumping it under gentle vacuum through a silicon tube to the sea surface. The particulate matter of these samples was subsequently separated from bulk seawater by passive settling, centrifugation or flocculation with ferric chloride, and we discuss the advantages and efficiencies of each approach. After concentration, samples were freeze-dried and ground with an easy to adapt procedure using standard lab equipment. Grain size of the finely ground samples ranged from fine to coarse silt (2-63 µm), which guarantees homogeneity for representative subsampling, a widespread problem in sediment trap research. Subsamples of the ground material were perfectly suitable for a variety of biogeochemical measurements, and even at very low particle fluxes we were able to get a detailed insight into various parameters characterizing the sinking particles. The methods and recommendations described here are a key improvement for sediment trap applications in mesocosms, as they facilitate the processing of large amounts of samples and allow for high-quality biogeochemical flux data.

  13. Graphics processing unit (GPU) implementation of image processing algorithms to improve system performance of the control acquisition, processing, and image display system (CAPIDS) of the micro-angiographic fluoroscope (MAF)

    NASA Astrophysics Data System (ADS)

    Swetadri Vasan, S. N.; Ionita, Ciprian N.; Titus, A. H.; Cartwright, A. N.; Bednarek, D. R.; Rudin, S.

    2012-03-01

We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same and the operation on one does not depend upon the result from the operation on another, allowing the entire image to be processed in parallel. GPU hardware was developed for this kind of massively parallel processing implementation, so for an algorithm with a high degree of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded version of CAPIDS is presented to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure, and automatic image windowing and leveling during each frame.
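The CAPIDS kernels themselves are not public; the sketch below only illustrates what "pixel independent" means for two of the listed operations (flat-field correction and windowing/leveling), using numpy array arithmetic as a stand-in for one GPU thread per pixel. All names and values are hypothetical:

```python
import numpy as np

def flat_field_correct(raw, dark, flat):
    """Pixel-independent flat-field correction: each output pixel depends only
    on the same pixel of raw, dark, and flat, so on a GPU every pixel can be
    handled by its own thread."""
    gain = np.mean(flat - dark) / np.maximum(flat - dark, 1e-6)
    return (raw - dark) * gain

def window_level(img, window, level):
    """Map [level - window/2, level + window/2] to the 8-bit display range."""
    lo = level - window / 2.0
    return np.clip((img - lo) / window * 255.0, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
dark = rng.normal(10, 1, (4, 4))            # detector offset image
flat = dark + rng.normal(100, 5, (4, 4))    # flat exposure with gain variation
truth = np.full((4, 4), 50.0)               # uniform object
raw = dark + truth * (flat - dark) / 100.0  # simulated detector response

corrected = flat_field_correct(raw, dark, flat)
disp = window_level(corrected, window=20.0, level=float(corrected.mean()))
print(corrected.std() < 1e-9, disp.dtype)   # gain nonuniformity removed
```

Because no pixel reads a neighbour's result, the same arithmetic maps directly onto a CUDA/OpenCL kernel with one thread per pixel, which is why these steps benefit so strongly from the GPU port.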

  14. Graphics Processing Unit (GPU) implementation of image processing algorithms to improve system performance of the Control, Acquisition, Processing, and Image Display System (CAPIDS) of the Micro-Angiographic Fluoroscope (MAF).

    PubMed

    Vasan, S N Swetadri; Ionita, Ciprian N; Titus, A H; Cartwright, A N; Bednarek, D R; Rudin, S

    2012-02-23

We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same and the operation on one does not depend upon the result from the operation on another, allowing the entire image to be processed in parallel. GPU hardware was developed for this kind of massively parallel processing implementation, so for an algorithm with a high degree of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded version of CAPIDS is presented to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure, and automatic image windowing and leveling during each frame.

  15. Defense Systems Management Review. Volume 2, Number 4, Autumn 1979. Defense Acquisition: The Process and the Problems.

    DTIC Science & Technology

    1979-01-01

acquisition policies, organizational structures, and management philosophies. Past successes and failures must be considered in selecting the methodology ...hibited because a contractor may not come forward with company secrets if there is a danger of technical transfusion in a multiple award competitive...Some companies are just plain better than others are. I don’t know what we can do about it. The real world competitive environment does not lean

  16. Defense Systems Management Review. Volume 3, Number 3, Summer 1980. Maturing of the DoD Acquisition Process.

    DTIC Science & Technology

    1980-01-01

economic equity among the participants. Dr. Walter B. LaBerge, former Assistant Secretary General for Defense Support, NATO, and now Deputy Under...1977. 34. Thomas A. Callaghan, President, Export-Import Technology, Inc., Washington, D.C. 35. Dr. Walter B. LaBerge, "A Concept of a Two-Way Street"...one single activity - the production and acquisition. 1. Steven Rosen, Testing the Theory of the Military-Industrial Complex (Lexington, Mass.: D. C

  17. Collection Methods and Laboratory Processing of Samples from Donnelly Training Area Firing Points, Alaska, 2003

    DTIC Science & Technology

    2005-03-01

    centration and variance in 10-g subsamples from 500-g samples of Ottawa sand spiked with either a fiber of M1 propellant or grains of SARM 2,4-DNT...Material ( SARM ) 2,4-DNT. Each spiked sample was ground on the ring mill for 60 s and twelve 10-g subsamples taken for analysis. Then the remainder of the...totaling less than 1 mg) of 2,4-DNT (Standard Analytical Refer- ence Material [ SARM ]) to another 500 g of Ottawa sand and processed the sand in the

  18. Collection Methods and Laboratory Processing of Samples From Donnelly Training Area Firing Points, Alaska, 2003

    DTIC Science & Technology

    2005-03-01

    centration and variance in 10-g subsamples from 500-g samples of Ottawa sand spiked with either a fiber of M1 propellant or grains of SARM 2,4-DNT...Material ( SARM ) 2,4-DNT. Each spiked sample was ground on the ring mill for 60 s and twelve 10-g subsamples taken for analysis. Then the remainder of the...totaling less than 1 mg) of 2,4-DNT (Standard Analytical Refer- ence Material [ SARM ]) to another 500 g of Ottawa sand and processed the sand in the

  19. The influence of type and token frequency on the acquisition of affixation patterns: implications for language processing.

    PubMed

    Endress, Ansgar D; Hauser, Marc D

    2011-01-01

    Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number of exception tokens relative to the number of regular tokens). We familiarized participants to either a prefixation pattern (where regulars started with /ZaI/ and exceptions ended with /ZaI/) or a suffixation pattern (where regulars ended with /ZaI/ and exceptions started with /ZaI/). We show that the type and the token frequency of regular items and exceptions influence in different ways what participants can learn. For the exceptions to be learned, they have to occur sufficiently often so that participants can memorize them; this can be achieved by a high token frequency. However, a high token frequency of the exceptions also impaired the acquisition of the regular pattern. In contrast, the type frequency of the patterns seemed to determine whether the regular pattern could be learned: When the type frequency of the regular items was sufficiently high, participants successfully learned the regular pattern even when the exceptions were played so often that 66% of the familiarization items were exceptions. We discuss these findings in the context of general learning mechanisms and the role they may play in language acquisition.

  20. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    PubMed

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.

  1. Calculating of river water quality sampling frequency by the analytic hierarchy process (AHP).

    PubMed

    Do, Huu Tuan; Lo, Shang-Lien; Phan Thi, Lan Anh

    2013-01-01

    River water quality sampling frequency is an important aspect of the river water quality monitoring network. A suitable sampling frequency for each station as well as for the whole network will provide a measure of the real water quality status for the water quality managers as well as the decision makers. The analytic hierarchy process (AHP) is an effective method for decision analysis and calculation of weighting factors based on multiple criteria to solve complicated problems. This study introduces a new procedure to design river water quality sampling frequency by applying the AHP. We introduce and combine weighting factors of variables with the relative weights of stations to select the sampling frequency for each station, monthly and yearly. The new procedure was applied for Jingmei and Xindian rivers, Taipei, Taiwan. The results showed that sampling frequency should be increased at high weighted stations while decreased at low weighted stations. In addition, a detailed monitoring plan for each station and each month could be scheduled from the output results. Finally, the study showed that the AHP is a suitable method to design a system for sampling frequency as it could combine multiple weights and multiple levels for stations and variables to calculate a final weight for stations, variables, and months.
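The core AHP calculation the study relies on, deriving weights from a pairwise comparison matrix via its principal eigenvector and checking consistency, can be sketched as follows; the comparison matrix is an invented example, not data from the study:

```python
import numpy as np

# Pairwise comparison matrix for three criteria (illustrative values on
# Saaty's 1-9 scale): A[i, j] = importance of criterion i relative to j,
# with A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# The normalized principal eigenvector of A gives the AHP weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI,
# with random index RI = 0.58 for n = 3; CR < 0.1 is conventionally acceptable.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(np.round(w, 3), round(cr, 3))
```

In the study's procedure, weights like these are computed for variables and combined with station weights to rank where sampling frequency should be raised or lowered.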

  2. Modeling and analyzing respondent-driven sampling as a counting process.

    PubMed

    Berchenko, Yakir; Rosenblatt, Jonathan D; Frost, Simon D W

    2017-03-03

    Respondent-driven sampling (RDS) is an approach to sampling design and analysis which utilizes the networks of social relationships that connect members of the target population, using chain-referral. RDS sampling will typically oversample participants with many acquaintances. Naïve estimators, such as the sample average, will thus be biased towards the state of the most highly connected individuals. Current methodology cannot estimate population size from RDS, and promotes inverse probability weighted estimators for population parameters such as HIV prevalence. We propose to use the timing of recruitment, typically collected and discarded, in order to estimate the population size via a counting process model. Once population size and degree frequencies are made available, prevalence can be debiased in a post-stratified framework. We adapt methods developed for inference in epidemiology and software reliability to estimate the population size, degree counts and frequencies. A fundamental advantage of our approach is that it makes the assumptions of the sampling design explicit. This enables verification of the assumptions, maximum likelihood estimation, extension with covariates, and model selection. We develop large-sample theory, proving consistency and asymptotic normality. We further compare our estimators to other estimators in the RDS literature, through simulation and real-world data. In both cases, we find our estimators to outperform current methods. The likelihood problem in the model we present is separable, and thus efficiently solvable. We implement these estimators in an accompanying R package, chords, available on CRAN.
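The degree bias described above, and how inverse-probability weighting corrects it, can be shown with a toy simulation. This approximates RDS inclusion probabilities as proportional to degree and uses an inverse-degree (Volz-Heckathorn-style) weighted mean; it is not the counting-process estimator of the paper, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
degree = rng.integers(1, 51, N)              # acquaintance counts, 1..50
# trait is more common among highly connected individuals
trait = rng.random(N) < (0.1 + 0.008 * degree)
true_prev = trait.mean()

# RDS approximation: probability of being recruited proportional to degree
p = degree / degree.sum()
idx = rng.choice(N, size=2000, replace=True, p=p)

naive = trait[idx].mean()                    # biased toward high-degree nodes
w = 1.0 / degree[idx]                        # inverse-probability weights
vh = np.sum(w * trait[idx]) / np.sum(w)      # debiased weighted prevalence

print(abs(vh - true_prev) < abs(naive - true_prev))  # weighting reduces bias
```

The naive sample mean overstates prevalence because high-degree individuals are both oversampled and more likely to carry the trait; reweighting by inverse degree recovers an approximately unbiased estimate, which is the bias the paper's counting-process model addresses while also estimating population size.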

  3. Rotorcraft flight dynamics and control in wind for autonomous sampling of spatiotemporal processes

    NASA Astrophysics Data System (ADS)

    Sydney, Nitin

In recent years, there has been significant effort put into the design and use of small, autonomous, multi-agent aerial teams for a variety of military and commercial applications. In particular, small multi-rotor systems have been shown to be especially useful for carrying sensors, as they can rapidly transit between locations as well as hover in place. This dissertation seeks to use multi-agent teams of autonomous rotorcraft to sample spatiotemporal fields in windy conditions. For many sampling objectives, the question is how to accomplish the objective in the presence of strong wind fields caused by external sources or by other rotorcraft flying in close proximity. This dissertation develops several flight control strategies for both wind compensation, using nonlinear control techniques, and wind avoidance, using artificial potential-based control. To showcase the utility of teams of unmanned rotorcraft for spatiotemporal sampling, optimal algorithms are developed for two sampling objectives: (1) sampling continuous spatiotemporal fields modeled as Gaussian processes, and (2) optimal motion planning for coordinated target detection, an example of a discrete spatiotemporal field. All algorithms are tested in simulation and several are tested in a motion capture based experimental testbed.

  4. A Study to Validate a Sample Set of Questions and the General Approach to Their Development for an Army Systems Acquisition Review Council (ASARC) 3 System

    DTIC Science & Technology

    1984-01-01

him in this role. An ad hoc working group (AHWG) is formed 10 to 12 months prior to an ASARC to review the status of the system undergoing review and ... identify with confidence the MPT questions which need to be addressed to provide a clear understanding of the MPT status regarding PLRS.

  5. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 6 QUALIFICATION SAMPLE

    SciTech Connect

    Click, D.; Jones, M.; Edwards, T.

    2010-06-09

For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples.1 DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem (CC) Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). In addition to the CC method confirmation, the DWPF lab's mercury (Hg) digestion method was also evaluated for applicability to SB6 (see DWPF procedure 'Mercury System Operating Manual', Manual: SW4-15.204, Section 6.1, Revision 5, Effective date: 12-04-03). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 6 (SB6) SRAT Receipt and SB6 SRAT Product samples. For validation of the DWPF lab's Hg method, only SRAT receipt material was used and compared to AR digestion results. The SB6 SRAT Receipt and SB6 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB6 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 5 (SB5), to form the SB6 Blend composition. In addition to the 16 elements currently measured by the DWPF, this report includes Hg and thorium (Th) data (Th comprising ~2.5-3 wt % of the total solids in SRAT Receipt and SRAT Product, respectively) and provides specific details of ICP-AES analysis of Th.
Thorium was found to interfere with the U 367.007 nm emission line, and an inter-element correction (IEC) had to be applied to U data, which is also

  6. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets the protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which validated the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  7. Visibility graph analysis for re-sampled time series from auto-regressive stochastic processes

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Zou, Yong; Zhou, Jie; Gao, Zhong-Ke; Guan, Shuguang

    2017-01-01

Visibility graph (VG) and horizontal visibility graph (HVG) methods play a crucial role in modern complex network approaches to nonlinear time series analysis. However, the exponents of the presumably exponential degree distributions, which depend on the underlying dynamic processes, remain to be characterized. It has recently been conjectured that there is a critical value of the exponent, λc = ln(3/2), which separates chaotic from correlated stochastic processes. Here, we systematically apply (H)VG analysis to time series from autoregressive (AR) models, which confirms the hypothesis that an increased correlation length results in larger values of λ > λc. On the other hand, we numerically find a regime of negatively correlated process increments where λ < λc, which is in contrast to this hypothesis. Furthermore, by constructing graphs based on re-sampled time series, we find that network measures show non-trivial dependencies on the autocorrelation functions of the processes. We propose to choose the decorrelation time as the maximal re-sampling delay for the algorithm. Our results are detailed for time series from AR(1) and AR(2) processes.
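The HVG construction described in the abstract can be sketched directly. The following is a minimal illustration (not the authors' code) that builds the horizontal visibility graph of an i.i.d. series, the a = 0 limit of an AR(1) process, and fits the degree-distribution exponent, which should sit near the conjectured λc = ln(3/2) ≈ 0.405:

```python
import numpy as np

def hvg_degrees(x):
    """Horizontal visibility graph: nodes i and j are linked iff every
    point strictly between them lies below both endpoints."""
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

def fit_lambda(deg):
    """Estimate lambda in P(k) ~ exp(-lambda * k) from log-counts."""
    ks, counts = np.unique(deg, return_counts=True)
    keep = (ks >= 2) & (counts > 1)        # drop boundary nodes and sparse tail bins
    slope, _ = np.polyfit(ks[keep], np.log(counts[keep]), 1)
    return -slope

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)              # AR(1) with a = 0: uncorrelated noise
lam = fit_lambda(hvg_degrees(x))
# For i.i.d. noise the estimate should fall near ln(3/2) ~ 0.405
```

For a correlated AR(1) series one would replace `x` with a recursion `x[t] = a * x[t-1] + noise`; per the abstract, increasing the correlation length pushes the fitted λ above λc.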

  8. An Analysis of Army Rapid Acquisition

    DTIC Science & Technology

    2015-09-01

The indexed text consists of front-matter fragments: table-of-contents entries ("Value Center Six: Life Cycle Costs"; "Value Center Seven: Lessons Learned") and acronym-list entries, including ANPP (Apple new product process), AOA (analysis of alternatives), APB (acquisition program baseline), APC (acquisition program candidate), APO (Army Project Office), APUC (average procurement unit cost), AS (acquisition strategy), and ASOM (acquisition system operating model).

  9. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis, is developed and characterized. Using lifting-gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically, followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.

  10. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    NASA Astrophysics Data System (ADS)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotech products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore, intelligent sampling strategies are required to improve the scanning efficiency for measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then adaptively selected by determining the position among the candidates that is most likely to lie outside the required tolerance zone, and it is inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structure surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency in measuring large-area structured surfaces.
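A minimal sketch of the sampling loop described above, under assumptions not given in the abstract (a hypothetical 1-D surface profile, a squared-exponential kernel, and an illustrative tolerance half-width of 0.04): each iteration places the next sample where the GP posterior most strongly suggests the surface lies outside the tolerance zone.

```python
import numpy as np

def rbf(a, b, ls=0.1):
    """Squared-exponential covariance between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(xs, ys, xq, noise=1e-6):
    """Posterior mean and standard deviation of a zero-mean GP at xq."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(xq, xs)
    mu = Ks @ np.linalg.solve(K, ys)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.clip(var, 0.0, None))

surface = lambda x: 0.05 * np.sin(8 * np.pi * x)   # stand-in surface profile
tol = 0.04                                          # assumed tolerance half-width

xq = np.linspace(0.0, 1.0, 400)                     # candidate positions
xs = np.array([0.0, 0.5, 1.0])                      # coarse seed measurements
for _ in range(12):
    mu, sd = gp_posterior(xs, surface(xs), xq)
    score = np.abs(mu) + 2.0 * sd - tol             # likely outside tolerance?
    nxt = xq[np.argmax(score)]
    if np.min(np.abs(xs - nxt)) < 1e-9:             # candidate already measured
        break
    xs = np.append(xs, nxt)                         # measure it, update the model
mu, sd = gp_posterior(xs, surface(xs), xq)          # final posterior
```

The score combines posterior deviation and uncertainty, so early iterations fill coverage gaps (high `sd`) while later ones concentrate on regions that plausibly violate the tolerance.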

  11. Microwave Processing for Sample Preparation to Evaluate Mitochondrial Ultrastructural Damage in Hemorrhagic Shock

    NASA Astrophysics Data System (ADS)

    Josephsen, Gary D.; Josephsen, Kelly A.; Beilman, Greg J.; Taylor, Jodie H.; Muiler, Kristine E.

    2005-12-01

    This is a report of the adaptation of microwave processing in the preparation of liver biopsies for transmission electron microscopy (TEM) to examine ultrastructural damage of mitochondria in the setting of metabolic stress. Hemorrhagic shock was induced in pigs via 35% total blood volume bleed and a 90-min period of shock followed by resuscitation. Hepatic biopsies were collected before shock and after resuscitation. Following collection, biopsies were processed for TEM by a rapid method involving microwave irradiation (Giberson, 2001). Samples pre- and postshock of each of two animals were viewed and scored using the mitochondrial ultrastructure scoring system (Crouser et al., 2002), a system used to quantify the severity of ultrastructural damage during shock. Results showed evidence of increased ultrastructural damage in the postshock samples, which scored 4.00 and 3.42, versus their preshock controls, which scored 1.18 and 1.27. The results of this analysis were similar to those obtained in another model of shock (Crouser et al., 2002). However, the amount of time used to process the samples was significantly shortened with methods involving microwave irradiation.

  12. The impact of fecal sample processing on prevalence estimates for antibiotic-resistant Escherichia coli.

    PubMed

    Omulo, Sylvia; Lofgren, Eric T; Mugoh, Maina; Alando, Moshe; Obiya, Joshua; Kipyegon, Korir; Kikwai, Gilbert; Gumbi, Wilson; Kariuki, Samuel; Call, Douglas R

    2017-05-01

Investigators often rely on studies of Escherichia coli to characterize the burden of antibiotic resistance in a clinical or community setting. To determine if prevalence estimates for antibiotic resistance are sensitive to sample handling and interpretive criteria, we collected presumptive E. coli isolates (24 or 95 per stool sample) from a community in an urban informal settlement in Kenya. Isolates were tested for susceptibility to nine antibiotics using agar breakpoint assays and results were analyzed using generalized linear mixed models. We observed a <3-fold difference between prevalence estimates based on freshly isolated bacteria when compared to isolates collected from unprocessed fecal samples or fecal slurries that had been stored at 4°C for up to 7 days. No time-dependence was evident (P>0.1). Prevalence estimates did not differ for five distinct E. coli colony morphologies on MacConkey agar plates (P>0.2). Successive re-plating of samples for up to five consecutive days had little to no impact on prevalence estimates. Finally, culturing E. coli under different conditions (with 5% CO2 or micro-aerobic) did not affect estimates of prevalence. For the conditions tested in these experiments, minor modifications in sample processing protocols are unlikely to bias estimates of the prevalence of antibiotic resistance for fecal E. coli.

  13. Microstructural evolution in multiseeded YBCO bulk samples grown by the TSMG process

    NASA Astrophysics Data System (ADS)

    Goodfellow, A.; Shi, Y.-H.; Durrell, J. H.; Dennis, A. R.; Cardwell, D. A.; Grovenor, C. R. M.; Speller, S. C.

    2016-11-01

Superconducting single-grain YBCO bulk samples with the ability to trap high magnetic fields can be grown using the top-seeded melt-growth process. Multiseeding techniques have the potential to enable larger diameter bulks to be grown, but the performance of these materials is not yet comparable to the single-seeded bulks. Here we carry out detailed three-dimensional microstructural characterisation on a multiseeded sample grown with the seeds aligned in the 0°-0° geometry using high resolution microanalysis techniques. Chemical and structural variations have been correlated with the trapped field distribution in three separate slices of the sample. The top slice of the sample shows four peaks in trapped field, indicating that the current flows in four separate loops rather than in one large loop within the sample. This has been explained by the build-up of insulating Y-211 particles where the growth fronts from the two seeds meet, forming a barrier to current flow, as well as the low Y-211 content (and hence low Jc) of the large c-axis growth sector.

  14. Description of the ACCESS data acquisition system

    SciTech Connect

    Treichel, B.A.; Koehl, E.R.

    1983-09-01

    The ACCESS data acquisition system is designed to acquire, process, and store samples of analog data produced by the Reversing Flow Test Apparatus. Data acquisition requires minimal interaction with the user, being governed primarily by trigger pulses generated in a shaft encoder coupled to a camshaft in the apparatus. A complete scan of 32 input data channels is made 18 times per camshaft revolution at speeds up to 30 revolutions per second. At higher speeds, up to the maximum apparatus speed of 50 revolutions per second, data channels are scanned 9 times per revolution. The programs for this data acquisition system are written for the Hewlett-Packard MCU 2250 measurement and control processor, operating in conjunction with an HP-1000 system computer.
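The stated trigger scheme fixes the aggregate sample rates directly; a small sketch of the arithmetic (an assumed encoding for illustration, not the MCU 2250 software):

```python
def scans_per_rev(rev_per_s):
    """ACCESS trigger scheme: 18 scans per camshaft revolution up to
    30 rev/s, halved to 9 scans per revolution at higher speeds."""
    return 18 if rev_per_s <= 30 else 9

def sample_rate(rev_per_s, channels=32):
    """Aggregate samples per second across all input data channels."""
    return scans_per_rev(rev_per_s) * rev_per_s * channels

print(sample_rate(30))  # 17280 samples/s at the 18-scan limit
print(sample_rate(50))  # 14400 samples/s at maximum apparatus speed
```

Note the halving keeps the peak aggregate rate at the 30 rev/s boundary (17,280 samples/s) above the rate at top speed (14,400 samples/s), which is presumably why the scan density is reduced there.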

  15. Evaluation of SRAT Sampling Data in Support of a Six Sigma Yellow Belt Process Improvement Project

    SciTech Connect

    Edwards, Thomas B.

    2005-06-01

As part of the Six Sigma continuous improvement initiatives at the Defense Waste Processing Facility (DWPF), a Yellow Belt team was formed to evaluate the frequency and types of samples required for the Sludge Receipt and Adjustment Tank (SRAT) receipt in the DWPF. The team asked, via a technical task request, that the Statistical Consulting Section (SCS), in concert with the Immobilization Technology Section (ITS) (both groups within the Savannah River National Laboratory (SRNL)), conduct a statistical review of recent SRAT receipt results to determine if there is enough consistency in these measurements to allow for less frequent sampling. As part of this review process, key decisions made by DWPF Process Engineering that are based upon the SRAT sample measurements are outlined in this report. For a reduction in SRAT sampling to be viable, these decisions must not be overly sensitive to the additional variation that will be introduced as a result of such a reduction. Measurements from samples of SRAT receipt batches 314 through 323 were reviewed as part of this investigation into the frequency of SRAT sampling. The associated acid calculations for these batches were also studied as part of this effort. The results from this investigation showed no indication of a statistically significant relationship between the tank solids and the acid additions for these batches. One would expect that as the tank solids increase there would be a corresponding increase in acid requirements. There was, however, an indication that the predicted reduction/oxidation (REDOX) ratio (the ratio of Fe²⁺ to the total Fe in the glass product) that was targeted by the acid calculations based on the SRAT receipt samples for these batches was on average 0.0253 larger than the predicted REDOX based upon Slurry Mix Evaporator (SME) measurements. This is a statistically significant difference (at the 5% significance level), and the study also suggested that the difference was due to

  16. Assessment of toxic metals in raw and processed milk samples using electrothermal atomic absorption spectrophotometer.

    PubMed

    Kazi, Tasneem Gul; Jalbani, Nusrat; Baig, Jameel Ahmed; Kandhro, Ghulam Abbas; Afridi, Hassan Imran; Arain, Mohammad Balal; Jamali, Mohammad Khan; Shah, Abdul Qadir

    2009-09-01

Milk and dairy products have been recognized all over the world for their beneficial influence on human health. The levels of toxic metals (TMs) are an important component of the safety and quality of milk. A simple and efficient microwave-assisted extraction (MAE) method has been developed for the determination of TMs (Al, Cd, Ni and Pb) in raw and processed milk samples. A Plackett-Burman experimental design and a 2³ + star central composite design were applied in order to determine the optimum conditions for MAE. Concentrations of TMs were measured by electrothermal atomic absorption spectrometry. The accuracy of the optimized procedure was evaluated by the standard addition method and a conventional wet acid digestion method (CDM), for comparative purposes. No significant differences were observed (P>0.05) when comparing the values obtained by the proposed MAE method and CDM (paired t-test). The average relative standard deviation of the MAE method varied between 4.3% and 7.6% depending on the analyte (n=6). The proposed method was successfully applied to the determination of the studied TMs in milk samples. The results for raw and processed milk indicated that environmental conditions and manufacturing processes play a key role in the distribution of toxic metals in raw and processed milk.

  17. Endophytic bacterial community of grapevine leaves influenced by sampling date and phytoplasma infection process

    PubMed Central

    2014-01-01

Background Endophytic bacteria benefit the host plant directly or indirectly, e.g. by biocontrol of pathogens. Up to now, their interactions with the host and with other microorganisms have been poorly understood. Consequently, a crucial step for improving the knowledge of those relationships is to determine whether pathogens or the plant growing season influence endophytic bacterial diversity and dynamics. Results Four healthy, four phytoplasma-diseased and four recovered (symptomatic plants that spontaneously regain a healthy condition) grapevine plants were sampled monthly from June to October 2010 in a vineyard in north-western Italy. Metagenomic DNA was extracted from sterilized leaves, and the endophytic bacterial community dynamics and diversity were analyzed by taxon-specific real-time PCR, Length-Heterogeneity PCR and genus-specific PCR. These analyses revealed that both sampling date and phytoplasma infection influenced the endophytic bacterial composition. Interestingly, in June, when the plants are symptomless and the pathogen is undetectable, (i) the endophytic bacterial community associated with diseased grapevines was different from those at the other sampling dates, when the phytoplasmas are detectable inside samples; and (ii) the microbial community associated with recovered plants differs from that living inside healthy and diseased plants. Interestingly, the LH-PCR database identified bacteria previously reported as biocontrol agents in the examined grapevines. Of these, the dynamics of Burkholderia, Methylobacterium and Pantoea were influenced by the phytoplasma infection process and seasonality. Conclusion Results indicated that endophytic bacterial community composition in grapevine is correlated to both phytoplasma infection and sampling date. For the first time, the data underlined that, in diseased plants, the pathogen infection process can decrease the impact of seasonality on community dynamics. Moreover, based on experimental evidence, it was reasonable to hypothesize that

  18. Data Acquisition for Modular Biometric Monitoring System

    NASA Technical Reports Server (NTRS)

    Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor); Grodsinsky, Carlos M. (Inventor)

    2014-01-01

    A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to collect data asynchronously, via the bus, from the memory of the plurality of data acquisition modules according to a relative fullness of the memory of the plurality of data acquisition modules.
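The fullness-driven collection scheme can be sketched as follows; this is a toy illustration of the idea in the abstract, with class names, buffer capacity, and burst size all assumed rather than taken from the patent:

```python
import random
from collections import deque

class AcquisitionModule:
    """One data-acquisition module: samples its input channel into local memory."""
    def __init__(self, rate_hz, capacity=64):
        self.rate_hz = rate_hz
        self.buffer = deque(maxlen=capacity)

    def sample(self, n):
        """Simulate acquiring n biometric samples from the input channel."""
        for _ in range(n):
            self.buffer.append(random.random())

    def fullness(self):
        """Relative fullness of this module's memory, 0.0 to 1.0."""
        return len(self.buffer) / self.buffer.maxlen

class CentralControl:
    """Drains module memories over a shared 'bus', fullest module first."""
    def __init__(self, modules):
        self.modules = modules
        self.collected = []

    def collect_round(self, burst=16):
        m = max(self.modules, key=lambda mod: mod.fullness())
        for _ in range(min(burst, len(m.buffer))):
            self.collected.append(m.buffer.popleft())

mods = [AcquisitionModule(rate_hz=r) for r in (50, 100, 200)]
for m, n in zip(mods, (10, 30, 60)):   # uneven fill, as with differing sample rates
    m.sample(n)
cc = CentralControl(mods)
cc.collect_round()                      # the 200 Hz module is fullest, so it is drained first
```

Prioritizing the fullest memory is what lets the central system collect asynchronously: faster channels are serviced more often without any fixed polling schedule.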

  19. Improvement of microtome cutting process of carbon nanotube composite sample preparation for TEM analysis

    NASA Astrophysics Data System (ADS)

    Trayner, Sarah

As research progresses towards nanoscale materials, there is a growing need for a more efficient and effective way to obtain ultra-thin samples for imaging under the transmission electron microscope (TEM) for atomic resolution analysis. There are various methods used to obtain thin samples (<50 nm in thickness). However, most of the resultant TEM images of soft materials, such as CNT/epoxy composites, are of poor quality due to sample cutting difficulties. Such poor-quality samples are characterized by uneven thickness, overlapping features, overall darkness due to large thickness, and defects such as cutting scratches. This research is a continuous effort to study and improve the ultra-microtome cutting technique to provide an effective and reliable approach to obtaining an ultra-thin (25-50 nm) cross section of a CNT/polymer composite for high resolution TEM analysis. Improvements were achieved by studying the relationships between the chosen cutting parameters, sample characteristics and TEM image quality. From this information, a cutting protocol was established so that ultra-thin sample slices can be obtained by different microtome operators for high resolution TEM analysis. In addition, a custom tool was created to aid in the sample collection process. In this research, three composite samples were studied for both microtome cutting and TEM analysis: 1) unidirectional (UD) IM7/BMI composite; 2) single-layer CNT buckypaper (BP)/epoxy nanocomposite; 3) 3-layer CNT BP/BMI nanocomposite. The resultant TEM images revealed a clear microstructure consisting of amorphous resin and graphite crystalline packing. The UD IM7/BMI composite TEM results did not reveal an interfacial region, indicating a need for even thinner sliced cross sections. TEM results for the single-layer CNT BP/epoxy nanocomposite revealed the alignment direction of the nanotubes and numerous stacks of CNT bundles.
In addition, there was visible flattening of CNT packing into dumbbell shapes

  20. Seabed observation & sampling system

    USGS Publications Warehouse

    Blackwood, D.; Parolski, K.

    2001-01-01

    SEABOSS has proved to be a valuable addition to the USGS data-acquisition and processing field program. It has allowed researchers to collect high-quality images and seabed samples in a timely manner. It is a simple, dependable and trouble-free system with a track record of over 3,000 deployments. When used as part of the USGS seafloor mapping acquisition, processing, and ground-truth program, SEABOSS has been invaluable in providing information quickly and efficiently, with a minimum of downtime. SEABOSS enables scientists to collect high-quality images and samples of the seabed, essential to the study of sedimentary environments and biological habitats and to the interpretation of side-scan sonar and multibeam imagery, the most common tools for mapping the seabed.

  1. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    NASA Technical Reports Server (NTRS)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.
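The RIU's local filtering step can be illustrated with a generic causal FIR filter; the 4-tap moving average below is an assumed stand-in, since the abstract does not give the actual filter design:

```python
import numpy as np

def fir(x, taps):
    """Causal FIR filter: y[n] = sum_k taps[k] * x[n-k], same length as x."""
    return np.convolve(x, taps)[: len(x)]

# 4-tap moving average as a stand-in for the RIU's local smoothing
# before sensor data is placed on the 1553 bus for the host.
taps = np.ones(4) / 4
raw = np.array([0.0, 4.0, 0.0, 4.0, 0.0, 4.0, 0.0, 4.0])
smoothed = fir(raw, taps)   # steady-state output settles at the mean, 2.0
```

Doing this filtering on the remote unit's DSP reduces both the bus traffic and the host's per-cycle workload, which is the point of the architecture described above.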

  2. 40 CFR Table 9 to Subpart Hhhhhhh... - Procedures for Conducting Sampling of Stripped Resin and Process Wastewater

    Code of Federal Regulations, 2014 CFR

    2014-07-01

Table 9 to Subpart HHHHHHH of Part 63 (Protection of Environment) is organized into columns headed "For demonstrating . . .", "For the following emission points and types of processes . . .", and "Collect . . .". The indexed fragment includes the initial-compliance row: for each process wastewater stream, collect 1 grab sample.

  3. 40 CFR Table 9 to Subpart Hhhhhhh... - Procedures for Conducting Sampling of Stripped Resin and Process Wastewater

    Code of Federal Regulations, 2013 CFR

    2013-07-01

Table 9 to Subpart HHHHHHH of Part 63 (Protection of Environment) is organized into columns headed "For demonstrating . . .", "For the following emission points and types of processes . . .", and "Collect . . .". The indexed fragment includes the initial-compliance row: for each process wastewater stream, collect 1 grab sample.

  4. Planetary protection, legal ambiguity and the decision making process for Mars sample return

    NASA Technical Reports Server (NTRS)

    Race, M. S.

    1996-01-01

As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large-scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions.

  5. Planetary protection, legal ambiguity and the decision making process for Mars sample return.

    PubMed

    Race, M S

    1996-01-01

As scientists and mission planners develop planetary protection requirements for future Mars sample return missions, they must recognize the socio-political context in which decisions about the mission will be made and pay careful attention to public concerns about potential back contamination of Earth. To the extent that planetary protection questions are unresolved or unaddressed at the time of an actual mission, they offer convenient footholds for public challenges in both legal and decision making realms, over which NASA will have little direct control. In this paper, two particular non-scientific areas of special concern are discussed in detail: 1) legal issues and 2) the decision making process. Understanding these areas is critical for addressing legitimate public concerns as well as for fulfilling procedural requirements regardless of whether sample return evokes public controversy. Legal issues with the potential to complicate future missions include: procedural review under the National Environmental Policy Act (NEPA); uncertainty about institutional control and authority; conflicting regulations and overlapping jurisdictions; questions about international treaty obligations and large-scale impacts; uncertainties about the nature of the organism; and constitutional and regulatory concerns about quarantine, public health and safety. In light of these important legal issues, it is critical that NASA consider the role and timing of public involvement in the decision making process as a way of anticipating problem areas and preparing for legitimate public questions and challenges to sample return missions.

  6. A general theory of the sampling process with applications to the "veil line".

    PubMed

    Dewdney, A K

    1998-12-01

When a community of species is sampled, nonappearing species are not those with abundances that fall shy of some arbitrary mark, the "veil line" proposed by F. W. Preston in 1948 (Ecology 29, 254-283). Instead, they follow a hypergeometric distribution, which has no resemblance to the veil line. There is therefore no justification for the truncation of distributions proposed to describe the abundances of species in natural communities. The mistake of the veil line points to the need for a general theory of sampling. If a community has a distribution g of species abundances and if samples taken of the community tend to follow distribution f, what is the relationship of f to g? The seeds of such a theory are available in the work of E. C. Pielou. Using the Poisson distribution as a close approximation to the hypergeometric, one may immediately write and (in most cases) solve the transformation from g to f. The transformation appears to preserve distribution formulas to within constants and parameters, providing yet another reason to rule out the use of truncation. Well beyond this application, the theory provides a foundation for rethinking the sampling process and its implications for ecology.
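The Poisson-approximation argument can be illustrated numerically. In this sketch (the lognormal community distribution and the sampling fraction are assumed for illustration, not taken from the paper), species non-appearance falls off smoothly with abundance as exp(-n·f) rather than switching off at any sharp veil-line threshold:

```python
import numpy as np

rng = np.random.default_rng(1)
S = 2000
abund = rng.lognormal(mean=3.0, sigma=1.5, size=S)   # community abundances, g
frac = 0.01                                          # fraction of individuals sampled

# Poisson approximation to hypergeometric sampling of individuals:
# each species with abundance n contributes Poisson(n * frac) individuals.
counts = rng.poisson(abund * frac)
missed = counts == 0                                 # species absent from the sample

# Non-appearance probability is exp(-n * frac): smooth in n, so some
# fairly abundant species are still missed while some rare ones appear.
print(f"unseen fraction: {missed.mean():.2f}; "
      f"most abundant unseen species: n = {abund[missed].max():.0f}")
```

Because the miss probability is a smooth function of abundance, truncating the fitted abundance distribution at a fixed "veil" abundance misrepresents the sample, which is the paper's central point.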

  7. Process and apparatus for obtaining samples of liquid and gas from soil

    DOEpatents

    Rossabi, Joseph; May, Christopher P.; Pemberton, Bradley E.; Shinn, Jim; Sprague, Keith

    1999-01-01

An apparatus and process for obtaining samples of liquid and gas from subsurface soil is provided, having a filter zone adjacent to an external expander ring. The expander ring creates a void within the soil substrate which encourages the accumulation of soil-borne fluids. The fluids migrate along a pressure gradient through a plurality of filters before entering a first chamber. A one-way valve regulates the flow of fluid into a second chamber in further communication with a collection tube through which samples are collected at the surface. A second one-way valve having a reverse flow provides additional communication between the chambers for the pressurized cleaning and back-flushing of the apparatus.

  8. Chemical properties of hydroxyapatite deposited through electrophoretic process on different sandblasted samples

    NASA Astrophysics Data System (ADS)

    Gradinariu, Irina; Stirbu, Ioan; Gheorghe, Cristina Angela; Cimpoesu, Nicanor; Agop, Maricel; Cimpoesu, Ramona; Popa, Cristina

    2014-12-01

An implantable material based on titanium (Ti6Al4V) was sandblasted prior to deposition of a thin film of hydroxyapatite. Two samples of the alloy, in the shape of bars 10 mm in diameter and 20 mm in length, were subjected to mechanical treatment. After deposition of the hydroxyapatite through the electrophoresis process, the samples were analyzed by scanning electron microscopy. The nature and chemical properties of the thin films formed on the Ti-based substrate were investigated with electrochemical impedance spectroscopy, based on the extremely high polarization resistance of the material. The results revealed the formation of a homogeneous layer on the surface of the metallic substrate. The layer, composed of TiO2 and hydroxyapatite, provided high corrosion protection.

  9. Process and apparatus for obtaining samples of liquid and gas from soil

    DOEpatents

    Rossabi, J.; May, C.P.; Pemberton, B.E.; Shinn, J.; Sprague, K.

    1999-03-30

An apparatus and process for obtaining samples of liquid and gas from subsurface soil is provided, having a filter zone adjacent to an external expander ring. The expander ring creates a void within the soil substrate which encourages the accumulation of soil-borne fluids. The fluids migrate along a pressure gradient through a plurality of filters before entering a first chamber. A one-way valve regulates the flow of fluid into a second chamber in further communication with a collection tube through which samples are collected at the surface. A second one-way valve having a reverse flow provides additional communication between the chambers for the pressurized cleaning and back-flushing of the apparatus. 8 figs.

  10. Chemical process to separate iron oxides particles in pottery sample for EPR dating.

    PubMed

    Watanabe, S; Farias, T M B; Gennari, R F; Ferraz, G M; Kunzli, R; Chubaci, J F D

    2008-12-15

    Ancient potteries were usually made from local clay, which contains a relatively high concentration of iron. The powdered samples are usually quite black due to magnetite, and although they can be used for thermoluminescence (TL) dating, better TL readings are obtained from the clearest natural or pre-treated samples. In electron paramagnetic resonance (EPR) measurements, the huge signal from iron spin-spin interactions produces intense interference that overlaps any other signal in this range. The age is obtained by dividing the accumulated radiation dose, determined from the concentration of radiation-induced paramagnetic species, by the annual natural dose rate; as a consequence, EPR dating cannot be used when the iron signal dominates, since that signal does not depend on radiation dose. In some cases, density separation using a hydrated solution of sodium polytungstate [Na6(H2W12O40)·H2O] is useful. However, sodium polytungstate is very expensive in Brazil; hence an alternative method for eliminating this interference is proposed. A chemical process that eliminates about 90% of the magnetite was developed. A sample of powdered ancient pottery was treated in a 3:1:1 mixture of HCl, HNO3 and H2O2 for 4 h. It was then washed several times in distilled water to remove all acid residues. The originally black sample becomes somewhat clearer. The resulting material was analyzed by inductively coupled plasma mass spectrometry (ICP-MS), which showed that the iron content was reduced by a factor of about 9. In EPR measurements, a non-treated natural ceramic sample shows a broad spin-spin interaction signal, whereas the chemically treated sample presents a narrow signal in the g = 2.00 region, possibly due to a (SiO3)3- radical, mixed with the signal of the remaining iron [M. Ikeya, New Applications of Electron Spin Resonance, World Scientific, Singapore, 1993, p. 285]. This signal increases in intensity under gamma irradiation. However, still due to the iron influence, the additive method yielded too
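The dating principle described in this abstract (age equals accumulated dose divided by the annual dose rate) can be sketched in a few lines. The numbers below are illustrative placeholders, not values from the paper.

```python
# Sketch of the EPR/TL age equation: age = equivalent (accumulated) dose / annual dose rate.
# Doses are in gray (Gy); the annual dose rate comes from environmental radiation.

def epr_age_years(equivalent_dose_gy: float, annual_dose_rate_gy: float) -> float:
    """Age in years from the accumulated dose and the annual dose rate."""
    if annual_dose_rate_gy <= 0:
        raise ValueError("annual dose rate must be positive")
    return equivalent_dose_gy / annual_dose_rate_gy

# Example: a 6 Gy accumulated dose at 3 mGy/year gives an age of 2000 years.
age = epr_age_years(6.0, 0.003)
print(round(age))  # 2000
```

This also makes the paper's point concrete: if the measured EPR signal is dominated by iron, the numerator cannot be determined, so the division above is meaningless until the iron is removed.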

  11. Chemical process to separate iron oxides particles in pottery sample for EPR dating

    NASA Astrophysics Data System (ADS)

    Watanabe, S.; Farias, T. M. B.; Gennari, R. F.; Ferraz, G. M.; Kunzli, R.; Chubaci, J. F. D.

    2008-12-01

    Ancient potteries were usually made from local clay, which contains a relatively high concentration of iron. The powdered samples are usually quite black due to magnetite, and although they can be used for thermoluminescence (TL) dating, better TL readings are obtained from the clearest natural or pre-treated samples. In electron paramagnetic resonance (EPR) measurements, the huge signal from iron spin-spin interactions produces intense interference that overlaps any other signal in this range. The age is obtained by dividing the accumulated radiation dose, determined from the concentration of radiation-induced paramagnetic species, by the annual natural dose rate; as a consequence, EPR dating cannot be used when the iron signal dominates, since that signal does not depend on radiation dose. In some cases, density separation using a hydrated solution of sodium polytungstate [Na6(H2W12O40)·H2O] is useful. However, sodium polytungstate is very expensive in Brazil; hence an alternative method for eliminating this interference is proposed. A chemical process that eliminates about 90% of the magnetite was developed. A sample of powdered ancient pottery was treated in a 3:1:1 mixture of HCl, HNO3 and H2O2 for 4 h. It was then washed several times in distilled water to remove all acid residues. The originally black sample becomes somewhat clearer. The resulting material was analyzed by inductively coupled plasma mass spectrometry (ICP-MS), which showed that the iron content was reduced by a factor of about 9. In EPR measurements, a non-treated natural ceramic sample shows a broad spin-spin interaction signal, whereas the chemically treated sample presents a narrow signal in the g = 2.00 region, possibly due to a (SiO3)3- radical, mixed with the signal of the remaining iron [M. Ikeya, New Applications of Electron Spin Resonance, World Scientific, Singapore, 1993, p. 285]. This signal increases in intensity under γ-irradiation. However, still due to the iron influence, the additive method yielded too

  12. L. monocytogenes in a cheese processing facility: Learning from contamination scenarios over three years of sampling.

    PubMed

    Rückerl, I; Muhterem-Uyar, M; Muri-Klinger, S; Wagner, K-H; Wagner, M; Stessl, B

    2014-10-17

    The aim of this study was to analyze the changing patterns of Listeria monocytogenes contamination in a cheese processing facility manufacturing a wide range of ready-to-eat products. Characterization of L. monocytogenes isolates included genotyping by pulsed-field gel electrophoresis (PFGE) and multi-locus sequence typing (MLST). Disinfectant-susceptibility tests and an assessment of L. monocytogenes survival in fresh cheese were also conducted. During the sampling period between 2010 and 2013, a total of 1284 environmental samples were investigated. Overall occurrence rates of Listeria spp. and L. monocytogenes were 21.9% and 19.5%, respectively. Identical L. monocytogenes genotypes were found in the food processing environment (FPE), in raw materials and in products. Interventions after the sampling events changed the contamination scenarios substantially. The high diversity of globally distributed L. monocytogenes genotypes was reduced by identifying the major sources of contamination. Although susceptible to a broad range of disinfectants and cleaners, one dominant L. monocytogenes sequence type (ST) 5 could not be eradicated from drains and floors. Notably, intense humidity and steam were observed in all rooms, and water residues were visible on floors due to the intensified cleaning regime. This could explain the high L. monocytogenes contamination of the FPE (drains, shoes and floors) throughout the study (15.8%). The outcome of a challenge experiment in fresh cheese showed that L. monocytogenes could survive after 14 days of storage at insufficient cooling temperatures (8 and 16°C). All efforts to reduce L. monocytogenes environmental contamination eventually led to a transition from dynamic to stable contamination scenarios. Consequently, implementation of systematic environmental monitoring via in-house systems should either aim for total avoidance of FPE colonization, or emphasize a first reduction of L. monocytogenes to sites where

  13. Sample Processing Impacts the Viability and Cultivability of the Sponge Microbiome

    PubMed Central

    Esteves, Ana I. S.; Amer, Nimra; Nguyen, Mary; Thomas, Torsten

    2016-01-01

    Sponges host complex microbial communities of recognized ecological and biotechnological importance. Extensive cultivation efforts have been made to isolate sponge bacteria, but most still elude cultivation. To identify the bottlenecks of sponge bacterial cultivation, we combined high-throughput 16S rRNA gene sequencing with a variety of cultivation media and incubation conditions. We aimed to determine the extent to which sample processing and cultivation conditions can impact bacterial viability and recovery in culture. We isolated 325 sponge bacteria from six specimens of Cymbastela concentrica and three specimens of Scopalina sp. These isolates were distributed over 37 different genera and 47 operational taxonomic units (defined at 97% 16S rRNA gene sequence identity). The cultivable bacterial community was highly specific to its sponge host and different media compositions yielded distinct microbial isolates. Around 97% of the isolates could be detected in the original sponge and represented a large but highly variable proportion (0.5–92% total abundance, depending on sponge species) of viable bacteria obtained after sample processing, as determined by propidium monoazide selective DNA modification of compromised cells. Our results show that the most abundant viable bacteria are also the most predominant groups found in cultivation, reflecting, to some extent, the relative abundances of the viable bacterial community, rather than the overall community estimated by direct molecular approaches. Cultivation is therefore shaped not only by the growth conditions provided, but also by the different cell viabilities of the bacteria that constitute the cultivation inoculum. These observations highlight the need to perform experiments to assess each method of sample processing for its accurate representation of the actual in situ bacterial community and its yield of viable cells. PMID:27242673

  14. Post-acquisition data processing for the screening of transformation products of different organic contaminants. Two-year monitoring of river water using LC-ESI-QTOF-MS and GCxGC-EI-TOF-MS.

    PubMed

    López, S Herrera; Ulaszewska, M M; Hernando, M D; Martínez Bueno, M J; Gómez, M J; Fernández-Alba, A R

    2014-11-01

    This study describes a comprehensive strategy for detecting and elucidating the chemical structures of expected and unexpected transformation products (TPs) of chemicals found in river water and effluent wastewater samples, using liquid chromatography coupled to an electrospray ionization quadrupole time-of-flight mass spectrometer (LC-ESI-QTOF-MS), with post-acquisition data processing and an automated search against an in-house database. The efficacy of the mass defect filtering (MDF) approach for screening metabolites from common biotransformation pathways was tested, and it proved sufficiently sensitive and applicable for detecting metabolites in environmental samples. Four omeprazole metabolites and two venlafaxine metabolites were identified in river water samples. This paper reports the analytical results obtained during 2 years of monitoring, carried out at eight sampling points along the Henares River (Spain). Multiresidue monitoring, for targeted analysis, covers a group of 122 chemicals, among them pharmaceuticals, personal care products, pesticides and PAHs. For this purpose, two analytical methods were used, based on direct injection with an LC-ESI-QTOF-MS system and on stir bar sorptive extraction (SBSE) with bi-dimensional gas chromatography coupled to a time-of-flight mass spectrometer (GCxGC-EI-TOF-MS).
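The mass defect filtering step described above can be sketched compactly: keep only peaks whose fractional mass (the "mass defect") lies close to that of the parent compound, since common biotransformations shift the nominal mass much more than the defect. The window widths, peak list, and m/z values below are illustrative assumptions, not parameters from the study.

```python
# Minimal sketch of mass defect filtering (MDF) for metabolite screening.

def mass_defect(mz: float) -> float:
    """Fractional part of the measured m/z (the 'mass defect')."""
    return mz - int(mz)

def mdf_filter(peaks, parent_mz, defect_window=0.040, mass_range=200.0):
    """Keep peaks within +/- mass_range Da of the parent m/z and within
    +/- defect_window of the parent's mass defect (typical MDF settings)."""
    parent_defect = mass_defect(parent_mz)
    return [
        mz for mz in peaks
        if abs(mz - parent_mz) <= mass_range
        and abs(mass_defect(mz) - parent_defect) <= defect_window
    ]

# Illustration: a parent drug at m/z ~346.123 and a hydroxylated (+O) product
# at ~362.118 pass the filter; chemically unrelated peaks are rejected.
peaks = [362.118, 346.123, 500.900, 362.500]
print(mdf_filter(peaks, 346.123))  # [362.118, 346.123]
```

Real implementations run this per-scan on high-resolution data and combine several parent/fragment filters, but the core arithmetic is as above.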

  15. Directionally Solidified Aluminum - 7 wt% Silicon Alloys: Comparison of Earth and International Space Station Processed Samples

    NASA Technical Reports Server (NTRS)

    Grugel, Richard N.; Tewari, Surendra; Rajamure, R. S.; Erdman, Robert; Poirier, David

    2012-01-01

    Primary dendrite arm spacings of Al-7 wt% Si alloy directionally solidified in the low-gravity environment of space (MICAST-6 and MICAST-7: thermal gradient approx. 19 to 26 K/cm, growth speeds varying from 5 to 50 microns/s) show good agreement with the Hunt-Lu model. Primary dendrite trunk diameters of the ISS-processed samples show a good fit with a simple analytical model, proposed here, based on Kirkwood's approach. Natural convection (a) decreases primary dendrite arm spacing and (b) appears to increase primary dendrite trunk diameter.

  16. DEFENSE WASTE PROCESSING FACILITY ANALYTICAL METHOD VERIFICATION FOR THE SLUDGE BATCH 5 QUALIFICATION SAMPLE

    SciTech Connect

    Click, D; Edwards, T; Ajo, H

    2008-07-25

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) performs confirmation of the applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room temperature HF-HNO3 acid dissolution (i.e., DWPF Cold Chem Method, see Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 5 (SB5) SRAT Receipt and SB5 SRAT Product samples. The SB5 SRAT Receipt and SB5 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB5 Batch composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 4 (SB4), to form the SB5 Blend composition. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element in the sludge or used to estimate ratios of compounds in the sludge. A statistical comparison of the data validates the use of the DWPF CC method for SB5 Batch composition. However, the difficulty that was encountered in using the CC method for SB4 brings into question the adequacy of CC for the SB5 Blend. Also, it should be noted that visible solids remained in the final diluted solutions of all samples digested by this method at SRNL (8 samples total), which is typical for the DWPF CC method but not seen in the other methods. Recommendations to the DWPF for application to SB5 based on studies to date: (1) A dissolution study should be performed on the WAPS
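The cross-method comparison described above (statistically validating one digestion method against others, element by element) can be sketched as a simple recovery check. The element names, concentrations, and tolerance below are invented for illustration; the actual SRNL comparison used a formal statistical analysis of replicate measurements.

```python
# Hedged sketch of a per-element digestion-method comparison: flag elements
# whose relative difference between two methods exceeds a tolerance.
# Values are made up, not DWPF/SRNL data.

cc = {"Fe": 24.1, "Al": 7.9, "Na": 12.3}   # wt% via Cold Chem digestion (illustrative)
pf = {"Fe": 24.6, "Al": 8.1, "Na": 12.1}   # wt% via peroxide fusion (illustrative)

def flag_discrepant(a, b, rel_tol=0.05):
    """Return elements whose two-method relative difference exceeds rel_tol,
    using the mean of the two results as the reference value."""
    return [el for el in a
            if abs(a[el] - b[el]) / ((a[el] + b[el]) / 2) > rel_tol]

print(flag_discrepant(cc, pf))  # [] -> the two methods agree within 5% here
```

An empty result is the "method confirmed" outcome; any flagged element would trigger the kind of follow-up dissolution study the report recommends.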

  17. Contribution of working memory processes to relational matching-to-sample performance in baboons (Papio papio).

    PubMed

    Maugard, Anaïs; Marzouki, Yousri; Fagot, Joël

    2013-11-01

    Recent studies of monkeys and apes have shown that these animals can solve relational-matching-to-sample (RMTS) problems, suggesting basic abilities for analogical reasoning. However, doubts remain as to the actual cognitive strategies adopted by nonhuman primates in this task. Here, we used dual-task paradigms to test 10 baboons in the RMTS problem under three conditions of memory load. Our three test conditions allowed different predictions, depending on the strategy (i.e., flat memorization of the percept, reencoding of the percept, or relational processing) that they might use to solve RMTS problems. Results support the idea that the baboons process both the items and the abstract (same and different) relations in this task.

  18. Solar Ion Processing of Itokawa Grains: Reconciling Model Predictions with Sample Observations

    NASA Technical Reports Server (NTRS)

    Christoffersen, Roy; Keller, L. P.

    2014-01-01

    Analytical TEM observations of Itokawa grains reported to date show complex solar wind ion processing effects in the outer 30-100 nm of pyroxene and olivine grains. The effects include loss of long-range structural order, formation of isolated internal cavities or "bubbles", and other nanoscale compositional/microstructural variations. None of the effects described so far, however, include complete ion-induced amorphization. To link the array of observed relationships to grain surface exposure times, we have adapted our previous numerical model for progressive solar ion processing effects in lunar regolith grains to the Itokawa samples. The model uses SRIM ion collision damage and implantation calculations within the framework of a constant-deposited-energy model for amorphization. Inputs include experimentally measured amorphization fluences, a π-steradian variable ion incidence geometry required for a rotating asteroid, and a numerical flux-versus-velocity solar wind spectrum.
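The bookkeeping at the heart of a constant-deposited-energy amorphization model can be sketched simply: a surface becomes amorphous once its accumulated ion fluence reaches an experimentally measured critical fluence, so exposure time follows from the solar wind flux. The critical fluence and flux values below are placeholders, not numbers from this study.

```python
# Sketch: exposure time needed to reach an amorphizing (critical) ion fluence.
# Inputs are assumed/illustrative, not values from the Itokawa work.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def exposure_time_years(critical_fluence_ions_cm2: float,
                        solar_wind_flux_ions_cm2_s: float) -> float:
    """Years required for the accumulated fluence (flux * time) to reach
    the critical fluence at which amorphization is complete."""
    return critical_fluence_ions_cm2 / solar_wind_flux_ions_cm2_s / SECONDS_PER_YEAR

# e.g. a 1e17 ions/cm^2 critical fluence under a ~3e8 ions/cm^2/s proton flux
t = exposure_time_years(1e17, 3e8)   # on the order of a decade
```

A rotating asteroid sees only intermittent illumination of any one surface, which is why the actual model folds in a variable incidence geometry rather than the constant normal-incidence flux assumed here.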

  19. An efficient sampling algorithm for uncertain abnormal data detection in biomedical image processing and disease prediction.

    PubMed

    Liu, Fei; Zhang, Xi; Jia, Yan

    2015-01-01

    In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space. Each dimension is a feature that can be used for disease diagnosis. We introduce a new concept of the top (k1,k2) outlier. It can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space. Some improvement techniques are used for acceleration. Experiments show our methods' high accuracy and high efficiency.
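The abstract's sampling approach for uncertain data can be illustrated with a small Monte Carlo sketch. The paper's precise top (k1,k2) definition is not reproduced here; the common flavor shown below scores each uncertain object by its distance to its k1-th nearest neighbor, averaged over sampled "possible worlds" (one instance drawn per object according to its probabilities), and reports the k2 highest-scoring objects. All names and data are illustrative.

```python
import random

def knn_distance(point, others, k1):
    """Distance from a 1-D point to its k1-th nearest neighbor."""
    return sorted(abs(point - o) for o in others)[k1 - 1]

def top_k1_k2_outliers(uncertain_objects, k1=1, k2=1, n_samples=200, seed=0):
    """Each object is a list of (value, probability) instances. Sample possible
    worlds, accumulate each object's k1-NN distance, return indices of the k2
    objects with the highest mean score."""
    rng = random.Random(seed)
    scores = [0.0] * len(uncertain_objects)
    for _ in range(n_samples):
        # draw one possible world: one instance per object, by its probability
        world = [rng.choices([v for v, _ in obj], [p for _, p in obj])[0]
                 for obj in uncertain_objects]
        for i, x in enumerate(world):
            scores[i] += knn_distance(x, world[:i] + world[i + 1:], k1)
    scores = [s / n_samples for s in scores]
    return sorted(range(len(scores)), key=lambda i: -scores[i])[:k2]

# Three objects clustered near 0 and one far object near 10:
objs = [[(0.0, 0.8), (0.2, 0.2)], [(0.1, 1.0)], [(0.05, 1.0)], [(10.0, 0.9), (9.5, 0.1)]]
print(top_k1_k2_outliers(objs, k1=1, k2=1))  # [3] -- the distant object
```

The paper's contribution is making this kind of computation efficient (pruning and acceleration) rather than the brute-force enumeration sketched here.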

  20. Some results of processing NURE geochemical sampling in the northern Rocky Mountain area

    SciTech Connect

    Thayer, P.A.; Cook, J.R.; Price, V. Jr.

    1980-01-01

    The National Uranium Resource Evaluation (NURE) program was begun in the spring of 1973 to evaluate domestic uranium resources in the continental United States and to identify areas favorable for uranium exploration. The significance of the distribution of uranium in natural waters and sediments will be assessed as an indicator of areas favorable for the discovery of uranium deposits. This paper is oriented primarily to the discussion of stream sediments. Data for the Challis 1° × 2° NTMS quadrangle are used as specific examples of NURE data processing. A high-capacity neutron activation analysis facility at SRL is used to determine uranium and about 19 other elements in hydrogeochemical samples. Evaluation of the areal distributions of uranium ratios demonstrates that most of the high U/Hf, U/Th and U/(Th + Hf) ratios occur scattered throughout the western two-thirds of the quadrangle. Most of the higher ratio values are found in samples taken at sites underlain by granitic rocks of the Idaho batholith or Tertiary-age plutons.
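The ratio screening described above is straightforward to sketch: compute U/Hf, U/Th and U/(Th + Hf) for each sediment analysis and flag sites whose ratios stand out. The site names, concentrations, and threshold below are invented for illustration, not NURE data.

```python
# Illustrative uranium-ratio screening of stream-sediment analyses (ppm, made up).

samples = {
    "site_A": {"U": 12.0, "Th": 9.0, "Hf": 6.0},
    "site_B": {"U": 2.0,  "Th": 11.0, "Hf": 8.0},
}

def uranium_ratios(c):
    """The three ratios used in the survey, from one sample's concentrations."""
    return {
        "U/Hf": c["U"] / c["Hf"],
        "U/Th": c["U"] / c["Th"],
        "U/(Th+Hf)": c["U"] / (c["Th"] + c["Hf"]),
    }

for site, conc in samples.items():
    r = uranium_ratios(conc)
    favorable = r["U/Th"] > 1.0   # arbitrary screening threshold for this sketch
    print(site, {k: round(v, 2) for k, v in r.items()},
          "favorable" if favorable else "-")
```

Ratioing against Th and Hf normalizes out heavy-mineral enrichment, so a high ratio points to genuine uranium excess rather than merely a resistate-rich sediment.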