Science.gov

Sample records for acquisition processing analysis

  1. SNAP: Simulating New Acquisition Processes

    NASA Technical Reports Server (NTRS)

    Alfeld, Louis E.

    1997-01-01

    Simulation models of acquisition processes range in scope from isolated applications to the 'Big Picture' captured by SNAP technology. SNAP integrates a family of models to portray the full scope of acquisition planning and management activities, including budgeting, scheduling, testing and risk analysis. SNAP replicates the dynamic management processes that underlie design, production and life-cycle support. SNAP provides the unique 'Big Picture' capability needed to simulate the entire acquisition process and explore the 'what-if' tradeoffs and consequences of alternative policies and decisions. Comparison of cost, schedule and performance tradeoffs helps managers choose the lowest-risk, highest-payoff option at each step in the acquisition process.

  2. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near-real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.

  3. Image acquisitions, processing and analysis in the process of obtaining characteristics of horse navicular bone

    NASA Astrophysics Data System (ADS)

    Zaborowicz, M.; Włodarek, J.; Przybylak, A.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Boniecki, P.; Koszela, K.; Przybył, J.; Skwarcz, J.

    2015-07-01

    The aim of this study was to investigate the possibility of using computer image analysis methods for the assessment and classification of morphological variability and the state of health of the horse navicular bone. The assumption was that the classification could be based on information contained in two-dimensional digital images of the navicular bone combined with information about the horse's health. The first step in the research was to define the classes of analysed bones, and then to use computer image analysis methods to obtain characteristics from these images. These characteristics were correlated with data concerning the animal, such as: side of hooves, grade of navicular syndrome (scale 0-3), type, sex, age, weight, information about lace, information about heel. This paper presents an introduction to the study of the use of neural image analysis in the diagnosis of navicular bone syndrome. The prepared method can provide an introduction to the study of a non-invasive way to assess the condition of the horse navicular bone.

  4. Data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Tsuda, Toshitaka

    1989-10-01

    Fundamental methods of signal processing used in normal mesosphere stratosphere troposphere (MST) radar observations are described. Complex time series of received signals obtained in each range gate are converted into Doppler spectra, from which the mean Doppler shift, spectral width and signal-to-noise ratio (SNR) are estimated. These spectral parameters are further utilized to study characteristics of scatterers and atmospheric motions.
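    The moment estimation described above can be sketched as follows. This is a generic illustration with synthetic I/Q data and a simple median noise-floor estimate, not the processing chain of any particular MST radar; all parameter values are hypothetical.

```python
import numpy as np

def doppler_moments(iq, fs):
    """Estimate SNR, mean Doppler shift and spectral width from a complex
    (I/Q) time series of one range gate -- a sketch of the standard
    spectral-moment method."""
    n = len(iq)
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / fs))
    spec = np.fft.fftshift(np.abs(np.fft.fft(iq)) ** 2) / n
    noise = np.median(spec)                   # crude noise-floor estimate
    signal = np.clip(spec - noise, 0.0, None)
    power = signal.sum()
    mean_shift = (freqs * signal).sum() / power                      # 1st moment
    width = np.sqrt(((freqs - mean_shift) ** 2 * signal).sum() / power)  # 2nd moment
    snr = power / (noise * n)
    return snr, mean_shift, width

# Synthetic echo: a 50 Hz Doppler line plus complex noise, sampled at 1 kHz
rng = np.random.default_rng(0)
t = np.arange(1024) / 1000.0
iq = (np.exp(2j * np.pi * 50.0 * t)
      + 0.1 * (rng.standard_normal(1024) + 1j * rng.standard_normal(1024)))
snr, shift, width = doppler_moments(iq, fs=1000.0)
```

The recovered mean shift should sit near the injected 50 Hz line, with the width inflated somewhat by spectral leakage and residual noise.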

  5. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

    Introduction: Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely the operator's (acquisition) impact on the results obtained from image analysis and processing, is illustrated with a few examples. Material and method: The analysed images were obtained from a variety of medical devices, such as thermal imaging and tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200,000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. Results: For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient's back for the thermal method - an error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects - an error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - an error of 18%.

  6. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques.

    PubMed

    Pannek, Kerstin; Guzzetta, Andrea; Colditz, Paul B; Rose, Stephen E

    2012-10-01

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. PMID:22903761
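    Among the anisotropy metrics mentioned, fractional anisotropy (FA) is the most widely used; it can be computed directly from the three eigenvalues of the diffusion tensor. A minimal sketch (the eigenvalues below are illustrative, not from any study):

```python
import numpy as np

def fractional_anisotropy(evals):
    """FA from the three diffusion-tensor eigenvalues:
    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||,
    ranging from 0 (isotropic diffusion) to 1 (fully anisotropic)."""
    ev = np.asarray(evals, dtype=float)
    md = ev.mean()                                  # mean diffusivity
    num = np.sqrt(((ev - md) ** 2).sum())
    den = np.sqrt((ev ** 2).sum())
    return np.sqrt(1.5) * num / den if den > 0 else 0.0

# Illustrative eigenvalues (mm^2/s): an elongated white-matter-like tensor
# versus an isotropic CSF-like tensor
fa_wm = fractional_anisotropy([1.6e-3, 0.35e-3, 0.3e-3])
fa_iso = fractional_anisotropy([3.0e-3, 3.0e-3, 3.0e-3])
```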

  7. Uav Photogrammetry with Oblique Images: First Analysis on Data Acquisition and Processing

    NASA Astrophysics Data System (ADS)

    Aicardi, I.; Chiabrando, F.; Grasso, N.; Lingua, A. M.; Noardo, F.; Spanò, A.

    2016-06-01

    In recent years, many studies have revealed the advantages of using airborne oblique images for obtaining improved 3D city models (e.g. including façades and building footprints). The data were usually acquired by expensive airborne cameras installed on traditional aerial platforms. The purpose of this paper is to evaluate the possibility of acquiring and using oblique images for the 3D reconstruction of a historical building, obtained by a UAV (Unmanned Aerial Vehicle) and traditional COTS (Commercial Off-the-Shelf) digital cameras (more compact and lighter than the devices generally used), for the realization of a high-level-of-detail architectural survey. The critical issues of acquisition from a common UAV (flight planning strategies, ground control points, check point distribution and measurement, etc.) are described. Another important aspect considered was the evaluation of the possibility of using such systems as low-cost methods for obtaining complete information from an aerial point of view in case of emergencies or, as in the present paper, in the cultural heritage application field. The data processing was realized using an SfM-based approach for point cloud generation: different dense image-matching algorithms implemented in some commercial and open source software were tested. The achieved results are analysed and the discrepancies from some reference LiDAR data are computed for a final evaluation. The system was tested on the S. Maria Chapel, a part of the Novalesa Abbey (Italy).
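    The discrepancy computation against reference LiDAR data can be illustrated by a nearest-neighbour cloud-to-cloud comparison. This is a generic brute-force sketch with made-up point clouds, not the software pipeline used in the paper; for real clouds a spatial index (k-d tree) would replace the full distance matrix.

```python
import numpy as np

def cloud_to_reference_distances(cloud, reference):
    """Nearest-neighbour distance from each point of a photogrammetric
    cloud to a reference (e.g. LiDAR) cloud -- brute force, adequate
    only for small illustrative clouds."""
    diff = cloud[:, None, :] - reference[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=2))
    return d.min(axis=1)

# Hypothetical data: reference points on the plane z = 0, and a
# photogrammetric cloud systematically offset by 2 cm in z
rng = np.random.default_rng(1)
ref = np.column_stack([rng.uniform(0, 10, 2000),
                       rng.uniform(0, 10, 2000),
                       np.zeros(2000)])
cloud = np.column_stack([rng.uniform(1, 9, 200),
                         rng.uniform(1, 9, 200),
                         np.full(200, 0.02)])
d = cloud_to_reference_distances(cloud, ref)
mean_d, rms = d.mean(), np.sqrt((d ** 2).mean())
```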

  8. On the Contrastive Analysis of Features in Second Language Acquisition: Uninterpretable Gender on Past Participles in English-French Processing

    ERIC Educational Resources Information Center

    Dekydtspotter, Laurent; Renaud, Claire

    2009-01-01

    Lardiere's discussion raises important questions about the use of features in second language (L2) acquisition. This response examines predictions for processing of a feature-valuing model vs. a frequency-sensitive, associative model in explaining the acquisition of French past participle agreement. Results from a reading-time experiment support…

  9. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    PubMed

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-01

    Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  10. Acquisition and processing pitfall with clipped traces in surface-wave analysis

    NASA Astrophysics Data System (ADS)

    Gao, Lingli; Pan, Yudi

    2016-02-01

    Multichannel analysis of surface waves (MASW) is widely used in estimating near-surface shear (S)-wave velocity. In the MASW method, generating a reliable dispersion image in the frequency-velocity (f-v) domain is an important processing step. A locus along peaks of dispersion energy at different frequencies allows the dispersion curves to be constructed for inversion. When the offsets are short, the output seismic data may exceed the dynamic range of the geophones/seismograph, so that peaks and/or troughs of traces are squared off in recorded shot gathers. Dispersion images generated from raw shot gathers with clipped traces are contaminated by artifacts, which might be misidentified as Rayleigh-wave phase velocities or body-wave velocities and potentially lead to incorrect results. We analysed several synthetic models containing clipped traces and compared the amplitude spectra of unclipped and clipped waves. The results indicate that artifacts in the dispersion image depend on the level of clipping. A real-world example also shows how clipped traces affect the dispersion image. All the results suggest that clipped traces should be removed from the shot gathers before generating dispersion images, in order to pick accurate phase velocities and set reasonable initial inversion models.
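    A simple way to screen shot gathers for the clipped traces discussed above is to look for runs of consecutive samples sitting at the recorder's full-scale value. The following is an illustrative sketch with synthetic traces, not the authors' procedure; the thresholds are assumptions.

```python
import numpy as np

def find_clipped_traces(gather, clip_fraction=0.999, min_run=5):
    """Flag traces whose samples sit at (or very near) the trace's
    full-scale value for several consecutive samples -- the squared-off
    peaks/troughs of clipping. `gather` is (n_traces, n_samples)."""
    flagged = []
    for i, tr in enumerate(gather):
        level = np.abs(tr).max()
        at_rail = np.abs(tr) >= clip_fraction * level
        run, longest = 0, 0
        for hit in at_rail:               # longest run of rail samples
            run = run + 1 if hit else 0
            longest = max(longest, run)
        if longest >= min_run:
            flagged.append(i)
    return flagged

# Synthetic gather: trace 0 is a sine clipped at +/-1, trace 1 is unclipped
t = np.linspace(0, 1, 500)
clean = 2.0 * np.sin(2 * np.pi * 5 * t)
gather = np.vstack([np.clip(clean, -1.0, 1.0), 0.4 * clean])
bad = find_clipped_traces(gather)
```

Traces returned in `bad` would then be dropped from the gather before the dispersion image is generated.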

  11. Acquisition and Analysis of Dynamic Responses of a Historic Pedestrian Bridge using Video Image Processing

    NASA Astrophysics Data System (ADS)

    O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; O'Donnell, Deirdre; Wright, Robert; Pakrashi, Vikram

    2015-07-01

    Video based tracking is capable of analysing bridge vibrations that are characterised by large amplitudes and low frequencies. This paper presents the use of video images and associated image processing techniques to obtain the dynamic response of a pedestrian suspension bridge in Cork, Ireland. This historic structure is one of the four suspension bridges in Ireland and is notable for its dynamic nature. A video camera is mounted on the river-bank and the dynamic responses of the bridge have been measured from the video images. The dynamic response is assessed without the need of a reflector on the bridge and in the presence of various forms of luminous complexities in the video image scenes. Vertical deformations of the bridge were measured in this regard. The video image tracking for the measurement of dynamic responses of the bridge was based on correlating patches in time-lagged scenes in video images and utilising a zero mean normalised cross correlation (ZNCC) metric. The bridge was excited by designed pedestrian movement and by individual cyclists traversing the bridge. The time series data of dynamic displacement responses of the bridge were analysed to obtain the frequency domain response. Frequencies obtained from video analysis were checked against accelerometer data from the bridge obtained while carrying out the same set of experiments used for video image based recognition.
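    The ZNCC patch matching at the core of this kind of video tracking can be sketched as follows; the exhaustive search and toy image below are illustrative only, not the authors' implementation.

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero mean normalised cross correlation between two equal-size
    patches: +1 = identical up to brightness/contrast, ~0 = uncorrelated."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def track_patch(template, frame):
    """Exhaustive ZNCC search of `template` over `frame`; returns the
    (row, col) of the best match and its score."""
    th, tw = template.shape
    H, W = frame.shape
    best, best_rc = -2.0, (0, 0)
    for r in range(H - th + 1):
        for c in range(W - tw + 1):
            s = zncc(template, frame[r:r + th, c:c + tw])
            if s > best:
                best, best_rc = s, (r, c)
    return best_rc, best

# Toy frame: the template is cut from a known location and then recovered
rng = np.random.default_rng(2)
frame = rng.random((40, 40))
template = frame[12:20, 25:33].copy()
(r, c), score = track_patch(template, frame)
```

Tracking the matched position frame by frame yields the displacement time series whose spectrum gives the bridge's natural frequencies.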

  13. eL-Chem Viewer: A Freeware Package for the Analysis of Electroanalytical Data and Their Post-Acquisition Processing

    PubMed Central

    Hrbac, Jan; Halouzka, Vladimir; Trnkova, Libuse; Vacek, Jan

    2014-01-01

    In electrochemical sensing, a number of voltammetric or amperometric curves are obtained which are subsequently processed, typically by evaluating peak currents and peak potentials or wave heights and half-wave potentials, frequently after background correction. Transformations of voltammetric data can help to extract specific information, e.g., the number of transferred electrons, and can reveal aspects of the studied electrochemical system, e.g., the contribution of adsorption phenomena. In this communication, we introduce a LabVIEW-based software package, ‘eL-Chem Viewer’, for the analysis of voltammetric and amperometric data, which enables their post-acquisition processing using semiderivative, semiintegral, derivative, integral and elimination procedures. The software supports the single-click transfer of peak/wave current and potential data to spreadsheet software, a feature that greatly improves productivity when constructing calibration curves, trumpet plots and performing similar tasks. eL-Chem Viewer is freeware and can be downloaded from www.lchem.cz/elchemviewer.htm. PMID:25090415
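    One of the transforms named above, semiintegration, can be sketched with the Grünwald (G1) algorithm; this is a generic textbook implementation for illustration, not eL-Chem Viewer's code.

```python
import numpy as np

def semiintegrate(current, dt):
    """Grunwald (G1) semi-integration of a uniformly sampled signal,
    i.e. the half-order integral m(t) evaluated on the same time grid."""
    n = len(current)
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 0.5) / j   # = Gamma(j+1/2) / (Gamma(1/2) * j!)
    # m_k = sqrt(dt) * sum_{j=0}^{k-1} w_j * i_{k-1-j}
    m = np.sqrt(dt) * np.array(
        [np.dot(w[:k][::-1], current[:k]) for k in range(1, n + 1)])
    return m

# Sanity check against a known result: the semi-integral of the
# constant 1 is 2*sqrt(t/pi)
dt = 0.001
current = np.ones(1000)
m = semiintegrate(current, dt)
```

In chronoamperometry, semiintegration of a diffusion-limited current step produces a plateau, which is one reason such transforms are useful in post-acquisition processing.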

  14. eL-Chem Viewer: a freeware package for the analysis of electroanalytical data and their post-acquisition processing.

    PubMed

    Hrbac, Jan; Halouzka, Vladimir; Trnkova, Libuse; Vacek, Jan

    2014-07-31

    In electrochemical sensing, a number of voltammetric or amperometric curves are obtained which are subsequently processed, typically by evaluating peak currents and peak potentials or wave heights and half-wave potentials, frequently after background correction. Transformations of voltammetric data can help to extract specific information, e.g., the number of transferred electrons, and can reveal aspects of the studied electrochemical system, e.g., the contribution of adsorption phenomena. In this communication, we introduce a LabVIEW-based software package, 'eL-Chem Viewer', for the analysis of voltammetric and amperometric data, which enables their post-acquisition processing using semiderivative, semiintegral, derivative, integral and elimination procedures. The software supports the single-click transfer of peak/wave current and potential data to spreadsheet software, a feature that greatly improves productivity when constructing calibration curves, trumpet plots and performing similar tasks. eL-Chem Viewer is freeware and can be downloaded from www.lchem.cz/elchemviewer.htm.

  15. Process data acquisition: real-time and historical interfaces

    NASA Astrophysics Data System (ADS)

    Rice, Gordon; Moreno, Richard; King, Michael S.

    1997-01-01

    With the advent of touch probe technology, it was discovered that current closed architecture controllers do not provide adequate resources to support the implementation of process data acquisition on the shop floor. At AlliedSignal, a process data acquisition system has been developed for a flexible manufacturing system utilizing touch probes and customized software which allows fixture and cutting tool related information for an entire process to be captured and stored for off-line analysis. The implementation of this system, along with its difficulties and pitfalls, is presented, together with the functionality required for an open architecture controller to properly support process data acquisition.

  16. Process data acquisition: Real time and historical interfaces

    SciTech Connect

    Rice, G.; Moreno, R.; King, M.

    1996-11-01

    With the advent of touch probe technology, it was discovered that current closed architecture controllers do not provide adequate resources to support the implementation of process data acquisition on the shop floor. At AlliedSignal Federal Manufacturing & Technologies, a process data acquisition system has been developed for a flexible manufacturing system utilizing touch probes and customized software which allows fixture and cutting tool related information for an entire process to be captured and stored for off-line analysis. The implementation of this system, the difficulties and pitfalls, will be presented along with the functionality required for an open architecture controller to properly support process data acquisition.

  17. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

    An event-driven, GUI-based image acquisition interface for the IDL programming environment has been designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed; it also provides a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  18. Auditory Processing Disorder and Foreign Language Acquisition

    ERIC Educational Resources Information Center

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  19. Processability Theory and German Case Acquisition

    ERIC Educational Resources Information Center

    Baten, Kristof

    2011-01-01

    This article represents the first attempt to formulate a hypothetical sequence for German case acquisition by Dutch-speaking learners on the basis of Processability Theory (PT). It will be argued that case forms emerge corresponding to a development from lexical over phrasal to interphrasal morphemes. This development, however, is subject to a…

  20. Probabilistic models of language processing and acquisition.

    PubMed

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.
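    A minimal instance of such a probabilistic model defined over symbolic structures is a smoothed bigram model, under which comprehension-style inference reduces to comparing string probabilities. The toy corpus below is purely illustrative and far simpler than the models the review discusses.

```python
from collections import Counter

def train_bigram(corpus):
    """Bigram language model with add-one (Laplace) smoothing trained on
    a toy corpus; returns a function assigning a probability to a string."""
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for sent in corpus:
        toks = ["<s>"] + sent.split() + ["</s>"]
        vocab.update(toks)
        unigrams.update(toks[:-1])                 # context counts
        bigrams.update(zip(toks[:-1], toks[1:]))
    V = len(vocab)

    def prob(sentence):
        toks = ["<s>"] + sentence.split() + ["</s>"]
        p = 1.0
        for a, b in zip(toks[:-1], toks[1:]):
            p *= (bigrams[(a, b)] + 1) / (unigrams[a] + V)
        return p

    return prob

prob = train_bigram(["the dog barks", "the cat sleeps", "a dog sleeps"])
```

Even this tiny model prefers well-formed word orders it has never seen over scrambled ones, a toy version of the probabilistic constraints on processing described above.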

  1. The Effectiveness of Processing Instruction and Production-Based Instruction on L2 Grammar Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Shintani, Natsuko

    2015-01-01

    This article reports a meta-analysis of 42 experiments in 33 published studies involving processing instruction (PI) and production-based instruction (PB) used in the PI studies. The comparative effectiveness of PI and PB showed that although PI was more effective than PB for developing receptive knowledge, PB was just as effective as PI for…

  2. Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process

    NASA Technical Reports Server (NTRS)

    Guiltinan, J.

    1976-01-01

    Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.

  3. Adaptive processing for enhanced target acquisition

    NASA Astrophysics Data System (ADS)

    Page, Scott F.; Smith, Moira I.; Hickman, Duncan; Bernhardt, Mark; Oxford, William; Watson, Norman; Beath, F.

    2009-05-01

    Conventional air-to-ground target acquisition processes treat the image stream in isolation from external data sources. This ignores information that may be available through modern mission management systems which could be fused into the detection process in order to provide enhanced performance. By way of an example relating to target detection, this paper explores the use of a-priori knowledge and other sensor information in an adaptive architecture with the aim of enhancing performance in decision making. The approach taken here is to use knowledge of target size, terrain elevation, sensor geometry, solar geometry and atmospheric conditions to characterise the expected spatial and radiometric characteristics of a target in terms of probability density functions. An important consideration in the construction of the target probability density functions is the known errors in the a-priori knowledge. Potential targets are identified in the imagery and their spatial and expected radiometric characteristics are used to compute the target likelihood. The adaptive architecture is evaluated alongside a conventional non-adaptive algorithm using synthetic imagery representative of an air-to-ground target acquisition scenario. Lastly, future enhancements to the adaptive scheme are discussed as well as strategies for managing poor-quality or absent a-priori information.
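    The combination of spatial and radiometric priors into a target likelihood might be sketched as follows, assuming (purely for illustration) independent Gaussian densities whose widths absorb the known a-priori errors. All names and numbers are hypothetical, not from the paper.

```python
import math

def gaussian_pdf(x, mean, sigma):
    """Univariate Gaussian density."""
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def target_likelihood(size_px, radiance, size_prior, rad_prior):
    """Score a candidate detection against spatial and radiometric priors,
    each a (mean, sigma) pair; sigma is assumed to already fold in the
    known errors of the a-priori knowledge."""
    return gaussian_pdf(size_px, *size_prior) * gaussian_pdf(radiance, *rad_prior)

# Hypothetical expectations: ~12 px across (sigma 3 px from range/terrain
# error), normalised radiance ~0.7 (sigma 0.2 from atmospheric uncertainty)
size_prior, rad_prior = (12.0, 3.0), (0.7, 0.2)
candidates = {"A": (11.0, 0.65), "B": (40.0, 0.10)}
scores = {k: target_likelihood(s, r, size_prior, rad_prior)
          for k, (s, r) in candidates.items()}
```

Candidate A, close to both priors, scores far higher than candidate B, which is many sigma away in size and radiance.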

  4. Commonality analysis as a knowledge acquisition problem

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1987-01-01

    Commonality analysis is a systematic attempt to reduce costs in a large scale engineering project by discontinuing development of certain components during the design phase. Each discontinued component is replaced by another component that has sufficient functionality to be considered an appropriate substitute. The replacement strategy is driven by economic considerations. The System Commonality Analysis Tool (SCAT) is based on an oversimplified model of the problem and incorporates no knowledge acquisition component. In fact, the process of arriving at a compromise between functionality and economy is quite complex, with many opportunities for the application of expert knowledge. Such knowledge is of two types: general knowledge expressible as heuristics or mathematical laws potentially applicable to any set of components, and specific knowledge about the way in which elements of a given set of components interrelate. Examples of both types of knowledge are presented, and a framework is proposed for integrating the knowledge into a more general and usable tool.

  5. Acquisition by Processing Theory: A Theory of Everything?

    ERIC Educational Resources Information Center

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  6. Language Processes and Second-Language Acquisition.

    ERIC Educational Resources Information Center

    Collins, Larry Lloyd

    A review of the literature and research concerning the language processes of listening, speaking, reading, and writing, and an analysis of the findings regarding the characteristics of these processes and their relationship to the second-language learner led to the following conclusions: (1) the circumstances under which the first language is…

  7. A Macro-Level Analysis of SRL Processes and Their Relations to the Acquisition of a Sophisticated Mental Model of a Complex System

    ERIC Educational Resources Information Center

    Greene, Jeffrey Alan; Azevedo, Roger

    2009-01-01

    In this study, we used think-aloud verbal protocols to examine how various macro-level processes of self-regulated learning (SRL; e.g., planning, monitoring, strategy use, handling of task difficulty and demands) were associated with the acquisition of a sophisticated mental model of a complex biological system. Numerous studies examine how…

  8. Acquisition by Processing: A Modular Perspective on Language Development

    ERIC Educational Resources Information Center

    Truscott, John; Smith, Mike Sharwood

    2004-01-01

    The paper offers a model of language development, first and second, within a processing perspective. We first sketch a modular view of language, in which competence is embodied in the processing mechanisms. We then propose a novel approach to language acquisition (Acquisition by Processing Theory, or APT), in which development of the module occurs…

  9. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  10. Guidelines for dynamic data acquisition and analysis

    NASA Technical Reports Server (NTRS)

    Piersol, Allan G.

    1992-01-01

    The recommendations concerning pyroshock data presented in the final draft of a proposed military handbook on Guidelines for Dynamic Data Acquisition and Analysis are reviewed. The structural responses produced by pyroshocks are considered to be one of the most difficult types of dynamic data to accurately measure and analyze.

  11. Networks for image acquisition, processing and display

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1990-01-01

    The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.

  12. Acquisition and analysis of accelerometer data

    NASA Technical Reports Server (NTRS)

    Verges, Keith R.

    1990-01-01

    Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
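The frequency-domain methods mentioned above (windowing and the distinction between power and amplitude spectra) can be illustrated with a minimal Python/NumPy sketch; the Hann window and the test signal below are our choices for demonstration, not details from the original system:

```python
import numpy as np

def amplitude_and_power_spectrum(x, fs):
    """One-sided amplitude and power spectra using a Hann window.

    The amplitude spectrum estimates the peak amplitude of sinusoidal
    components; the power spectrum is amplitude**2 / 2 (mean-square power
    of a sinusoid) -- the distinction discussed in the abstract.
    """
    n = len(x)
    w = np.hanning(n)
    X = np.fft.rfft(x * w)
    amp = 2.0 * np.abs(X) / w.sum()   # coherent-gain correction for the window
    amp[0] /= 2.0                     # DC bin is not doubled
    power = amp ** 2 / 2.0
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, amp, power

# Example: a 50 Hz sinusoid of unit amplitude sampled at 1 kHz for 1 s.
fs = 1000.0
t = np.arange(1000) / fs
x = np.sin(2 * np.pi * 50.0 * t)
freqs, amp, power = amplitude_and_power_spectrum(x, fs)
peak = freqs[np.argmax(power)]
```

With this normalization the peak bin reads ~1.0 in the amplitude spectrum and ~0.5 in the power spectrum, making clear why the two must not be confused when calibrating accelerometer data.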

  13. TARA control, data acquisition and analysis system

    SciTech Connect

    Gaudreau, M.P.J.; Sullivan, J.D.; Fredian, T.W.; Karcher, C.A.; Sevillano, E.; Stillerman, J.; Thomas, P.

    1983-12-01

The MIT tandem mirror (TARA) control, data acquisition and analysis system consists of two major parts: (1) a Gould 584 industrial programmable controller (PC) to control engineering functions; and (2) a VAX 11/750 based data acquisition and analysis system for physics analysis. The PC is designed for use in harsh industrial environments and has proven to be a reliable and cost-effective means for automated control. The PC configuration is dedicated to control tasks on the TARA magnet, vacuum, RF, neutral beam, diagnostics, and utility systems. The data transfer functions are used to download system operating parameters from menu-selectable tables. Real-time status reports are provided to video terminals and as blocks of data to the host computer for storage. The data acquisition and analysis system for TARA was designed to provide high throughput and ready access to data from earlier runs. The adopted design uses pre-existing software packages in a system which is simple, coherent, fast, and which requires a minimum of new software development. The computer configuration is a VAX 11/750 running VMS with a 124-Mbyte Massbus disk and a 1.4-Gbyte Unibus disk subsystem.

  14. TOM software toolbox: acquisition and analysis for electron tomography.

    PubMed

    Nickell, Stephan; Förster, Friedrich; Linaroudis, Alexandros; Net, William Del; Beck, Florian; Hegerl, Reiner; Baumeister, Wolfgang; Plitzko, Jürgen M

    2005-03-01

    Automated data acquisition procedures have changed the perspectives of electron tomography (ET) in a profound manner. Elaborate data acquisition schemes with autotuning functions minimize exposure of the specimen to the electron beam and sophisticated image analysis routines retrieve a maximum of information from noisy data sets. "TOM software toolbox" integrates established algorithms and new concepts tailored to the special needs of low dose ET. It provides a user-friendly unified platform for all processing steps: acquisition, alignment, reconstruction, and analysis. Designed as a collection of computational procedures it is a complete software solution within a highly flexible framework. TOM represents a new way of working with the electron microscope and can serve as the basis for future high-throughput applications.

  15. The logical syntax of number words: theory, acquisition and processing.

    PubMed

    Musolino, Julien

    2009-04-01

Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. Cognition, 93, 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Experiments at the semantics-pragmatics interface. Cognition, 86, 253-282; Hurewitz, F., Papafragou, A., Gleitman, L., Gelman, R. (2006). Asymmetries in the acquisition of numbers and quantifiers. Language Learning and Development, 2, 76-97; Huang, Y. T., Snedeker, J., Spelke, E. (submitted for publication). What exactly do numbers mean?]. Specifically, these studies have shown that data from experimental investigations of child language can be used to illuminate core theoretical issues in the semantic and pragmatic analysis of number terms. In this article, I extend this approach to the logico-syntactic properties of number words, focusing on the way numerals interact with each other (e.g. Three boys are holding two balloons) as well as with other quantified expressions (e.g. Three boys are holding each balloon). On the basis of their intuitions, linguists have claimed that such sentences give rise to at least four different interpretations, reflecting the complexity of the linguistic structure and syntactic operations involved. Using psycholinguistic experimentation with preschoolers (n=32) and adult speakers of English (n=32), I show that (a) for adults, the intuitions of linguists can be verified experimentally, (b) by the age of 5, children have knowledge of the core aspects of the logical syntax of number words, (c) in spite of this knowledge, children nevertheless differ from adults in systematic ways, (d) the differences observed between children and adults can be accounted for on the basis of an independently motivated, linguistically-based processing model [Geurts, B. (2003). Quantifying kids. Language

  16. System of acquisition and processing of images of dynamic speckle

    NASA Astrophysics Data System (ADS)

Vega, F.; Torres, C.

    2015-01-01

In this paper we present the design and implementation of a system for the capture and analysis of dynamic speckle. The device consists of a USB camera, an isolated lighting enclosure for imaging, a 633 nm, 10 mW laser pointer as the coherent light source, a diffuser, and a laptop for video processing. The equipment enables the acquisition and storage of video and computes various statistical descriptors (global activity accumulation vector, activity accumulation matrix, cross-correlation vector, autocorrelation coefficient, Fujii matrix, etc.). The equipment is designed so that it can be taken directly to the site of the biological sample under study, and it is currently being used in research projects within the group.
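The Fujii matrix mentioned among the descriptors is conventionally computed as the accumulated, intensity-normalized frame-to-frame difference of the video stack. A minimal NumPy sketch, using a synthetic stack as a stand-in for real speckle video (the array shapes and test values are illustrative, not from the paper):

```python
import numpy as np

def fujii_matrix(stack, eps=1e-12):
    """Fujii activity matrix of an image stack (frames, rows, cols):
    accumulated, intensity-normalized frame-to-frame differences."""
    stack = np.asarray(stack, dtype=float)
    diff = np.abs(np.diff(stack, axis=0))      # |I_k - I_{k+1}|
    summ = stack[:-1] + stack[1:] + eps        # I_k + I_{k+1}, guarded against 0
    return (diff / summ).sum(axis=0)

# Synthetic stack: left half static, right half fluctuating (hypothetical data).
rng = np.random.default_rng(0)
frames = np.ones((10, 8, 8)) * 100.0
frames[:, :, 4:] += rng.uniform(0, 50, size=(10, 8, 4))
F = fujii_matrix(frames)
```

Static regions accumulate zero activity while fluctuating regions accumulate positive values, which is what makes the descriptor useful for biological activity maps.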

  17. Data acquisition and analysis on a Macintosh

    NASA Technical Reports Server (NTRS)

    Watts, Michael E.; St. Jean, Megan M.

    1991-01-01

The introduction of inexpensive analog-to-digital boards for the Macintosh opens the way for its use in areas previously served by specialized, dedicated, or more expensive mainframe-based systems. Two such Macintosh-based systems are the Acoustic Laboratory Data Acquisition System (ALDAS) and the Jet Calibration and Hover Test Facility (JCAHT) data acquisition system. ALDAS provides an inexpensive, transportable means to digitize four channels at up to 50,000 samples per second and analyze this data. The ALDAS software package was written for use with rotorcraft acoustics and performs automatic acoustic calibration of channels, data display, and various types of data analysis. The program can use data obtained either from internal analog-to-digital conversion or from discrete external data imported in ASCII format. All aspects of ALDAS can be improved as new hardware becomes available and new features are introduced into the code. The JCAHT data acquisition system was built not only as an analysis program but also as the online safety monitoring system. This paper will provide an overview of these systems.

  18. Isothermal thermogravimetric data acquisition analysis system

    NASA Technical Reports Server (NTRS)

    Cooper, Kenneth, Jr.

    1991-01-01

    The description of an Isothermal Thermogravimetric Analysis (TGA) Data Acquisition System is presented. The system consists of software and hardware to perform a wide variety of TGA experiments. The software is written in ANSI C using Borland's Turbo C++. The hardware consists of a 486/25 MHz machine with a Capital Equipment Corp. IEEE488 interface card. The interface is to a Hewlett Packard 3497A data acquisition system using two analog input cards and a digital actuator card. The system provides for 16 TGA rigs with weight and temperature measurements from each rig. Data collection is conducted in three phases. Acquisition is done at a rapid rate during initial startup, at a slower rate during extended data collection periods, and finally at a fast rate during shutdown. Parameters controlling the rate and duration of each phase are user programmable. Furnace control (raising and lowering) is also programmable. Provision is made for automatic restart in the event of power failure or other abnormal terminations. Initial trial runs were conducted to show system stability.
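The three-phase collection scheme described above (a rapid rate during startup, a slower rate during extended collection, a fast rate during shutdown, with user-programmable parameters) can be sketched as a simple schedule generator; the phase names and all numeric parameters below are hypothetical, not taken from the original system:

```python
# Hypothetical (name, sample_interval_s, duration_s) per phase.
PHASES = [
    ("startup",   0.5,  60.0),   # fast rate during initial startup
    ("steady",   30.0, 600.0),   # slow rate during extended collection
    ("shutdown",  0.5,  60.0),   # fast rate again during shutdown
]

def sample_times(phases):
    """Yield absolute sample times for a phased acquisition schedule."""
    t = 0.0
    for _name, interval, duration in phases:
        end = t + duration
        while t < end:
            yield t
            t += interval

times = list(sample_times(PHASES))
```

A restart-after-power-failure feature, as described in the abstract, would resume this schedule from the last recorded time rather than from zero.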

  19. Towards a Platform for Image Acquisition and Processing on RASTA

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele

    2013-08-01

    This paper presents the architecture of a platform for image acquisition and processing based on commercial hardware and space qualified hardware. The aim is to extend the Reference Architecture Test-bed for Avionics (RASTA) system in order to obtain a Test-bed that allows testing different hardware and software solutions in the field of image acquisition and processing. The platform will allow the integration of space qualified hardware and Commercial Off The Shelf (COTS) hardware in order to test different architectural configurations. The first implementation is being performed on a low cost commercial board and on the GR712RC board based on the Dual Core Leon3 fault tolerant processor. The platform will include an actuation module with the aim of implementing a complete pipeline from image acquisition to actuation, making possible the simulation of a real case scenario involving acquisition and actuation.

  20. The Gestalt Process Approach and Word Acquisition.

    ERIC Educational Resources Information Center

    McAllister, Elizabeth

    To whet the curiosity and interest of teachers who may be frustrated with the reading vocabulary achievement of pupils, an informal study compared Piaget's cognitive development theory, recent brain research, and the reading process, and examined how the theory and research apply to reading instruction. The Gestalt Process Approach to teaching…

  1. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in the post-aberration-correction era, many aspects of the technique are being revisited. Here, recent progress and emerging best practice for aberration-corrected quantitative ADF STEM are discussed, including issues relating to the proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation, and its limitations.

  2. MTX data acquisition and analysis computer network

    SciTech Connect

    Butner, D.N.; Casper, T.A.; Brown, M.D.; Drlik, M.; Meyer, W.H.; Moller, J.M. )

    1990-10-01

    For the MTX experiment, we use a network of computers for plasma diagnostic data acquisition and analysis. This multivendor network employs VMS, UNIX, and BASIC based computers connected in a local area Ethernet network. Some of the data is acquired directly into a VAX/VMS computer cluster over a fiber-optic serial CAMAC highway. Several HP-Unix workstations and HP-BASIC instrument control computers acquire and analyze data for the more data intensive or specialized diagnostics. The VAX/VMS system is used for global analysis of the data and serves as the central data archiving and retrieval manager. Shot synchronization and control of data flow are implemented by task-to-task message passing using our interprocess communication system. The system has been in operation during our initial MTX tokamak and FEL experiments; it has operated reliably with data rates typically in the range of 5 Mbytes/shot without limiting the experimental shot rate.

  3. Reading Acquisition Enhances an Early Visual Process of Contour Integration

    ERIC Educational Resources Information Center

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched…

  4. From chaos to order: The MicroStar data acquisition and analysis system

    SciTech Connect

    Rathbun, W.

    1991-03-01

    The MicroStar data acquisition and analysis software consists of several independent processes, although to the user it looks like a single program. These programs handle such functions as data acquisition (if any), data analysis, interactive display and data manipulation, and process monitoring and control. An overview of these processes and functions can be found in the paper as well as a more detailed description and users' guide.

  5. Understanding the knowledge acquisition process about Earth and Space concepts

    NASA Astrophysics Data System (ADS)

    Frappart, Soren

There exist two main theoretical views concerning the knowledge acquisition process in science, and those views are still in debate in the literature. On the one hand, knowledge is considered to be organized into coherent wholes (mental models). On the other hand, knowledge is described as fragmented sets with no links between the fragments. Mental models have predictive and explicative power and are constrained by universal presuppositions; they follow a universal gradual development in three steps, from initial to synthetic to scientific models. The fragments, on the contrary, are not organised, and development is seen as a situated process in which cultural transmission plays a fundamental role. After a presentation of those two theoretical positions, we will illustrate them with examples of studies related to the Earth's shape and gravity performed in different cultural contexts, in order to highlight both the differences and the culturally invariant elements. We will show why these issues are important to consider for space concepts such as gravity, orbits, and weightlessness. Indeed, capturing the processes of acquisition and development of knowledge concerning specific space concepts can give us important information for developing relevant and adapted strategies for instruction. If the process of knowledge acquisition for space concepts is fragmented, then we have to consider how we could identify those fragments and help the learner organise links between them. If the knowledge is organised into coherent mental models, we have to consider how to destabilize a non-relevant model and prevent the development of initial and synthetic models. Moreover, the question of what is universal versus what is culture-dependent in this acquisition process needs to be explored. We will also present some main misconceptions that appeared about space concepts. Indeed, in addition to the previous theoretical consideration, the collection and awareness of

  6. Advances in GPR data acquisition and analysis for archaeology

    NASA Astrophysics Data System (ADS)

    Zhao, Wenke; Tian, Gang; Forte, Emanuele; Pipan, Michele; Wang, Yimin; Li, Xuejing; Shi, Zhanjie; Liu, Haiyan

    2015-07-01

The primary objective of this study is to evaluate the applicability and the effectiveness of ground-penetrating radar (GPR) to identify a thin burnt soil layer, buried more than 2 m below the topographic surface at the Liangzhu Site, in Southeastern China. The site was chosen for its relatively challenging conditions for GPR techniques, due to high electrical conductivity and to the presence of peach tree roots that produce scattering. We completed the data acquisition by using 100 and 200 MHz antennas in TE and TM broadside and cross-polarized configurations. In the data processing and interpretation phase, we used GPR attribute analysis, including instantaneous phase and geometrical attributes. Ground-truthing performed after the geophysical surveys validated the GPR imaging, confirmed the electrical conductivity and relative dielectric permittivity (RDP) measurements performed at different depths, and allowed a reliable quantitative correlation between GPR results and subsurface physical properties. The research demonstrates that multiple antenna configurations in GPR data acquisition combined with attribute analysis can enhance the ability to characterize prehistoric archaeological remains even in complex subsurface conditions.
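The instantaneous-phase attribute used in GPR interpretation is conventionally derived from the analytic signal of each trace (a Hilbert-transform construction). A minimal NumPy sketch, assuming a single synthetic trace rather than real radar data:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (a standard Hilbert-transform route);
    instantaneous amplitude and phase follow as abs() and angle()."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0          # double positive frequencies
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# Synthetic 25 Hz trace sampled at 1 kHz (hypothetical, periodic in the window).
fs = 1000.0
t = np.arange(1000) / fs
trace = np.cos(2 * np.pi * 25.0 * t)
z = analytic_signal(trace)
envelope = np.abs(z)                                        # instantaneous amplitude
inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)
```

For this clean trace the envelope is ~1 and the instantaneous frequency is ~25 Hz; on real GPR traces the instantaneous phase highlights reflector continuity independently of amplitude.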

  7. The acquisition of integrated science process skills in a web-based learning environment

    NASA Astrophysics Data System (ADS)

    Saat, Rohaida Mohd.

    2004-01-01

Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among grade 5 children. Data were gathered primarily from children's conversations and teacher-student conversations. Analysis of the data revealed that the children acquired the skill in three phases: from the phase of recognition to the phase of familiarization and finally to the phase of automation. Nevertheless, the acquisition of the skill only involved the acquisition of certain subskills of the skill of controlling variables. This progression could be influenced by the web-based instructional material that provided declarative knowledge, concrete visualization and opportunities for practice.

  8. An effective data acquisition system using image processing

    NASA Astrophysics Data System (ADS)

    Poh, Chung-How; Poh, Chung-Kiak

    2005-12-01

The authors investigate a data acquisition system utilising the widely available digital multi-meter and the webcam. The system is suited for applications that require sampling rates of less than about 1 Hz, such as ambient temperature recording or monitoring the charging state of rechargeable batteries. The data displayed on the external digital readout is acquired into the computer through the process of template matching. MATLAB is used as the programming language for processing the captured 2-D images in this demonstration. An RC charging experiment with a time constant of approximately 33 s is set up to verify the accuracy of the image-to-data conversion. It is found that the acquired data matches the steady-state voltage value displayed by the digital meter after an error detection technique has been devised and implemented in the data acquisition script file. It is possible to acquire a number of different readings simultaneously from various sources with this imaging method by placing a number of digital readouts within the camera's field-of-view.
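The ~33 s RC verification experiment suggests a simple cross-check: the time constant can be recovered from the recorded charging curve by linearizing the exponential. A sketch in plain Python with synthetic, noiseless data; the fitting route is our choice for illustration, not necessarily the authors' method:

```python
import math

def fit_rc_tau(times, volts, v_final):
    """Estimate the RC time constant by linearizing
    v(t) = v_final * (1 - exp(-t/tau))  ->  ln(1 - v/v_final) = -t/tau,
    then least-squares fitting a line through the origin."""
    num = den = 0.0
    for t, v in zip(times, volts):
        if 0.0 < v < v_final:            # keep points where the log is defined
            y = math.log(1.0 - v / v_final)
            num += t * y
            den += t * t
    slope = num / den
    return -1.0 / slope

# Synthetic charging curve with the ~33 s constant quoted in the abstract.
tau_true, v0 = 33.0, 5.0
ts = [i * 2.0 for i in range(1, 40)]
vs = [v0 * (1.0 - math.exp(-t / tau_true)) for t in ts]
tau_est = fit_rc_tau(ts, vs, v0)
```

On the real system, `volts` would come from the template-matched readout values, so outliers from misread digits would first need the error-detection pass the abstract describes.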

  9. Reading acquisition enhances an early visual process of contour integration.

    PubMed

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched in age, socioeconomic and cultural characteristics, on a contour integration task known to depend on early visual processing. Stimuli consisted of a closed egg-shaped contour made of disconnected Gabor patches, within a background of randomly oriented Gabor stimuli. Subjects had to decide whether the egg was pointing left or right. Difficulty was varied by jittering the orientation of the Gabor patches forming the contour. Contour integration performance was lower in illiterates than in both ex-illiterate and literate controls. We argue that this difference in contour perception must reflect a genuine difference in visual function. According to this view, the intensive perceptual training that accompanies reading acquisition also improves early visual abilities, suggesting that the impact of literacy on the visual system is more widespread than originally proposed.

  10. PET/CT for radiotherapy: image acquisition and data processing.

    PubMed

    Bettinardi, V; Picchio, M; Di Muzio, N; Gianolli, L; Messa, C; Gilardi, M C

    2010-10-01

This paper focuses on acquisition and processing methods in positron emission tomography/computed tomography (PET/CT) for radiotherapy (RT) applications. The recent technological evolution of PET/CT systems is described. Particular emphasis is dedicated to the tools needed for patient positioning and immobilization, to be used in PET/CT studies as well as during RT treatment sessions. The effect of organ and lesion motion due to the patient's respiration on PET/CT imaging is discussed. Breathing protocols proposed to minimize PET/CT spatial mismatches in relation to respiratory movements are illustrated. The respiratory-gated (RG) 4D-PET/CT techniques, developed to measure and compensate for organ and lesion motion, are then introduced. Finally, a description is provided of different acquisition and data processing techniques, implemented with the aim of improving (i) the image quality and quantitative accuracy of PET images, and (ii) target volume definition and treatment planning in RT, by using specific and personalised motion information.
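At its core, respiratory gating assigns each acquired event to a phase bin of the breathing cycle, and events in the same bin are reconstructed together. A minimal sketch of that binning step (the eight-gate count is an assumption for illustration, not a value from the paper):

```python
import math

def gate_index(phase, n_gates):
    """Assign an event at respiratory phase `phase` (radians) to one of
    n_gates equal phase bins spanning the breathing cycle."""
    frac = (phase % (2.0 * math.pi)) / (2.0 * math.pi)   # position in cycle, [0, 1)
    return min(int(frac * n_gates), n_gates - 1)         # guard the 1.0 edge case

# Events at four points of the cycle, binned into eight hypothetical gates.
gates = [gate_index(p, 8) for p in [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]]
```

Each reconstructed gate then sees the lesion nearly frozen at one respiratory position, which is what allows motion to be measured and compensated.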

  11. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price. DATES....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use...

  12. NOVA-NREL Optimal Vehicle Acquisition Analysis (Brochure)

    SciTech Connect

    Blakley, H.

    2011-03-01

Federal fleet managers face unique challenges in accomplishing their mission - meeting agency transportation needs while complying with Federal goals and mandates. Included in these challenges are a variety of statutory requirements, executive orders, and internal goals and objectives that typically focus on petroleum consumption and greenhouse gas (GHG) emissions reductions, alternative fuel vehicle (AFV) acquisitions, and alternative fuel use increases. Given the large number of mandates affecting Federal fleets and the challenges faced by all fleet managers in executing day-to-day operations, a primary challenge for agencies and other organizations is ensuring that they are as efficient as possible in using constrained fleet budgets. An NREL Optimal Vehicle Acquisition (NOVA) analysis makes use of a mathematical model with a variety of fleet-related data to create an optimal vehicle acquisition strategy for a given goal, such as petroleum or GHG reduction. The analysis can help fleets develop a vehicle acquisition strategy that maximizes petroleum and greenhouse gas reductions.

  13. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are covered, including signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques.
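Two of the statistical steps mentioned, outlier removal and autocovariance analysis, can be sketched briefly; the MAD-based despiking rule below is one common robust choice, not necessarily what MST radar systems use:

```python
import numpy as np

def remove_outliers_mad(x, k=5.0):
    """Replace samples deviating more than k robust sigmas (MAD-based)
    from the median with the median -- a simple despiking pass."""
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    sigma = 1.4826 * np.median(np.abs(x - med))   # MAD scaled to Gaussian sigma
    out = x.copy()
    out[np.abs(x - med) > k * sigma] = med
    return out

def autocovariance(x, max_lag):
    """Biased sample autocovariance for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    return np.array([np.dot(x[:n - m], x[m:]) / n for m in range(max_lag + 1)])

# Synthetic noise record with one injected spike (hypothetical data).
rng = np.random.default_rng(1)
sig = rng.normal(0.0, 1.0, 512)
sig[100] = 50.0
clean = remove_outliers_mad(sig)
acov = autocovariance(clean, 5)
```

Despiking before computing the autocovariance or spectrum matters because a single interference spike leaks broadband power into every spectral estimate.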

  14. Data acquisition, processing and firing aid software for multichannel EMP simulation

    NASA Astrophysics Data System (ADS)

    Eumurian, Gregoire; Arbaud, Bruno

    1986-08-01

Electromagnetic compatibility testing yields a large quantity of data for systematic analysis. An automated data acquisition system has been developed. It is based on standard EMP instrumentation, which allows a pre-established program to be followed whilst orientating the measurements according to the results obtained. The system is controlled by a computer running interactive programs (multitask windows, scrollable menus, mouse, etc.) which handle the measurement channels, files, and displays, process the data, and provide an aid to firing.

  15. Development of data acquisition and analysis software for multichannel detectors

    SciTech Connect

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With the help of outstanding graphics capabilities, easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and the efficiency of data analysis. 2 refs., 6 figs.

  16. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... published a proposed rule in the Federal Register at 77 FR 40552 on July 10, 2012, to clarify and pinpoint a... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a...

  17. MS1 Peptide Ion Intensity Chromatograms in MS2 (SWATH) Data Independent Acquisitions. Improving Post Acquisition Analysis of Proteomic Experiments.

    PubMed

    Rardin, Matthew J; Schilling, Birgit; Cheng, Lin-Yang; MacLean, Brendan X; Sorensen, Dylan J; Sahu, Alexandria K; MacCoss, Michael J; Vitek, Olga; Gibson, Bradford W

    2015-09-01

    Quantitative analysis of discovery-based proteomic workflows now relies on high-throughput large-scale methods for identification and quantitation of proteins and post-translational modifications. Advancements in label-free quantitative techniques, using either data-dependent or data-independent mass spectrometric acquisitions, have coincided with improved instrumentation featuring greater precision, increased mass accuracy, and faster scan speeds. We recently reported on a new quantitative method called MS1 Filtering (Schilling et al. (2012) Mol. Cell. Proteomics 11, 202-214) for processing data-independent MS1 ion intensity chromatograms from peptide analytes using the Skyline software platform. In contrast, data-independent acquisitions from MS2 scans, or SWATH, can quantify all fragment ion intensities when reference spectra are available. As each SWATH acquisition cycle typically contains an MS1 scan, these two independent label-free quantitative approaches can be acquired in a single experiment. Here, we have expanded the capability of Skyline to extract both MS1 and MS2 ion intensity chromatograms from a single SWATH data-independent acquisition in an Integrated Dual Scan Analysis approach. The performance of both MS1 and MS2 data was examined in simple and complex samples using standard concentration curves. Cases of interferences in MS1 and MS2 ion intensity data were assessed, as were the differentiation and quantitation of phosphopeptide isomers in MS2 scan data. In addition, we demonstrated an approach for optimization of SWATH m/z window sizes to reduce interferences using MS1 scans as a guide. Finally, a correlation analysis was performed on both MS1 and MS2 ion intensity data obtained from SWATH acquisitions on a complex mixture using a linear model that automatically removes signals containing interferences. This work demonstrates the practical advantages of properly acquiring and processing MS1 precursor data in addition to MS2 fragment ion
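Extracting an MS1 ion-intensity chromatogram amounts to summing, in each scan, the intensities falling within an m/z tolerance of the target. A simplified sketch; the data layout and names are illustrative stand-ins for centroided spectra, not the Skyline API:

```python
def extract_ion_chromatogram(scans, mz_target, tol):
    """Build an extracted ion chromatogram: for each scan, sum the
    intensities of peaks within +/- tol of mz_target.

    `scans` is a list of (retention_time, [(mz, intensity), ...]) tuples,
    a hypothetical stand-in for centroided MS1 spectra.
    """
    chrom = []
    for rt, peaks in scans:
        inten = sum(i for mz, i in peaks if abs(mz - mz_target) <= tol)
        chrom.append((rt, inten))
    return chrom

# Three hypothetical MS1 scans; the 500.25 analyte elutes with a peak at rt 0.2.
scans = [
    (0.1, [(500.25, 100.0), (600.30, 20.0)]),
    (0.2, [(500.26, 400.0), (600.31, 25.0)]),
    (0.3, [(500.24, 150.0)]),
]
xic = extract_ion_chromatogram(scans, 500.25, 0.02)
```

The same routine applied to fragment m/z values in the MS2 (SWATH) scans yields the complementary fragment-ion chromatograms, which is what makes the dual MS1/MS2 analysis possible from a single acquisition.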

  18. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  19. The Logical Syntax of Number Words: Theory, Acquisition and Processing

    ERIC Educational Resources Information Center

    Musolino, Julien

    2009-01-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. "Cognition 93", 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Scalar…

  20. Data acquisition and analysis for the energy-subtraction Compton scatter camera for medical imaging

    NASA Astrophysics Data System (ADS)

    Khamzin, Murat Kamilevich

In response to the shortcomings of the Anger camera currently used in conventional SPECT, particularly the trade-off between sensitivity and spatial resolution, a novel energy-subtraction Compton scatter camera (ESCSC) has been proposed. A successful clinical implementation of the ESCSC could revolutionize the field of SPECT. Features of this camera include silicon and CdZnTe detectors in the primary and secondary detector systems, list-mode time-stamping data acquisition, modular architecture, and post-acquisition data analysis. Previous ESCSC studies were based on Monte Carlo modeling. The objective of this work is to test the theoretical framework developed in those studies by developing the data acquisition and analysis techniques necessary to implement the ESCSC. A camera model working in list mode with time stamping was successfully built and tested, confirming the potential of the ESCSC predicted by the earlier simulation studies. The acquired data were processed in post-acquisition analysis according to preferred-event selection criteria. Beyond constructing the camera model and validating the approach, the post-acquisition analysis was extended to weight preferred events by the likelihood that a preferred event is a true preferred event. While formulated to demonstrate ESCSC capabilities, the results of this study are important for any Compton scatter camera implementation, as well as for coincidence data acquisition systems in general.

  1. Multibeam Sonar Backscatter Data Acquisition and Processing: Guidelines and Recommendations from the GEOHAB Backscatter Working Group

    NASA Astrophysics Data System (ADS)

    Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.

    2015-12-01

Backscatter data acquired with multibeam sonars are now commonly used for remote geological interpretation of the seabed. The systems' hardware, software, and processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, calibration is poor or absent, and processing methods are incompletely understood and documented. A workshop organized at the 2013 annual meeting of GeoHab (a community of geoscientists and biologists centered on marine habitat mapping) was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the acquisition, processing and interpretation of such data. The GeoHab Backscatter Working Group (BSWG) was subsequently created to document and synthesize the state of the art in sensors and techniques available today and to propose best-practice methods for the acquisition and processing of backscatter data. Two years later, the resulting document, "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations", was completed [1]. The document provides: an introduction to backscatter measurements by seafloor-mapping sonars; background on the physical principles of sonar backscatter; a discussion of users' needs from a wide spectrum of community end-users; a review of backscatter measurement; an analysis of best practices in data acquisition; a review of data-processing principles with details on current software implementations; and, finally, a synthesis and key recommendations. This presentation reviews the BSWG mandate and structure and the development of this document. It details the contents of the various chapters, the recommendations to sonar manufacturers, operators, data-processing software developers and end-users, and the implications for the marine geology community.
[1] Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines

  2. FABIA: factor analysis for bicluster acquisition

    PubMed Central

    Hochreiter, Sepp; Bodenhofer, Ulrich; Heusel, Martin; Mayr, Andreas; Mitterecker, Andreas; Kasim, Adetayo; Khamiakova, Tatsiana; Van Sanden, Suzy; Lin, Dan; Talloen, Willem; Bijnens, Luc; Göhlmann, Hinrich W. H.; Shkedy, Ziv; Clevert, Djork-Arné

    2010-01-01

Motivation: Biclustering of transcriptomic data groups genes and samples simultaneously. It is emerging as a standard tool for extracting knowledge from gene expression measurements. We propose a novel generative approach for biclustering called ‘FABIA: Factor Analysis for Bicluster Acquisition’. FABIA is based on a multiplicative model, which accounts for linear dependencies between gene expression and conditions, and also captures heavy-tailed distributions as observed in real-world transcriptomic data. The generative framework makes it possible to use well-founded model selection methods and to apply Bayesian techniques. Results: On 100 simulated datasets with known, artificially implanted biclusters, FABIA clearly outperformed all 11 competitors. On these datasets, FABIA was able to separate spurious biclusters from true biclusters by ranking biclusters according to their information content. FABIA was tested on three microarray datasets with known subclusters, where it was twice the best and once the second-best method among the compared biclustering approaches. Availability: FABIA is available as an R package on Bioconductor (http://www.bioconductor.org). All datasets, results and software are available at http://www.bioinf.jku.at/software/fabia/fabia.html. Contact: hochreit@bioinf.jku.at Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20418340
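The multiplicative model underlying FABIA can be illustrated with a toy generative sketch: each bicluster is the outer product of a sparse gene-loading vector and a sparse sample-factor vector with heavy-tailed (Laplacian) entries, plus Gaussian noise. The dimensions, sparsity pattern, and noise level below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sketch of a FABIA-style generative model with one implanted bicluster:
# X = outer(lambda, z) + noise, with sparse, heavy-tailed factors.
n_genes, n_samples = 100, 40
lam = np.zeros(n_genes)
lam[:20] = rng.laplace(size=20)   # bicluster spans the first 20 genes
z = np.zeros(n_samples)
z[:10] = rng.laplace(size=10)     # ...and the first 10 samples
X = np.outer(lam, z) + 0.1 * rng.normal(size=(n_genes, n_samples))
```

In the actual method, several such outer products are summed and the sparse loadings are recovered by factor analysis with sparseness priors; this sketch only shows the assumed data-generating form.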

  3. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, John R.

    1997-01-01

    A method and apparatus for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register.

  4. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, J.R.

    1997-02-11

    A method and apparatus are disclosed for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register. 15 figs.
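The mask-bit scheme in these patent records can be illustrated with a short sketch; the specific mask value and bit layout below are illustrative assumptions, not taken from the patent text. Pre-loading each 32-bit word with the float32 bit pattern of 2**23 means that OR-ing in a raw 14-bit sample yields, when reinterpreted as IEEE-754 float32, exactly 2**23 + sample, so the value is recovered with a single floating-point subtraction and no per-sample integer-to-float conversion.

```python
import numpy as np

# Illustrative mask: the float32 bit pattern of 2**23 (exponent bits only,
# mantissa left clear for the incoming sample).
MASK = np.uint32(0x4B000000)
BIAS = np.float32(8388608.0)  # 2**23

def floats_from_samples(samples):
    """OR prestored mask bits with 14-bit samples, then reinterpret as float32.

    A 14-bit value lands entirely in the mantissa, so each word decodes to
    2**23 + sample; subtracting the bias recovers the sample as a float.
    """
    raw = np.asarray(samples, dtype=np.uint32) & np.uint32(0x3FFF)  # keep 14 bits
    return (MASK | raw).view(np.float32) - BIAS
```

For example, `floats_from_samples([0, 1, 16383])` yields `[0.0, 1.0, 16383.0]`.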

  5. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, and few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination in SWM is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how the various events are related to one another and how they are linked to specific operational problems. The proposed process, which utilizes the logic diagrams (fault trees) widely used in system safety and reliability analyses, was applied to the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the likelihood of occurrence of operational problems, provides advice and suggests solutions.
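The fault-tree logic behind this knowledge acquisition process can be sketched minimally: basic events feed AND/OR gates whose root is an operational problem. The landfill events named below are hypothetical examples for illustration, not events taken from the LOMA system.

```python
# Minimal fault-tree gates: an OR gate fires if any input event occurs,
# an AND gate only if all of them do.
def or_gate(*events):
    return any(events)

def and_gate(*events):
    return all(events)

# Hypothetical top event: leachate overflow occurs if the liner is breached,
# or if heavy rain coincides with a collection-pump failure.
def leachate_overflow(heavy_rain, pump_failure, liner_breach):
    return or_gate(and_gate(heavy_rain, pump_failure), liner_breach)
```

Evaluating the tree for a given combination of basic events then answers whether the operational problem can occur, which is the kind of inference the expert system automates.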

  6. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  7. Developmental Stages in Receptive Grammar Acquisition: A Processability Theory Account

    ERIC Educational Resources Information Center

    Buyl, Aafke; Housen, Alex

    2015-01-01

    This study takes a new look at the topic of developmental stages in the second language (L2) acquisition of morphosyntax by analysing receptive learner data, a language mode that has hitherto received very little attention within this strand of research (for a recent and rare study, see Spinner, 2013). Looking at both the receptive and productive…

  8. Cognitive Skill Acquisition through a Meta-Knowledge Processing Model.

    ERIC Educational Resources Information Center

    McKay, Elspeth

    2002-01-01

    The purpose of this paper is to reopen the discourse on cognitive skill acquisition to focus on the interactive effect of differences in cognitive construct and instructional format. Reports an examination of the contextual issues involved in understanding the interactivity of instructional conditions and cognitive style as a meta-knowledge…

  9. Metadiscursive Processes in the Acquisition of a Second Language.

    ERIC Educational Resources Information Center

    Giacomi, Alain; Vion, Robert

    1986-01-01

    The acquisition of narrative competence in French by an Arabic-speaking migrant worker in interactions with target language speakers was explored, with hypotheses formed about the polyfunctional uses of certain forms to mark the chronology of events in the narrative or to introduce quoted speech. (Author/CB)

  10. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

A general overview of the development of a data acquisition and processing system is presented for a pulsed, 2-micron coherent Doppler lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system and its control software are reviewed, and their requirements and unique features are discussed.

  11. Acquisition and Processing of Multi-Fold GPR Data for Characterization of Shallow Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Bradford, J. H.

    2004-05-01

Most ground-penetrating radar (GPR) data are acquired with a constant transmitter-receiver offset, and investigators often apply little or no processing in generating a subsurface image. This mode of operation can provide useful information, but it does not take full advantage of the information the GPR signal can carry. In continuous multi-offset (CMO) mode, one acquires several traces with varying source-receiver separations at each point along the survey. CMO acquisition is analogous to common-midpoint acquisition in exploration seismology and gives rise to improved subsurface characterization through three key features: 1) processes such as stacking and velocity filtering significantly attenuate coherent and random noise, resulting in subsurface images that are easier to interpret; 2) CMO data enable measurement of vertical and lateral velocity variations, which leads to improved understanding of material distribution and more accurate depth estimates; and 3) CMO data enable observation of reflected-wave behaviour (i.e., variations in amplitude and spectrum) at a common reflection point for various travel paths through the subsurface; quantification of these variations can be a valuable tool in material property characterization. Although there are a few examples in the literature, investigators rarely acquire CMO GPR data. This is in large part because CMO acquisition with a single-channel system is labor intensive and time consuming. At present, no multi-channel GPR systems designed for CMO acquisition are commercially available. Over the past 8 years I have designed, conducted, and processed numerous 2D and 3D CMO GPR surveys using a single-channel GPR system. I have developed field procedures that enable a three-person crew to acquire CMO GPR data at a rate comparable to a similar-scale multi-channel seismic reflection survey. Additionally, many recent advances in signal processing developed in the oil and gas industry have yet to see significant
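The first key feature of CMO acquisition, noise attenuation by stacking, is easy to illustrate: averaging N aligned traces leaves the coherent reflection untouched while reducing random noise by roughly sqrt(N). The trace count, waveform, and noise level below are arbitrary illustrative choices, not survey parameters from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# A common reflection arrival buried in random noise on 32 aligned traces.
signal = np.sin(np.linspace(0, 6 * np.pi, 500))
traces = signal + rng.normal(scale=1.0, size=(32, 500))

# Stacking: the mean over traces preserves the signal and attenuates the noise.
stack = traces.mean(axis=0)
residual_std = (stack - signal).std()  # roughly 1/sqrt(32), vs. 1.0 per trace
```

In a real CMO workflow the traces must first be normal-moveout corrected so the reflection aligns across offsets; this sketch assumes that alignment has already been done.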

  12. Processing strategies and software solutions for data-independent acquisition in mass spectrometry.

    PubMed

    Bilbao, Aivett; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Hopfgartner, Gérard; Müller, Markus; Lisacek, Frédérique

    2015-03-01

    Data-independent acquisition (DIA) offers several advantages over data-dependent acquisition (DDA) schemes for characterizing complex protein digests analyzed by LC-MS/MS. In contrast to the sequential detection, selection, and analysis of individual ions during DDA, DIA systematically parallelizes the fragmentation of all detectable ions within a wide m/z range regardless of intensity, thereby providing broader dynamic range of detected signals, improved reproducibility for identification, better sensitivity, and accuracy for quantification, and, potentially, enhanced proteome coverage. To fully exploit these advantages, composite or multiplexed fragment ion spectra generated by DIA require more elaborate processing algorithms compared to DDA. This review examines different DIA schemes and, in particular, discusses the concepts applied to and related to data processing. Available software implementations for identification and quantification are presented as comprehensively as possible and examples of software usage are cited. Processing workflows, including complete proprietary frameworks or combinations of modules from different open source data processing packages are described and compared in terms of software availability and usability, programming language, operating system support, input/output data formats, as well as the main principles employed in the algorithms used for identification and quantification. This comparative study concludes with further discussion of current limitations and expectable improvements in the short- and midterm future.

  14. Data acquisition and analysis using the IBM Computer System 9000

    SciTech Connect

    Mueller, G.E.

    1985-10-01

A data-acquisition, analysis, and graphing program has been developed on the IBM CS-9000 multitask computer to support the UNM/SNL/GA Thermal-Hydraulic Test Facility. The software has been written in Computer System BASIC, which allows accessing and configuring I/O devices. The CS-9000 has been interfaced with an HP 3497A Data Acquisition/Control Unit and an HP 7470A Graphics Plotter through the IEEE-488 Bus. With this configuration the system is capable of scanning 60 channels of analog thermocouple-compensated input, 20 channels of analog pressure-transducer input, and 16 channels of digital mass-flow-rate input. The CS-9000 graphics, coupled with the HP 7470A, provide useful visualization of changes in measured parameters. 8 refs., 7 figs.

  15. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a CompactPCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial-port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates data acquisition and processing, and displays selected wind-profile parameters such as Doppler shift, power distribution, and wind directions and velocities. The Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from the wind measurements. A general overview of the system, the DAPS, and the coherent Doppler lidar system is presented in this paper.

  16. Research and design of portable photoelectric rotary table data-acquisition and analysis system

    NASA Astrophysics Data System (ADS)

    Yang, Dawei; Yang, Xiufang; Han, Junfeng; Yan, Xiaoxu

    2015-02-01

Photoelectric rotary tables serve as the main tracking-measurement platforms in shooting ranges and aerospace applications. To meet the demands of photoelectric test instruments and equipment in both laboratory and field use, a portable data acquisition and analysis system for photoelectric rotary tables was researched and designed. The hardware design is based on an FPGA from Xilinx's Virtex-4 series together with its peripheral modules; the host-computer software was developed on the VC++ 6.0 programming platform using the MFC class libraries. The system integrates data acquisition, display and storage, debugging control, analysis, laboratory waveform playback, transmission, and fault diagnosis, and offers small size, embeddability, high speed, portability, and simple operation. The system hardware and software were verified using a photoelectric tracking turntable as the experimental object. The experimental results show that the system can acquire, analyze, and process data from photoelectric tracking equipment, controls turntable debugging well, and yields accurate, reliable measurement results with good maintainability and extensibility. This design is of great significance for advancing the debugging, diagnosis, condition monitoring, and fault analysis of photoelectric tracking measurement equipment, as well as for standardizing and normalizing interfaces and improving equipment maintainability; it has both innovative and practical value.

  17. Phases of learning: How skill acquisition impacts cognitive processing.

    PubMed

    Tenison, Caitlin; Fincham, Jon M; Anderson, John R

    2016-06-01

This fMRI study examines the changes in participants' information processing as they repeatedly solve the same mathematical problem. We show that the majority of practice-related speedup is produced by discrete changes in cognitive processing. Because the points at which these changes take place vary from problem to problem, and the underlying information processing steps vary in duration, the existence of such discrete changes can be hard to detect. Using two converging approaches, we establish the existence of three learning phases. When solving a problem in one of these learning phases, participants can go through three cognitive stages: Encoding, Solving, and Responding. Each cognitive stage is associated with a unique brain signature. Using a bottom-up approach combining multi-voxel pattern analysis and hidden semi-Markov modeling, we identify the duration of each stage on any particular trial from participants' brain activation patterns. For our top-down approach, we developed an ACT-R model of these cognitive stages and simulated how they change over the course of learning. The Solving stage of the first learning phase is long and involves a sequence of arithmetic computations. Participants transition to the second learning phase when they can retrieve the answer, thereby drastically reducing the duration of the Solving stage. With continued practice, participants then transition to the third learning phase when they recognize the problem as a single unit and produce the answer as an automatic response. The duration of this third learning phase is dominated by the Responding stage.

  18. A multiple process solution to the logical problem of language acquisition*

    PubMed Central

    MACWHINNEY, BRIAN

    2006-01-01

    Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750

  19. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.
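As a small worked example of the 1/3-octave analysis mentioned above: in the base-2 convention, nominal band center frequencies follow f_c = 1000 * 2**(n/3) Hz. The sketch below simply enumerates the centers within a given bandwidth; the 20 Hz to 100 kHz range is an illustrative choice matching the 100 kHz analysis bandwidth, and the function name is ours, not from the system described.

```python
import math

def third_octave_centers(f_lo, f_hi):
    """Base-2 nominal 1/3-octave center frequencies f_c = 1000 * 2**(n/3) Hz
    lying within [f_lo, f_hi]."""
    n = math.ceil(3 * math.log2(f_lo / 1000.0))  # first band at or above f_lo
    centers = []
    while 1000.0 * 2 ** (n / 3) <= f_hi:
        centers.append(1000.0 * 2 ** (n / 3))
        n += 1
    return centers
```

For instance, `third_octave_centers(20.0, 100_000.0)` includes the 1 kHz reference band and spans 36 bands across the analysis bandwidth.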

  20. A multiple process solution to the logical problem of language acquisition.

    PubMed

    MacWhinney, Brian

    2004-11-01

    Many researchers believe that there is a logical problem at the centre of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load.

  1. Superimposed fringe projection for three-dimensional shape acquisition by image analysis.

    PubMed

    Sasso, Marco; Chiappini, Gianluca; Palmieri, Giacomo; Amodio, Dario

    2009-05-01

The aim of this work is the development of an image analysis technique for 3D shape acquisition based on projected luminous fringes. In more detail, the method is based on the simultaneous use of several projectors, which is desirable whenever the surface under inspection has a complex geometry with undercuts or shadow areas. In such cases, the usual fringe projection technique requires several acquisitions, each time moving the projector or using several projectors alternately. Besides the fringe projection and phase calculation procedure, an unwrapping algorithm has been developed to obtain the continuous phase maps needed in the subsequent shape-extraction calculations. With the technique of simultaneous projections, oriented so as to cover the entire surface, it is possible to increase the speed of the acquisition process and avoid the post-processing problems related to matching different point clouds.
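The unwrapping step described above can be illustrated in one dimension (a minimal sketch; the actual system unwraps 2-D phase maps, and the slope and sampling below are arbitrary illustrative values): phase measured modulo 2*pi is restored to a continuous profile by removing the 2*pi jumps.

```python
import numpy as np

x = np.linspace(0.0, 4.0 * np.pi, 200)
true_phase = 1.5 * x                           # continuous phase along one row
wrapped = np.angle(np.exp(1j * true_phase))    # measured phase, folded into (-pi, pi]
unwrapped = np.unwrap(wrapped)                 # 2*pi jumps removed

# With sample-to-sample phase increments below pi, the continuous profile
# is recovered exactly (up to a constant 2*pi offset, zero in this example).
```

Real fringe data also carry noise and discontinuities, which is why robust 2-D unwrapping algorithms are a research topic in their own right; this sketch only shows the principle.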

  2. Memory acquisition and retrieval impact different epigenetic processes that regulate gene expression

    PubMed Central

    2015-01-01

Background A fundamental question in neuroscience is how memories are stored and retrieved in the brain. Long-term memory formation requires transcription, translation and epigenetic processes that control gene expression. Thus, characterizing genome-wide the transcriptional changes that occur after memory acquisition and retrieval is of broad interest and importance. Genome-wide technologies are commonly used to interrogate transcriptional changes in discovery-based approaches. Their ability to increase scientific insight beyond traditional candidate gene approaches, however, is usually hindered by batch effects and other sources of unwanted variation, which are particularly hard to control in the study of brain and behavior. Results We examined genome-wide gene expression after contextual conditioning in the mouse hippocampus, a brain region essential for learning and memory, at all the time-points in which inhibiting transcription has been shown to impair memory formation. We show that most of the variance in gene expression is not due to conditioning and that by removing unwanted variance through additional normalization we are able to provide novel biological insights. In particular, we show that genes downregulated by memory acquisition and retrieval impact different functions: chromatin assembly and RNA processing, respectively. Levels of histone 2A variant H2AB are reduced only following acquisition, a finding we confirmed using quantitative proteomics. On the other hand, splicing factor Rbfox1 and NMDA receptor-dependent microRNA miR-219 are only downregulated after retrieval, accompanied by an increase in protein levels of miR-219 target CAMKIIγ. Conclusions We provide a thorough characterization of coding and non-coding gene expression during long-term memory formation. We demonstrate that unwanted variance dominates the signal in transcriptional studies of learning and memory and introduce the removal of unwanted variance through normalization as a

  3. The Materials Acquisition Process at the University of Technology, Sydney: Equitable Transparent Allocation of Funds.

    ERIC Educational Resources Information Center

    O'Connor, Steve; Flynn, Ann; Lafferty, Susan

    1998-01-01

    Discusses the development of a library acquisition allocation formula at the University of Technology, Sydney. Covers the items included, the consultative process adopted, and details of the formulae derived and their implementation. (Author)

  4. Data acquisition and processing system of the electron cyclotron emission imaging system of the KSTAR tokamak

    SciTech Connect

    Kim, J. B.; Lee, W.; Yun, G. S.; Park, H. K.; Domier, C. W.; Luhmann, N. C. Jr.

    2010-10-15

    A new electron cyclotron emission imaging (ECEI) diagnostic system for the Korean Superconducting Tokamak Advanced Research (KSTAR) device produces a large amount of data. The design of the data acquisition and processing system of the ECEI diagnostic must therefore accommodate the large data production and flow. The system design is based on a layered structure scalable to future extensions to accommodate increasing data demands. A software architecture that allows web-based monitoring of the operation status, remote experiments, and data analysis is discussed. The operating software will help machine operators and users validate the acquired data promptly, prepare the next discharge, and enhance experiment performance and data analysis in a distributed environment.

  5. Analysis Method for Non-Nominal First Acquisition

    NASA Technical Reports Server (NTRS)

    Sieg, Detlef; Mugellesi-Dow, Roberta

    2007-01-01

    First, this paper describes how the trajectory of a launcher can be modelled for contingency analysis without much information about the launch vehicle itself. From a dense sequence of state vectors, a velocity profile is derived which is sufficiently accurate to enable the Flight Dynamics Team to integrate parts of the launcher trajectory on its own and to simulate contingency cases by modifying the velocity profile. The paper then focuses on the thorough visibility analysis which has to follow the contingency-case or burn-performance simulations. In the ideal case it is possible to identify a ground station which is able to acquire the satellite independently of the burn performance. The correlations between the burn performance and the pointing at subsequent ground stations are derived with the aim of establishing simple guidelines which can be applied quickly and which significantly improve the chance of acquisition at subsequent ground stations. The method is applied to the Soyuz/Fregat launch with the MetOp satellite. Overall, the paper shows that launcher trajectory modelling with the simulation of contingency cases, in connection with a ground station visibility analysis, leads to a proper selection of ground stations and acquisition methods. In the MetOp case this ensured successful contact at all ground stations during the first hour after separation without having to rely on any early orbit determination result or state vector update.

  6. Relationships among process skills development, knowledge acquisition, and gender in microcomputer-based chemistry laboratories

    NASA Astrophysics Data System (ADS)

    Krieger, Carla Repsher

    This study investigated how instruction in MBL environments can be designed to facilitate process skills development and knowledge acquisition among high school chemistry students. Ninety-eight college preparatory chemistry students in six intact classes were randomly assigned to one of three treatment groups: MBL with enhanced instruction in Macroscopic knowledge, MBL with enhanced instruction in Microscopic knowledge, and MBL with enhanced instruction in Symbolic knowledge. Each treatment group completed a total of four MBL titrations involving acids and bases. After the first and third titrations, the Macroscopic, Microscopic and Symbolic groups received enhanced instruction in the Macroscopic, Microscopic and Symbolic modes, respectively. During each titration, participants used audiotapes to record their verbal interactions. The study also explored the effects of three potential covariates (age, mathematics background, and computer usage) on the relationships among the independent variables (type of enhanced instruction and gender) and the dependent variables (science process skills and knowledge acquisition). Process skills were measured via gain scores on a standardized test. Analysis of Covariance eliminated age, mathematics background, and computer usage as covariates in this study. Analysis of Variance identified no significant effects on process skills attributable to treatment or gender. Knowledge acquisition was assessed via protocol analysis of statements made by the participants during the four titrations. Statements were categorized as procedural, observational, conceptual/analytical, or miscellaneous. Statement category percentages were analyzed for trends across treatments, genders, and experiments. Instruction emphasizing the Macroscopic mode may have increased percentages of observational and miscellaneous statements and decreased percentages of procedural and conceptual/analytical statements. 
Instruction emphasizing the Symbolic mode may have

  7. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  8. The acquisition process of musical tonal schema: implications from connectionist modeling

    PubMed Central

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical ‘scale’ sensitivity early and ‘harmony’ sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant, while the acquired tonal schemas vary with the culture-specific music to which listeners are exposed. PMID:26441725

  9. Data acquisition and analysis at the Structural Biology Center

    SciTech Connect

    Westbrook, M.L.; Coleman, T.A.; Daly, R.T.; Pflugrath, J.W.

    1996-12-31

    The Structural Biology Center (SBC), a national user facility for macromolecular crystallography located at Argonne National Laboratory's Advanced Photon Source, is currently being built and commissioned. SBC facilities include a bending-magnet beamline, an insertion-device beamline, laboratory and office space adjacent to the beamlines, and associated instrumentation, experimental apparatus, and facilities. SBC technical facilities will support anomalous dispersion phasing experiments, data collection from microcrystals, data collection from crystals with large molecular structures, and rapid data collection from multiple related crystal structures for protein engineering and drug design. The SBC Computing Systems and Software Engineering Group is tasked with developing the SBC Control System, which includes computing systems, network, and software. The emphasis of SBC Control System development has been to provide efficient and convenient beamline control, data acquisition, and data analysis for maximal facility and experimenter productivity. This paper describes the SBC Control System development, specifically data acquisition and analysis at the SBC, and the development methods used to meet this goal.

  10. Instrumentation for automated acquisition and analysis of TLD glow curves

    NASA Astrophysics Data System (ADS)

    Bostock, I. J.; Kennett, T. J.; Harvey, J. W.

    1991-04-01

    Instrumentation for the automated and complete acquisition of thermoluminescent dosimeter (TLD) data from a Panasonic UD-702E TLD reader is reported. The system that has been developed consists of both hardware and software components and is designed to operate with an IBM-type personal computer. Acquisition of glow curve, timing, and heating data has been integrated with elementary numerical analysis to permit real-time validity and diagnostic assessments to be made. This allows the optimization of critical parameters such as duration of the heating cycles and the time window for the integration of the dosimetry peak. The form of the Li2B4O7:Cu TLD glow curve has been studied and a mathematical representation devised to assist in the implementation of automated analysis. Differences in the shape of the curve can be used to identify dosimetry peaks due to artifacts or to identify failing components. Examples of the use of this system for quality assurance in the TLD monitoring program at McMaster University are presented.
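The abstract does not give the paper's mathematical representation of the Li2B4O7:Cu glow curve; a common starting point for automated glow-curve analysis is the first-order Randall-Wilkins kinetic model, sketched here. The parameter names and values are illustrative, not taken from the paper.

```python
import numpy as np

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def glow_curve(T, n0, s, E, beta):
    """First-order (Randall-Wilkins) glow-peak intensity at temperatures T (K).

    n0: initial trapped-charge population, s: frequency factor (1/s),
    E: trap depth (eV), beta: linear heating rate (K/s).
    A generic kinetic model, not necessarily the paper's representation.
    """
    # Thermally activated escape rate at each temperature
    rate = s * np.exp(-E / (BOLTZMANN_EV * T))
    # Cumulative escape integral, approximated with the trapezoid rule
    integral = np.concatenate(
        ([0.0], np.cumsum((rate[1:] + rate[:-1]) / 2 * np.diff(T)))
    )
    # Intensity = remaining population times escape rate
    return n0 * rate * np.exp(-integral / beta)
```

Fitting such a model to digitized reader output is one way to separate a genuine dosimetry peak from artifact peaks with a different shape.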

  11. Multiwavelength lidar: challenges of data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Duggirala, Ramakrishna Rao; Santhibhavan Vasudevanpillai, Mohankumar; Bhargavan, Presennakumar; Sivarama Pillai, Muraleedharen Nair; Malladi, Satyanarayana

    2006-12-01

    LIDAR operates by transmitting light pulses of a few nanoseconds' width into the atmosphere and receiving signals backscattered from different layers of aerosols and clouds to derive vertical profiles of their physical and optical properties with good spatial resolution. The Data Acquisition System (DAS) of the LIDAR has to handle signals of wide dynamic range (of the order of 5 to 6 decades), and the data have to be sampled at high speed (more than 10 MSPS) to get a spatial resolution of a few metres. This results in a large amount of data collected in a short duration. The ground-based Multiwavelength LIDAR built at the Space Physics Laboratory, Vikram Sarabhai Space Centre, Trivandrum is capable of operating at four wavelengths, namely 1064, 532, 355 and 266 nm, with a PRF of 1 to 20 Hz. The LIDAR has been equipped with a computer-controlled DAS. An Avalanche Photo Diode (APD) detector is used for the detection of the return signal from different layers of the atmosphere in the 1064 nm channel. The signal is continuous in nature and is sampled and digitized at the required spatial resolution in the data acquisition window corresponding to the height region of 0 to 45 km. The return signal, which has a wide dynamic range, is handled by two fast 12-bit A/D converters set to different full-scale voltage ranges and sampling up to 40 MSPS (corresponding to a range resolution of a few metres). The other channels, namely 532, 355 and 266 nm, are detected by Photo Multiplier Tubes (PMTs), which have higher quantum efficiency at these wavelengths. The PMT output can be either continuous or discrete pulses depending upon the region of probing. Thick layers like clouds and dust generate a continuous signal, whereas molecular scattering from the higher altitude regions results in discrete signal pulses. The return signals are digitized using fast A/D converters (up to 40 MSPS) as well as counted using fast photon counters. 
The photon counting channels are capable of counting up to
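Two digitizers set to different full-scale ranges, as described above, can be merged into a single wide-dynamic-range profile by using the sensitive channel where it is unsaturated and the coarse channel elsewhere. The channel names, gain-matching method, and saturation threshold in this sketch are assumptions, not the instrument's actual algorithm.

```python
import numpy as np

def merge_dual_range(low_gain, high_gain, high_gain_saturation):
    """Merge two digitizer channels set to different full-scale ranges.

    Uses the sensitive (high-gain) channel where it is unsaturated and
    falls back to the low-gain channel elsewhere, after matching their
    scales in the region where both are valid.
    """
    usable = high_gain < high_gain_saturation
    # Estimate the relative gain from samples valid in both channels
    overlap = usable & (low_gain > 0)
    gain = np.median(high_gain[overlap] / low_gain[overlap])
    merged = np.where(usable, high_gain, low_gain * gain)
    return merged / gain  # express the result in low-gain channel units
```

A robust gain estimate (here the median ratio) matters in practice because the overlap samples carry noise from both converters.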

  12. Xenbase; core features, data acquisition and data processing

    PubMed Central

    James-Zorn, Christina; Ponferrada, Virgillio G.; Burns, Kevin A.; Fortriede, Joshua D.; Lotay, Vaneet S.; Liu, Yu; Karpinka, J. Brad; Karimi, Kamran; Zorn, Aaron M.; Vize, Peter D.

    2015-01-01

    Xenbase, the Xenopus model organism database (www.xenbase.org), is a cloud-based, web accessible resource that integrates the diverse genomic and biological data from Xenopus research. Xenopus frogs are one of the major vertebrate animal models used for biomedical research, and Xenbase is the central repository for the enormous amount of data generated using this model tetrapod. The goal of Xenbase is to accelerate discovery by enabling investigators to make novel connections between molecular pathways in Xenopus and human disease. Our relational database and user-friendly interface make these data easy to query, and allows investigators to quickly interrogate and link different data types in ways that would otherwise be difficult, time consuming, or impossible. Xenbase also enhances the value of these data through high quality gene expression curation and data integration, by providing bioinformatics tools optimized for Xenopus experiments, and by linking Xenopus data to other model organisms and to human data. Xenbase draws in data via pipelines that download data, parse the content, and save them into appropriate files and database tables. Furthermore, Xenbase makes these data accessible to the broader biomedical community by continually providing annotated data updates to organizations such as NCBI, UniProtKB and Ensembl. Here we describe our bioinformatics, genome-browsing tools, data acquisition and sharing, our community submitted and literature curation pipelines, text-mining support, gene page features and the curation of gene nomenclature and gene models. PMID:26150211

  13. Xenbase: Core features, data acquisition, and data processing.

    PubMed

    James-Zorn, Christina; Ponferrada, Virgillio G; Burns, Kevin A; Fortriede, Joshua D; Lotay, Vaneet S; Liu, Yu; Brad Karpinka, J; Karimi, Kamran; Zorn, Aaron M; Vize, Peter D

    2015-08-01

    Xenbase, the Xenopus model organism database (www.xenbase.org), is a cloud-based, web-accessible resource that integrates the diverse genomic and biological data from Xenopus research. Xenopus frogs are one of the major vertebrate animal models used for biomedical research, and Xenbase is the central repository for the enormous amount of data generated using this model tetrapod. The goal of Xenbase is to accelerate discovery by enabling investigators to make novel connections between molecular pathways in Xenopus and human disease. Our relational database and user-friendly interface make these data easy to query and allows investigators to quickly interrogate and link different data types in ways that would otherwise be difficult, time consuming, or impossible. Xenbase also enhances the value of these data through high-quality gene expression curation and data integration, by providing bioinformatics tools optimized for Xenopus experiments, and by linking Xenopus data to other model organisms and to human data. Xenbase draws in data via pipelines that download data, parse the content, and save them into appropriate files and database tables. Furthermore, Xenbase makes these data accessible to the broader biomedical community by continually providing annotated data updates to organizations such as NCBI, UniProtKB, and Ensembl. Here, we describe our bioinformatics, genome-browsing tools, data acquisition and sharing, our community submitted and literature curation pipelines, text-mining support, gene page features, and the curation of gene nomenclature and gene models.

  15. Learning (Not) to Predict: Grammatical Gender Processing in Second Language Acquisition

    ERIC Educational Resources Information Center

    Hopp, Holger

    2016-01-01

    In two experiments, this article investigates the predictive processing of gender agreement in adult second language (L2) acquisition. We test (1) whether instruction on lexical gender can lead to target predictive agreement processing and (2) how variability in lexical gender representations moderates L2 gender agreement processing. In a…

  16. System safety management lessons learned from the US Army acquisition process

    SciTech Connect

    Piatt, J.A.

    1989-05-01

    The Assistant Secretary of the Army for Research, Development and Acquisition directed the Army Safety Center to provide an audit of the causes of accidents and safety-of-use restrictions on recently fielded systems by tracking residual hazards back through the acquisition process. The objective was to develop "lessons learned" that could be applied to the acquisition process to minimize mishaps in fielded systems. System safety management lessons learned are defined as Army practices or policies, derived from past successes and failures, that are expected to be effective in eliminating or reducing specific systemic causes of residual hazards. They are broadly applicable and supportive of the Army structure and acquisition objectives. Pacific Northwest Laboratory (PNL) was given the task of conducting an independent, objective appraisal of the Army's system safety program in the context of the Army materiel acquisition process by focusing on four fielded systems which are products of that process. These systems included the Apache helicopter, the Bradley Fighting Vehicle (BFV), the Tube-launched, Optically-tracked, Wire-guided (TOW) missile, and the High Mobility Multipurpose Wheeled Vehicle (HMMWV). The objective of this study was to develop system safety management lessons learned associated with the acquisition process. The first step was to identify residual hazards associated with the selected systems. Since it was impossible to track all residual hazards through the acquisition process, certain well-known, high-visibility hazards were selected for detailed tracking. These residual hazards illustrate a variety of systemic problems. Systemic or process causes were identified for each residual hazard and analyzed to determine why they exist. System safety management lessons learned were developed to address related systemic causal factors. 29 refs., 5 figs.

  17. MIRAGE: The data acquisition, analysis, and display system

    NASA Technical Reports Server (NTRS)

    Rosser, Robert S.; Rahman, Hasan H.

    1993-01-01

    Developed for the NASA Johnson Space Center Life Sciences Directorate by GE Government Services, the Microcomputer Integrated Real-time Acquisition Ground Equipment (MIRAGE) system is a portable ground support system for Spacelab life sciences experiments. The MIRAGE system can acquire digital or analog data. Digital data may be NRZ-formatted telemetry packets or packets from a network interface. Analog signals are digitized and stored in experiment packet format. Data packets from any acquisition source are archived to disk as they are received. Meta-parameters are generated from the data packet parameters by applying mathematical and logical operators. Parameters are displayed in text and graphical form or output to analog devices. Experiment data packets may be retransmitted through the network interface. Data stream definitions, experiment parameter formats, parameter displays, and other variables are configured using a spreadsheet database. A database can be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control. The MIRAGE system can be integrated with other workstations to perform a variety of functions. Its generic capabilities, adaptability, and ease of use make MIRAGE a cost-effective solution to many experiment data processing requirements.
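The meta-parameter idea above, deriving new quantities from packet parameters with mathematical and logical operators, might look like the following sketch. The parameter names and formulas are invented for illustration; MIRAGE's actual spreadsheet-database configuration syntax is not described in the abstract.

```python
# Each meta-parameter is a function of named packet parameters,
# combining them with mathematical and logical operators.
META_PARAMETERS = {
    "heart_rate_bpm": lambda p: 60.0 / p["rr_interval_s"],
    "alarm": lambda p: p["temp_c"] > 38.0 or p["pressure_kpa"] < 90.0,
}

def derive(packet):
    """Apply every meta-parameter definition to one decoded data packet."""
    return {name: fn(packet) for name, fn in META_PARAMETERS.items()}
```

Keeping the definitions in a table rather than in code mirrors the spreadsheet-driven configuration the abstract describes: new derived parameters can be added without modifying the acquisition software.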

  18. Design and implementation of photoelectric rotary table data acquisition and analysis system host computer software based on VC++ and MFC

    NASA Astrophysics Data System (ADS)

    Yang, Dawei; Yang, Xiufang; Han, Junfeng; Yan, Xiaoxu

    2015-02-01

    Photoelectric rotary tables are used mainly in the defense industry and military fields, playing an important role in shooting ranges, target tracking, target acquisition, and aerospace applications. To meet the field-test requirements of range photoelectric measuring equipment, and in combination with a portable photoelectric rotary table data acquisition hardware system, host computer software was developed on a VC++ programming platform with an MFC-based user interface to realize data acquisition, analysis, processing, and debugging control for the photoelectric turntable. The host computer software design covers serial communication and its protocol, real-time data acquisition and display, real-time curve plotting, analog acquisition, a debugging guide, and an error analysis program, and the specific design method of each is given. Finally, alignment tests with the photoelectric rotary table data acquisition hardware system show that the host computer software accomplishes data transmission with the lower machine, data acquisition, control, and analysis as intended. The entire software system runs stably and flexibly, with strong practicality, reliability, and good scalability.

  19. Evolutionary analysis of iron (Fe) acquisition system in Marchantia polymorpha.

    PubMed

    Lo, Jing-Chi; Tsednee, Munkhtsetseg; Lo, Ying-Chu; Yang, Shun-Chung; Hu, Jer-Ming; Ishizaki, Kimitsune; Kohchi, Takayuki; Lee, Der-Chuen; Yeh, Kuo-Chen

    2016-07-01

    To acquire appropriate iron (Fe), vascular plants have developed two unique strategies: the reduction-based strategy I of nongraminaceous plants for Fe(2+) and the chelation-based strategy II of graminaceous plants for Fe(3+). However, the mechanism of Fe uptake in bryophytes, the earliest diverging branch of land plants and dominant in the gametophyte generation, is less clear. Fe isotope fractionation analysis demonstrated that the liverwort Marchantia polymorpha uses reduction-based Fe acquisition. Enhanced activities of ferric chelate reductase and proton ATPase were detected under Fe-deficient conditions. However, M. polymorpha did not show mugineic acid family phytosiderophores, the key components of strategy II, or the precursor nicotianamine. Five ZIP (ZRT/IRT-like protein) homologs were identified and speculated to be involved in Fe uptake in M. polymorpha. MpZIP3 knockdown conferred reduced growth under Fe-deficient conditions, and MpZIP3 overexpression increased Fe content under excess Fe. Thus, a nonvascular liverwort, M. polymorpha, uses strategy I for Fe acquisition. This system may have been acquired in the common ancestor of land plants and co-opted from the gametophyte to the sporophyte generation in the evolution of land plants. PMID:26948158

  20. Cognitive processes during fear acquisition and extinction in animals and humans

    PubMed Central

    Hofmann, Stefan G.

    2007-01-01

    Anxiety disorders are highly prevalent. Fear conditioning and extinction learning in animals often serve as simple models of fear acquisition and exposure therapy of anxiety disorders in humans. This article reviews the empirical and theoretical literature on cognitive processes in fear acquisition, extinction, and exposure therapy. It is concluded that exposure therapy is a form of cognitive intervention that specifically changes the expectancy of harm. Implications for therapy research are discussed. PMID:17532105

  1. Lock Acquisition and Sensitivity Analysis of Advanced LIGO Interferometers

    NASA Astrophysics Data System (ADS)

    Martynov, Denis

    The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz - 5 kHz. Direct detection of space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe. The initial phase of LIGO started in 2002, and since then data were collected during six science runs. Instrument sensitivity improved from run to run due to the efforts of the commissioning team. Initial LIGO reached its design sensitivity during the last science run, which ended in October 2010. In parallel with commissioning and data analysis with the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014. This thesis describes results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. It also discusses new techniques and tools developed at the 40m prototype, including adaptive filtering, estimation of quantization noise in digital filters, and design of isolation kits for ground seismometers. The first part of this thesis is devoted to methods for bringing the interferometer into the linear regime where collection of data becomes possible. States of longitudinal and angular controls of interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail. Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysics data that must be calibrated to units of meters or strain. The second part of this thesis describes the online calibration technique set up in both observatories to monitor the quality of the collected data in

  2. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

    Jackson, G.P.; Elliott, A.

    1992-01-01

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  4. Information Processing, Knowledge Acquisition and Learning: Developmental Perspectives.

    ERIC Educational Resources Information Center

    Hoyer, W. J.

    1980-01-01

    Several different conceptions of the relationship between learning and development are considered in this article. It is argued that dialectical and ecological developmental orientations might provide a useful basis for synthesizing the contrasting frameworks of the operant, information processing, learning theory, and knowledge acquisition…

  5. Semantic Context and Graphic Processing in the Acquisition of Reading.

    ERIC Educational Resources Information Center

    Thompson, G. B.

    1981-01-01

    Two experiments provided tests of predictions about children's use of semantic contextual information in reading, under conditions of minimal experience with graphic processes. Subjects, aged 6 1/2, 8, and 11, orally read passages of continuous text with normal and with low semantic constraints under various graphic conditions, including cursive…

  6. Executive and Phonological Processes in Second-Language Acquisition

    ERIC Educational Resources Information Center

    Engel de Abreu, Pascale M. J.; Gathercole, Susan E.

    2012-01-01

    This article reports a latent variable study exploring the specific links among executive processes of working memory, phonological short-term memory, phonological awareness, and proficiency in first (L1), second (L2), and third (L3) languages in 8- to 9-year-olds experiencing multilingual education. Children completed multiple L1-measures of…

  7. An Information-Processing Approach to Skill Acquisition: Movement Organization.

    ERIC Educational Resources Information Center

    Spaeth-Arnold, Ree K.

    This document examines how subjects accomplish the task of matching movements to the characteristics of the performance environment. Literature pertaining to this topic is reviewed from the areas of time/motion study of industrial tasks, timing of coincidence-anticipation tasks, and cinematographic analysis of motor skill performance. A research…

  8. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  9. Data acquisition and online processing requirements for experimentation at the Superconducting Super Collider

    SciTech Connect

    Lankford, A.J.; Barsotti, E.; Gaines, I.

    1989-07-01

    Differences in scale between data acquisition and online processing requirements for detectors at the Superconducting Super Collider and systems for existing large detectors will require new architectures and technological advances in these systems. Emerging technologies will be employed for data transfer, processing, and recording. 9 refs., 3 figs.

  10. A dual process account of coarticulation in motor skill acquisition.

    PubMed

    Shah, Ashvin; Barto, Andrew G; Fagg, Andrew H

    2013-01-01

    Many tasks, such as typing a password, are decomposed into a sequence of subtasks that can be accomplished in many ways. Behavior that accomplishes subtasks in ways that are influenced by the overall task is often described as "skilled" and exhibits coarticulation. Many accounts of coarticulation use search methods that are informed by representations of the objectives that define skilled behavior. While they aid in describing the strategies the nervous system may follow, they are computationally complex and may be difficult to attribute to brain structures. Here, the authors present a biologically-inspired account whereby skilled behavior is developed through two simple processes: (a) a corrective process that ensures that each subtask is accomplished, but does not do so skillfully, and (b) a reinforcement learning process that finds better movements using trial-and-error search that is not informed by representations of any objectives. We implement our account as a computational model controlling a simulated two-armed kinematic "robot" that must hit a sequence of goals with its hands. Behavior displays coarticulation in terms of which hand was chosen, how the corresponding arm was used, and how the other arm was used, suggesting that the account can participate in the development of skilled behavior. PMID:24116847

  11. Is Children's Acquisition of the Passive a Staged Process? Evidence from Six- and Nine-Year-Olds' Production of Passives

    ERIC Educational Resources Information Center

    Messenger, Katherine; Branigan, Holly P.; McLean, Janet F.

    2012-01-01

    We report a syntactic priming experiment that examined whether children's acquisition of the passive is a staged process, with acquisition of constituent structure preceding acquisition of thematic role mappings. Six-year-olds and nine-year-olds described transitive actions after hearing active and passive prime descriptions involving the same or…

  12. Multi-channel high-speed CMOS image acquisition and pre-processing system

    NASA Astrophysics Data System (ADS)

    Sun, Chun-feng; Yuan, Feng; Ding, Zhen-liang

    2008-10-01

    A new multi-channel high-speed CMOS image acquisition and pre-processing system is designed to realize image acquisition, data transmission, time-sequential control, and simple image processing with a high-speed CMOS image sensor. The modular structure, LVDS, and ping-pong cache techniques used in the image data acquisition sub-system ensure real-time data acquisition and transmission. Furthermore, a new histogram equalization algorithm with an adaptive threshold, based on the reassignment of redundant gray levels, is incorporated in the image pre-processing module of the FPGA. An iterative method is used to set the threshold value, and redundant gray levels are redistributed in proportion to the occupied gray-level interval. Over-enhancement of the background is restrained and the loss of foreground detail through mergence is reduced. Experiments show that the system can acquire, transmit, store, and pre-process image data at 590 MPixels/s, and supports the design and realization of subsequent systems.
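    The redundant-gray-level idea described above can be sketched as a clip-and-redistribute histogram equalization. The clipping fraction, iteration count, and redistribution rule below are illustrative assumptions, not the authors' exact FPGA algorithm.

```python
import numpy as np

def clipped_hist_equalize(img, clip_frac=0.01, n_iter=4):
    """Histogram equalization with an iteratively set clipping threshold.

    Counts above the threshold are treated as 'redundant' and
    redistributed over the occupied gray levels, which limits
    over-enhancement of large uniform backgrounds.
    """
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    thresh = clip_frac * hist.sum()
    for _ in range(n_iter):                       # iterative threshold setting
        excess = np.clip(hist - thresh, 0, None).sum()
        hist = np.minimum(hist, thresh)           # clip redundant counts
        occupied = hist > 0
        hist[occupied] += excess / occupied.sum() # redistribute redundancy
        thresh = clip_frac * hist.sum()
    cdf = np.cumsum(hist) / hist.sum()
    lut = np.round(255 * cdf).astype(np.uint8)    # gray-level mapping table
    return lut[img]
```

    Compared with plain equalization, the clipped cumulative histogram flattens more gently, so a dominant background peak cannot consume most of the output gray range.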

  13. Pulsed laser noise analysis and pump-probe signal detection with a data acquisition card.

    PubMed

    Werley, Christopher A; Teo, Stephanie M; Nelson, Keith A

    2011-12-01

    A photodiode and data acquisition card whose sampling clock is synchronized to the repetition rate of a laser are used to measure the energy of each laser pulse. Simple analysis of the data yields the noise spectrum from very low frequencies up to half the repetition rate and quantifies the pulse energy distribution. When two photodiodes for balanced detection are used in combination with an optical modulator, the technique is capable of detecting very weak pump-probe signals (ΔI/I₀ ≈ 10⁻⁵ at 1 kHz), with a sensitivity that is competitive with a lock-in amplifier. Detection with the data acquisition card is versatile and offers many advantages including full quantification of noise during each stage of signal processing, arbitrary digital filtering in silico after data collection is complete, direct readout of percent signal modulation, and easy adaptation for fast scanning of delay between pump and probe.
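    The measurement scheme above, one digitized sample per pulse clocked at the laser repetition rate, can be sketched in NumPy: the spectrum of the per-pulse energy series extends to half the repetition rate. The windowing and normalization choices here are assumptions for illustration, not the paper's exact analysis.

```python
import numpy as np

def pulse_noise_spectrum(energies, rep_rate_hz):
    """One-sided power spectrum of pulse-to-pulse energy fluctuations.

    `energies` holds one sample per laser pulse, so the effective
    sampling clock is the repetition rate and the spectrum runs
    from ~0 Hz up to rep_rate_hz / 2.
    """
    e = np.asarray(energies, dtype=float)
    rel = e / e.mean() - 1.0                  # fractional fluctuation per pulse
    n = len(rel)
    spec = np.fft.rfft(rel * np.hanning(n))   # window to reduce leakage
    psd = (np.abs(spec) ** 2) / n             # power per bin (arbitrary scale)
    freqs = np.fft.rfftfreq(n, d=1.0 / rep_rate_hz)
    return freqs, psd
```

    A narrow line in this spectrum (e.g. at the modulator frequency) corresponds to the pump-probe signal, while the surrounding floor quantifies the laser noise that balanced detection must suppress.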

  14. Sensor Data Acquisition and Processing Parameters for Human Activity Classification

    PubMed Central

    Bersch, Sebastian D.; Azzi, Djamel; Khusainov, Rinat; Achumba, Ifeyinwa E.; Ries, Jana

    2014-01-01

    It is known that parameter selection for data sampling frequency and segmentation techniques (including different methods and window sizes) has an impact on the classification accuracy. For Ambient Assisted Living (AAL), no clear guidance for selecting these parameters exists, hence a wide variety of choices and inconsistency across today's literature is observed. This paper presents an empirical investigation of different data sampling rates, segmentation techniques, and segmentation window sizes and their effect on the accuracy of Activity of Daily Living (ADL) event classification and on computational load for two different accelerometer sensor datasets. The study is conducted using an ANalysis Of VAriance (ANOVA) based on 32 different window sizes, three different segmentation algorithms (with and without overlap, totaling six different parameter sets) and six sampling frequencies for nine common classification algorithms. The classification accuracy is based on a feature vector consisting of Root Mean Square (RMS), Mean, Signal Magnitude Area (SMA), Signal Vector Magnitude (here SMV), Energy, Entropy, FFTPeak, and Standard Deviation (STD). The results are presented alongside recommendations for parameter selection on the basis of the best-performing parameter combinations that are identified by means of the corresponding Pareto curve. PMID:24599189
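    A minimal sketch of the feature vector named above, computed over one segmentation window of tri-axial accelerometer data. The exact definitions the study used may differ; these are common formulations, and the choice of computing most features on the per-sample vector magnitude is an assumption.

```python
import numpy as np

def window_features(acc):
    """Common time/frequency features for one window of tri-axial
    accelerometer data with shape (n_samples, 3)."""
    acc = np.asarray(acc, dtype=float)
    svm = np.linalg.norm(acc, axis=1)                # signal vector magnitude per sample
    spec = np.abs(np.fft.rfft(svm - svm.mean()))
    power = spec ** 2
    if power.sum() > 0:
        p = power / power.sum()                      # normalized spectral distribution
    else:
        p = np.full_like(power, 1.0 / len(power))    # flat spectrum for constant input
    return {
        "RMS": float(np.sqrt(np.mean(svm ** 2))),
        "Mean": float(acc.mean()),                   # mean over all samples and axes
        "SMA": float(np.mean(np.sum(np.abs(acc), axis=1))),  # signal magnitude area
        "SMV": float(svm.mean()),
        "Energy": float(power.sum() / len(power)),
        "Entropy": float(-np.sum(p * np.log2(p + 1e-12))),   # spectral entropy
        "FFTPeak": int(np.argmax(spec)),             # dominant frequency bin
        "STD": float(svm.std()),
    }
```

    Sweeping window size and sampling rate then amounts to recomputing this vector per window and feeding it to each classifier under test.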

  15. Monitoring of HTS compound library quality via a high-resolution image acquisition and processing instrument.

    PubMed

    Baillargeon, Pierre; Scampavia, Louis; Einsteder, Ross; Hodder, Peter

    2011-06-01

    This report presents the high-resolution image acquisition and processing instrument for compound management applications (HIAPI-CM). The HIAPI-CM combines imaging spectroscopy and machine-vision analysis to perform rapid assessment of high-throughput screening (HTS) compound library quality. It has been customized to detect and classify typical artifacts found in HTS compound library microtiter plates (MTPs). These artifacts include (1) insufficient volume of liquid compound sample, (2) compound precipitation, and (3) colored compounds that interfere with HTS assay detection format readout. The HIAPI-CM is also configured to automatically query and compare its analysis results to data stored in a LIMS or corporate database, aiding in the detection of compound registration errors. To demonstrate its capabilities, several compound plates (n=5760 wells total) containing different artifacts were measured via automated HIAPI-CM analysis, and the results were compared with those obtained by manual (visual) inspection. In all cases, the instrument demonstrated high fidelity (99.8% empty wells; 100.1% filled wells; 94.4% for partially filled wells; 94.0% for wells containing colored compounds), and in the case of precipitate detection, the HIAPI-CM results significantly exceeded the fidelity of visual observations (220.0%). As described, the HIAPI-CM allows for noninvasive, nondestructive MTP assessment with a diagnostic throughput of about 1 min per plate, reducing analytical expenses and improving the quality and stewardship of HTS compound libraries.

  16. A Future Vision of a Data Acquisition: Distributed Sensing, Processing, and Health Monitoring

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Solano, Wanda; Thurman, Charles; Schmalzel, John

    2000-01-01

    This paper presents a vision of a highly enhanced data acquisition and health monitoring system at the NASA Stennis Space Center (SSC) rocket engine test facility. This vision includes the use of advanced processing capabilities in conjunction with highly autonomous distributed sensing and intelligence, to monitor and evaluate the health of data in the context of its associated process. This method is expected to significantly reduce data acquisition costs and improve system reliability. A Universal Signal Conditioning Amplifier (USCA) based system, under development at Kennedy Space Center, is being evaluated for adaptation to the SSC testing infrastructure. Kennedy's USCA architecture offers many advantages, including flexible and auto-configuring data acquisition with improved calibration and verifiability. Possible enhancements at SSC may include multiplexing the distributed USCAs to reduce per-channel cost, and the use of IEEE-485 to Allen-Bradley Control Net Gateways for interfacing with the resident control systems.

  17. The electron spectroscopy for chemical analysis microscopy beamline data acquisition system at ELETTRA

    NASA Astrophysics Data System (ADS)

    Gariazzo, C.; Krempaska, R.; Morrison, G. R.

    1996-07-01

    The electron spectroscopy for chemical analysis (ESCA) microscopy data acquisition system enables the user to control the imaging and spectroscopy modes of operation of the ESCA microscopy beamline at ELETTRA. It allows the user to integrate all experiment, beamline, and machine operations in a single environment. The system also provides simple data analysis for both spectral and image data to guide further data acquisition.

  18. Learning and Individual Differences: An Ability/Information-Processing Framework for Skill Acquisition. Final Report.

    ERIC Educational Resources Information Center

    Ackerman, Phillip L.

    A program of theoretical and empirical research focusing on the ability determinants of individual differences in skill acquisition is reviewed. An integrative framework for information-processing and cognitive ability determinants of skills is reviewed, along with principles for ability-skill relations. Experimental manipulations were used to…

  19. The Priority of Listening Comprehension over Speaking in the Language Acquisition Process

    ERIC Educational Resources Information Center

    Xu, Fang

    2011-01-01

    By elaborating on the definition of listening comprehension, the characteristics of spoken discourse, the relationship between STM and LTM, and Krashen's comprehensible input, the paper argues that giving listening comprehension priority over speaking in the language acquisition process is necessary.

  20. Processes of Language Acquisition in Children with Autism: Evidence from Preferential Looking

    ERIC Educational Resources Information Center

    Swensen, Lauren D.; Kelley, Elizabeth; Fein, Deborah; Naigles, Letitia R.

    2007-01-01

    Two language acquisition processes (comprehension preceding production of word order, the noun bias) were examined in 2- and 3-year-old children (n=10) with autistic spectrum disorder and in typically developing 21-month-olds (n=13). Intermodal preferential looking was used to assess comprehension of subject-verb-object word order and the tendency…

  1. Individual Variation in Infant Speech Processing: Implications for Language Acquisition Theories

    ERIC Educational Resources Information Center

    Cristia, Alejandrina

    2009-01-01

    To what extent does language acquisition recruit domain-general processing mechanisms? In this dissertation, evidence concerning this question is garnered from the study of individual differences in infant speech perception and their predictive value with respect to language development in early childhood. In the first experiment, variation in the…

  2. An Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    NASA Astrophysics Data System (ADS)

    Beegle, L. W.; Anderson, R. C.; Hurowitz, J. A.; Jandura, L.; Limonadi, D.

    2012-12-01

    The Mars Science Laboratory Mission (MSL) landed on Mars on August 5, 2012. The rover and a scientific payload are designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem is the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed and separated into fine particles and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, the Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g. the Mast Cameras (MastCam) and the Chemistry and Micro-Imaging instruments (ChemCam)). It is expected that the SA/SPaH system will have produced a scooped sample and possibly a drilled sample in the first 90 sols of the mission. Results from these activities and the ongoing testing program will be presented.

  3. Transient Decline in Hippocampal Theta Activity during the Acquisition Process of the Negative Patterning Task

    PubMed Central

    Sakimoto, Yuya; Okada, Kana; Takeda, Kozue; Sakata, Shogo

    2013-01-01

    Hippocampal function is important in the acquisition of negative patterning but not of simple discrimination. This study examined rat hippocampal theta activity during the acquisition stages (early, middle, and late) of the negative patterning task (A+, B+, AB-). The results showed that hippocampal theta activity began to decline transiently (for 500 ms after non-reinforced stimulus presentation) during the late stage of learning in the negative patterning task. In addition, this transient decline in hippocampal theta activity in the late stage was lower in the negative patterning task than in the simple discrimination task. This transient decline during the late stage of task acquisition may be related to a learning process distinctive of the negative patterning task but not the simple discrimination task. We propose that the transient decline of hippocampal theta activity reflects inhibitory learning and/or response inhibition after the presentation of a compound stimulus specific to the negative patterning task. PMID:23936249

  4. Optimizing Federal Fleet Vehicle Acquisitions: An Eleven-Agency FY 2012 Analysis

    SciTech Connect

    Singer, M.; Daley, R.

    2015-02-01

    This report focuses on the National Renewable Energy Laboratory's (NREL) fiscal year (FY) 2012 effort that used the NREL Optimal Vehicle Acquisition (NOVA) analysis to identify optimal vehicle acquisition recommendations for eleven diverse federal agencies. Results of the study show that by following a vehicle acquisition plan that maximizes the reduction in greenhouse gas (GHG) emissions, significant progress is also made toward the mandated complementary goals of acquiring alternative fuel vehicles, petroleum use reduction, and alternative fuel use increase.

  5. Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process

    NASA Technical Reports Server (NTRS)

    Macko, Joseph A., Jr.

    2000-01-01

    The United States Army Materiel Command, Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. They have established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide the decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources, and prioritizes pollution prevention (P2) efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as an alternative to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) during the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process to evaluate new chemicals and materials prior to acceptance as an alternative. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC missions.

  6. Performance of a VME-based parallel processing LIDAR data acquisition system (summary)

    SciTech Connect

    Moore, K.; Buttler, B.; Caffrey, M.; Soriano, C.

    1995-05-01

    It may be possible to make accurate real-time, autonomous, 2- and 3-dimensional wind measurements remotely with an elastic backscatter Light Detection and Ranging (LIDAR) system by incorporating digital parallel processing hardware into the data acquisition system. In this paper, we report the performance of a commercially available digital parallel processing system in implementing the maximum correlation technique for wind sensing using actual LIDAR data. Timing and numerical accuracy are benchmarked against a standard microprocessor implementation.
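    The maximum correlation technique benchmarked above can be illustrated in one dimension: the lag that maximizes the cross-correlation between two successive backscatter profiles gives the aerosol displacement between scans, hence a speed. The function below is a simplified single-component sketch under that assumption, not the paper's parallel implementation.

```python
import numpy as np

def wind_from_correlation(scan_a, scan_b, dt_s, range_gate_m):
    """Estimate drift speed along the beam from two successive
    backscatter profiles separated by dt_s seconds."""
    a = scan_a - np.mean(scan_a)
    b = scan_b - np.mean(scan_b)
    corr = np.correlate(b, a, mode="full")
    lag = np.argmax(corr) - (len(a) - 1)  # displacement in range gates
    return lag * range_gate_m / dt_s      # speed in m/s
```

    A full 2-D or 3-D retrieval repeats this correlation search over many sub-regions of consecutive scans, which is the part that parallel processing hardware accelerates.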

  7. Chemical process hazards analysis

    SciTech Connect

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  8. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and their second, for a total of 10 months of instruction. The study aimed to characterize modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps right hemispheric recruitment was observed, with increasing left-lateralization, which is similar to other native signers and L2 learners of spoken language; however, specialization for sign language processing with activation in the inferior parietal lobule (i.e., angular gyrus), even for late learners, was observed. As such, the present study is the first to track L2 acquisition of sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing. PMID:26720258

  10. Image analysis and data-acquisition techniques for infrared and CCD cameras for ATF

    NASA Astrophysics Data System (ADS)

    Young, K. G.; Hillis, D. L.

    1988-08-01

    A multipurpose image processing system has been developed for the Advanced Toroidal Facility (ATF) stellarator experiment. This system makes it possible to investigate the complicated topology inherent in stellarator plasmas with conventional video technology. Infrared (IR) and charge-coupled device (CCD) cameras, operated at the standard video framing rate, are used on ATF to measure heat flux patterns to the vacuum vessel wall and visible-light emission from the ionized plasma. These video cameras are coupled with fast acquisition and display systems, developed for a MicroVAX-II, which allow between-shot observation of the dynamic temperature and spatial extent of the plasma generated by ATF. The IR camera system provides acquisition of one frame of 60×80 eight-bit pixels every 16.7 ms via storage in a CAMAC module. The CCD data acquisition proceeds automatically, storing the video frames until its 12-bit, 1-Mbyte CAMAC memory is filled. After analysis, transformation, and compression, selected portions of the data are stored on disk. Interactive display of experimental data and theoretical calculations is performed with software written in Interactive Data Language.

  11. Micro-MRI-based image acquisition and processing system for assessing the response to therapeutic intervention

    NASA Astrophysics Data System (ADS)

    Vasilić, B.; Ladinsky, G. A.; Saha, P. K.; Wehrli, F. W.

    2006-03-01

    Osteoporosis is the cause of over 1.5 million bone fractures annually. Most of these fractures occur in sites rich in trabecular bone, a complex network of bony struts and plates found throughout the skeleton. The three-dimensional structure of the trabecular bone network significantly determines mechanical strength and thus fracture resistance. Here we present a data acquisition and processing system that allows efficient noninvasive assessment of trabecular bone structure through a "virtual bone biopsy". High-resolution MR images are acquired from which the trabecular bone network is extracted by estimating the partial bone occupancy of each voxel. A heuristic voxel subdivision increases the effective resolution of the bone volume fraction map and serves as a basis for subsequent analysis of topological and orientational parameters. Semi-automated registration and segmentation ensure selection of the same anatomical location in subjects imaged at different time points during treatment. It is shown, with excerpts from an ongoing clinical study of early post-menopausal women, that significant reduction in network connectivity occurs in the control group while structural integrity is maintained in the hormone replacement group. The system described should be suited for large-scale studies designed to evaluate the efficacy of therapeutic intervention in subjects with metabolic bone disease.

  12. Meteoceanographic premises for structural design purposes in the Adriatic Sea: Acquisition and processing of data

    SciTech Connect

    Rampolli, M.; Biancardi, A.; Filippi, G. De

    1996-12-31

    In 1993 the leading international standards (ISO, API RP 2A) for the design of offshore structures drastically changed the procedure for the definition of hydrodynamic forces. In particular, oil companies are required to have a detailed knowledge of the weather of the areas where they operate if they want to maintain the previous results. Alternatively, more conservative hydrodynamic forces must be considered in the design phase. Such an increase, estimated at 20-30% of the total hydrodynamic force, means heavier platform structures in new projects, and more critical elements to be inspected in existing platforms. In 1992, in order to provide more reliable and safe transport to and from the platforms, Agip installed a meteo-marine sensor network in the Adriatic Sea, on 13 of the over 80 producing platforms. Data collected are sent to shore via radio, and operators can use real-time data or 12-hour wave forecasts obtained by a statistical forecasting model. Taking advantage of these existing instruments, a project was undertaken in 1993 with the purpose of determining the extreme environmental parameters to be used by structural engineers. The network has been upgraded in order to achieve directional information on the waves and to permit short-term analysis. This paper describes the data acquisition system, the data processing, and the achieved results.

  13. Professional identity acquisition process model in interprofessional education using structural equation modelling: 10-year initiative survey.

    PubMed

    Kururi, Nana; Tozato, Fusae; Lee, Bumsuk; Kazama, Hiroko; Katsuyama, Shiori; Takahashi, Maiko; Abe, Yumiko; Matsui, Hiroki; Tokita, Yoshiharu; Saitoh, Takayuki; Kanaizumi, Shiomi; Makino, Takatoshi; Shinozaki, Hiromitsu; Yamaji, Takehiko; Watanabe, Hideomi

    2016-01-01

    The mandatory interprofessional education (IPE) programme at Gunma University, Japan, was initiated in 1999. A questionnaire of 10 items to assess the students' understanding of the IPE training programme has been distributed since then, and the factor analysis of the responses revealed that it was categorised into four subscales, i.e. "professional identity", "structure and function of training facilities", "teamwork and collaboration", and "role and responsibilities", and suggested that these may take into account the development of IPE programme with clinical training. The purpose of this study was to examine the professional identity acquisition process (PIAP) model in IPE using structural equation modelling (SEM). Overall, 1,581 respondents of a possible 1,809 students from the departments of nursing, laboratory sciences, physical therapy, and occupational therapy completed the questionnaire. The SEM technique was utilised to construct a PIAP model on the relationships among four factors. The original PIAP model showed that "professional identity" was predicted by two factors, namely "role and responsibilities" and "teamwork and collaboration". These two factors were predicted by the factor "structure and function of training facilities". The same structure was observed in nursing and physical therapy students' PIAP models, but it was not completely the same in laboratory sciences and occupational therapy students' PIAP models. A parallel but not isolated curriculum on expertise unique to the profession, which may help to understand their professional identity in combination with learning the collaboration, may be necessary. PMID:26930464

  14. Professional identity acquisition process model in interprofessional education using structural equation modelling: 10-year initiative survey.

    PubMed

    Kururi, Nana; Tozato, Fusae; Lee, Bumsuk; Kazama, Hiroko; Katsuyama, Shiori; Takahashi, Maiko; Abe, Yumiko; Matsui, Hiroki; Tokita, Yoshiharu; Saitoh, Takayuki; Kanaizumi, Shiomi; Makino, Takatoshi; Shinozaki, Hiromitsu; Yamaji, Takehiko; Watanabe, Hideomi

    2016-01-01

    The mandatory interprofessional education (IPE) programme at Gunma University, Japan, was initiated in 1999. A 10-item questionnaire to assess students' understanding of the IPE training programme has been distributed since then, and factor analysis of the responses revealed four subscales, i.e. "professional identity", "structure and function of training facilities", "teamwork and collaboration", and "role and responsibilities", suggesting that these may reflect the development of the IPE programme with clinical training. The purpose of this study was to examine a professional identity acquisition process (PIAP) model in IPE using structural equation modelling (SEM). Overall, 1,581 of a possible 1,809 students from the departments of nursing, laboratory sciences, physical therapy, and occupational therapy completed the questionnaire. The SEM technique was utilised to construct a PIAP model of the relationships among the four factors. The original PIAP model showed that "professional identity" was predicted by two factors, namely "role and responsibilities" and "teamwork and collaboration"; these two factors were in turn predicted by the factor "structure and function of training facilities". The same structure was observed in the nursing and physical therapy students' PIAP models, but not entirely in the laboratory sciences and occupational therapy students' PIAP models. A parallel but not isolated curriculum on expertise unique to each profession, which may help students understand their professional identity in combination with learning collaboration, may be necessary.

  15. [Software development of multi-element transient signal acquisition and processing with multi-channel ICP-AES].

    PubMed

    Zhang, Y; Zhuang, Z; Wang, X; Zhu, E; Liu, J

    2000-02-01

    Software for multi-element transient signal acquisition and processing with multi-channel ICP-AES was developed in this paper. It has been successfully applied to signal acquisition and processing in many transient-introduction techniques on-line hyphenated with multi-channel ICP-AES.

  16. A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer

    NASA Astrophysics Data System (ADS)

    Luckman, Adrian J.; Allinson, Nigel M.

    1989-03-01

    A low-cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a Reduced Instruction Set Computer (RISC) architecture, the ARM (Acorn Risc Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system comprises real-time video digitising hardware which interfaces directly to the Archimedes memory, and software providing an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line, with programmable parameters such as sampling rate and gain. Software support includes a work environment for image capture and processing with pixel, neighbourhood and global operators. A friendly user interface is provided with the help of the Archimedes Operating System WIMP (Windows, Icons, Mouse and Pointer) Manager. Windows provide a convenient way of handling images on the screen, and program control is directed mostly by pop-up menus.
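    The neighbourhood operators mentioned above act on each pixel together with its immediate surroundings. As a minimal illustrative sketch (not the Archimedes software itself), a 3×3 mean filter might look like:

    ```python
    def mean_filter_3x3(image):
        """Apply a 3x3 neighbourhood mean filter; border pixels are left unchanged."""
        h, w = len(image), len(image[0])
        out = [row[:] for row in image]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                acc = sum(image[y + dy][x + dx]
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1))
                out[y][x] = acc // 9  # integer average of the 9 neighbours
        return out

    img = [[0, 0, 0, 0],
           [0, 9, 9, 0],
           [0, 9, 9, 0],
           [0, 0, 0, 0]]
    smoothed = mean_filter_3x3(img)
    print(smoothed[1][1])  # → 4 (average of 36 over the 9-pixel window)
    ```

    A pixel operator would instead map each pixel independently (e.g. gain and offset), and a global operator would use statistics of the whole image.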

  17. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants

    PubMed Central

    Navarro, Pedro J.; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-01-01

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions and trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation. PMID:27164103
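    As a rough illustration of the classification step (a generic sketch over toy pixel features, not the authors' implementation), a minimal k-nearest-neighbour classifier could be written as:

    ```python
    from collections import Counter

    def knn_predict(train, query, k=3):
        """Classify `query` by majority vote among the k nearest training samples.
        `train` is a list of (feature_vector, label) pairs; distance is Euclidean."""
        dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        nearest = sorted(train, key=lambda s: dist(s[0], query))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    # Hypothetical two-class problem standing in for plant/background pixel features.
    train = [((0.1, 0.2), "background"), ((0.2, 0.1), "background"),
             ((0.9, 0.8), "plant"), ((0.8, 0.9), "plant"), ((0.85, 0.85), "plant")]
    print(knn_predict(train, (0.9, 0.9)))  # → plant
    ```

    An SVM or Naive Bayes classifier would plug into the same train/predict interface, which is what makes the side-by-side comparison in the paper straightforward.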

  18. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants.

    PubMed

    Navarro, Pedro J; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-05-05

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions and trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation.

  19. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants.

    PubMed

    Navarro, Pedro J; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-01-01

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions and trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation. PMID:27164103

  20. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

    A pulsed 2-micron coherent Doppler lidar system from NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the Doppler Aerosol Wind Lidar (DAWN) Air project. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software that was employed during the GRIP campaign is introduced, along with its unique features.

  1. How to crack nuts: acquisition process in captive chimpanzees (Pan troglodytes) observing a model.

    PubMed

    Hirata, Satoshi; Morimura, Naruki; Houki, Chiharu

    2009-10-01

    Stone tool use for nut cracking consists of placing a hard-shelled nut onto a stone anvil and then cracking the shell open by pounding it with a stone hammer to get to the kernel. We investigated the acquisition of tool use for nut cracking in a group of captive chimpanzees to clarify what kind of understanding of the tools and actions leads to the acquisition of this type of tool use in the presence of a skilled model. A human experimenter trained a male chimpanzee until he mastered the use of a hammer and anvil stone to crack open macadamia nuts. He was then put in a nut-cracking situation together with his group mates, who were naïve to this tool use; we did not have a control group without a model. The results showed that the process of acquisition could be broken down into several steps, including recognition of applying pressure to the nut, emergence of the use of a combination of three objects, emergence of the hitting action, use of a tool for hitting, and hitting the nut. The chimpanzees recognized these different components separately and practiced them one after another. They gradually united these factors in their behavior, leading to their first success. Their behavior did not clearly improve immediately after observing successful nut cracking by a peer, but observation of a skilled group member seemed to have a gradual, long-term influence on the acquisition of nut cracking by naïve chimpanzees.

  2. Real-time digital design for an optical coherence tomography acquisition and processing system

    NASA Astrophysics Data System (ADS)

    Ralston, Tyler S.; Mayen, Jose A.; Marks, Dan L.; Boppart, Stephen A.

    2004-07-01

    We present a real-time, multi-dimensional, digital, optical coherence tomography (OCT) acquisition and imaging system. The system consists of conventional OCT optics, a rapid scanning optical delay (RSOD) line to support fast data acquisition rates, and a high-speed A/D converter for sampling the interference waveforms. A 1M-gate Virtex-II field programmable gate array (FPGA) is designed to perform digital down conversion. This is analogous to demodulating and low-pass filtering the continuous time signal. The system creates in-phase and quadrature-phase components using a tunable quadrature mixer. Multistage polyphase finite impulse response (FIR) filtering and down sampling is used to remove unneeded high frequencies. A floating-point digital signal processor (DSP) computes the magnitude and phase shifts. The data is read by a host machine and displayed on screen at real-time rates commensurate with the data acquisition rate. This system offers flexible acquisition and processing parameters for a wide range of multi-dimensional optical microscopy techniques.
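    The digital down-conversion described above (quadrature mixing to baseband followed by low-pass FIR filtering, then magnitude/phase extraction) can be sketched in software. This is a simplified stand-in for the FPGA design: a plain moving-average filter replaces the multistage polyphase FIR, and all names and parameters are illustrative.

    ```python
    import math

    def digital_down_convert(samples, fs, f_mix, taps):
        """Mix a real signal with a complex oscillator at f_mix (producing
        in-phase and quadrature components), then low-pass filter with a
        moving-average FIR to keep only the baseband term."""
        mixed = [s * complex(math.cos(-2 * math.pi * f_mix * n / fs),
                             math.sin(-2 * math.pi * f_mix * n / fs))
                 for n, s in enumerate(samples)]
        out = []
        for n in range(len(mixed)):
            window = mixed[max(0, n - taps + 1):n + 1]
            out.append(sum(window) / len(window))  # moving-average low-pass
        return out

    fs, f0 = 1000.0, 100.0
    sig = [math.cos(2 * math.pi * f0 * n / fs) for n in range(200)]
    base = digital_down_convert(sig, fs, f0, taps=20)
    mag = abs(base[-1])  # ≈ 0.5 for a unit cosine mixed at its own frequency
    phase = math.atan2(base[-1].imag, base[-1].real)
    ```

    Mixing a unit cosine at f₀ down by f₀ yields a DC term of 0.5 plus a component at 2f₀; the low-pass filter removes the latter, which is why the recovered magnitude is 0.5.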

  3. Summary of the activities of the subgroup on data acquisition and processing

    SciTech Connect

    Connolly, P.L.; Doughty, D.C.; Elias, J.E.

    1981-01-01

    A data acquisition and handling subgroup consisting of approximately 20 members met during the 1981 ISABELLE summer study. Discussions were led by members of the BNL ISABELLE Data Acquisition Group (DAG), with lively participation from outside users. Particularly large contributions were made by representatives of BNL experiments 734, 735, and the MPS, as well as the Fermilab Colliding Detector Facility and the SLAC LASS Facility. In contrast to the 1978 study, the subgroup did not divide its activities into investigations of various individual detectors, but instead attempted to review the current state of the art in the data acquisition, trigger processing, and data handling fields. A series of meetings first reviewed individual pieces of the problem, including the status of the Fastbus Project, the Nevis trigger processor, the SLAC 168/E and 3081/E emulators, and efforts within DAG. Additional meetings dealt with questions involved in specifying and building complete data acquisition systems. For any given problem, a series of possible solutions was proposed by the members of the subgroup. In general, any given solution had both advantages and disadvantages, and there was never any consensus on which approach was best. However, there was agreement that certain problems could only be handled by systems of a given power or greater. What is given here is a review of the various solutions with their associated powers, costs, advantages, and disadvantages.

  4. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering the fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  5. Process to process communication over Fastbus in the data acquisition system of the ALEPH TPC

    SciTech Connect

    Lusiani, A. (Division PPE; Scuola Normale Superiore, Pisa)

    1994-02-01

    The data acquisition system of the ALEPH TPC includes a VAX/VMS computer cluster and 36 intelligent Fastbus modules (ALEPH TPPs) running the OS9 multitasking real-time operating system. Dedicated software has been written to reliably exchange information over Fastbus between the VAX/VMS cluster and the 36 TPPs, to initialize and co-ordinate the microprocessors, and to monitor and debug their operation. The functionality and performance of this software are presented, together with an overview of the applications that rely on it.

  6. APNEA list mode data acquisition and real-time event processing

    SciTech Connect

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) List Mode recordings of all detector and timing signals, timestamped to 3-microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
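    The post-processing of List Mode data into time bins following repetitive triggers can be sketched as follows; the bin width, bin count, and timestamps here are illustrative, not the APNEA system's actual parameters.

    ```python
    def accumulate_events(event_times, trigger_times, bin_width, n_bins):
        """Histogram list-mode event timestamps into fixed-width time bins
        following each trigger, emulating real-time Event Accumulations."""
        counts = [0] * n_bins
        for trig in trigger_times:
            for t in event_times:
                dt = t - trig
                if 0 <= dt < bin_width * n_bins:
                    counts[int(dt // bin_width)] += 1
        return counts

    # Timestamps in microseconds (the real system records to 3 us resolution).
    events = [3, 9, 15, 33, 60, 120]
    triggers = [0, 100]
    print(accumulate_events(events, triggers, bin_width=30, n_bins=3))  # → [4, 1, 1]
    ```

    Because the raw timestamps are kept, the same data set can be re-binned with different widths, which is exactly the flexibility the abstract attributes to List Mode post-processing.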

  7. Phonological processing in deaf signers and the impact of age of first language acquisition.

    PubMed

    MacSweeney, Mairéad; Waters, Dafydd; Brammer, Michael J; Woll, Bencie; Goswami, Usha

    2008-04-15

    Just as words can rhyme, the signs of a signed language can share structural properties, such as location. Linguistic description at this level is termed phonology. We report that a left-lateralised fronto-parietal network is engaged during phonological similarity judgements made in both English (rhyme) and British Sign Language (BSL; location). Since these languages operate in different modalities, these data suggest that the neural network supporting phonological processing is, to some extent, supramodal. Activation within this network was however modulated by language (BSL/English), hearing status (deaf/hearing), and age of BSL acquisition (native/non-native). The influence of language and hearing status suggests an important role for the posterior portion of the left inferior frontal gyrus in speech-based phonological processing in deaf people. This, we suggest, is due to increased reliance on the articulatory component of speech when the auditory component is absent. With regard to age of first language acquisition, non-native signers activated the left inferior frontal gyrus more than native signers during the BSL task, and also during the task performed in English, which both groups acquired late. This is the first neuroimaging demonstration that age of first language acquisition has implications not only for the neural systems supporting the first language, but also for networks supporting languages learned subsequently.

  8. Automated system for acquisition and image processing for the control and monitoring of boned nopal

    NASA Astrophysics Data System (ADS)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for image acquisition and processing to control the removal of thorns from the nopal vegetable (Opuntia ficus indica) in an automated machine that uses pulses from an Nd:YAG laser. The areolas, the areas where thorns grow on the bark of the nopal, are located by applying segmentation algorithms to the images obtained by a CCD. Once the position of the areolas is known, their coordinates are sent to a motor system that steers the laser to interact with all areolas and remove the thorns from the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs tasks of acquisition, preprocessing, segmentation, recognition, and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
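    A minimal sketch of such a segmentation step, locating bright regions and emitting a coordinate table, might look like the following. The threshold-plus-flood-fill approach and all values are assumptions for illustration, not the paper's actual algorithm.

    ```python
    def segment_regions(image, threshold):
        """Label connected regions brighter than `threshold` (4-connectivity,
        iterative flood fill) and return a table of region centroids (row, col)."""
        h, w = len(image), len(image[0])
        seen = [[False] * w for _ in range(h)]
        table = []
        for y in range(h):
            for x in range(w):
                if image[y][x] > threshold and not seen[y][x]:
                    stack, pixels = [(y, x)], []
                    seen[y][x] = True
                    while stack:
                        cy, cx = stack.pop()
                        pixels.append((cy, cx))
                        for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                       (cy, cx - 1), (cy, cx + 1)):
                            if (0 <= ny < h and 0 <= nx < w
                                    and image[ny][nx] > threshold
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    ry = sum(p[0] for p in pixels) / len(pixels)
                    rx = sum(p[1] for p in pixels) / len(pixels)
                    table.append((ry, rx))
        return table

    img = [[0, 0, 0, 0, 0],
           [0, 200, 200, 0, 0],
           [0, 200, 200, 0, 180],
           [0, 0, 0, 0, 0]]
    print(segment_regions(img, threshold=100))  # → [(1.5, 1.5), (2.0, 4.0)]
    ```

    The resulting centroid table is exactly the kind of coordinate list that would be handed to a galvo controller to aim the laser at each region in turn.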

  9. Autonomous Closed-Loop Tasking, Acquisition, Processing, and Evaluation for Situational Awareness Feedback

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Dan; Cappelaere, Pat

    2016-01-01

    This presentation describes the closed-loop satellite autonomy methods used to connect users and the assets on Earth Orbiter-1 (EO-1) and similar satellites. The base layer is a distributed architecture based on the Goddard Mission Services Evolution Concept (GMSEC), so each asset remains under independent control. Situational awareness is provided by a middleware layer through a common Application Programmer Interface (API) to GMSEC components developed at GSFC. Users set up their own tasking requests and receive views into immediate past acquisitions in their area of interest and into future feasibilities for acquisition across all assets. Automated notifications via pub/sub feeds are returned to users containing published links to image footprints, algorithm results, and full data sets. Theme-based algorithms are available on demand for processing.

  10. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  11. Users' perceptions of the impact of electronic aids to daily living throughout the acquisition process.

    PubMed

    Ripat, Jacquie; Strock, Anne

    2004-01-01

    This study investigated the experience of seven new users of a particular type of assistive technology through the stages of anticipating, acquiring, and using an electronic aid to daily living. A mixed-methods research approach was used to explore each of these stages. The Psychosocial Impact of Assistive Devices Scale was used to measure the perceived impact of the new assistive technology on users' quality of life, and findings were further explored and developed through open-ended questioning of the participants. Results indicated that, prior to acquisition of the device, users predicted that the electronic aid to daily living would have a positive impact on their feelings of competence and confidence and that the device would enable them in a positive way. One month after acquiring the device, a reduced, yet still positive, impact was observed. By 3 and 6 months after acquisition, perceived impact had returned to the same high positive level as before acquisition. It is suggested that prior to receiving the device, potential users have positive expectations for the device that are not based in experience. Early in the acquisition period, users adjust their expectations of the role of the assistive technology in their lives and strive to balance expectations with reality. Three to 6 months after acquiring an electronic aid to daily living, the participants had a highly positive view of how the device impacted their lives, based in experience and reality. A model illustrating the electronic aids to daily living acquisition process is proposed, and suggestions for future study are provided.

  12. Big Data Analysis of Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
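    A self-learning assistance system of the kind described might, at its simplest, learn the normal range of a process variable by observation and then flag deviations automatically. The following sketch is a generic illustration (not the authors' system), using Welford's online mean/variance update:

    ```python
    import math

    class AnomalyDetector:
        """Learn the normal range of a process variable from observed samples,
        then flag values more than `k` standard deviations from the mean."""
        def __init__(self, k=3.0):
            self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

        def observe(self, x):
            # Welford's online update of running mean and sum of squared deviations.
            self.n += 1
            d = x - self.mean
            self.mean += d / self.n
            self.m2 += d * (x - self.mean)

        def is_anomaly(self, x):
            std = math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else 0.0
            return std > 0 and abs(x - self.mean) > self.k * std

    det = AnomalyDetector()
    for v in [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.3, 10.1]:  # "normal" operation
        det.observe(v)
    print(det.is_anomaly(15.0), det.is_anomaly(10.0))  # → True False
    ```

    Real assistance systems of this kind would model many correlated variables and process phases, but the principle of learning "normal" from observation and reporting deviations is the same.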

  13. How to crack nuts: acquisition process in captive chimpanzees (Pan troglodytes) observing a model.

    PubMed

    Hirata, Satoshi; Morimura, Naruki; Houki, Chiharu

    2009-10-01

    Stone tool use for nut cracking consists of placing a hard-shelled nut onto a stone anvil and then cracking the shell open by pounding it with a stone hammer to get to the kernel. We investigated the acquisition of tool use for nut cracking in a group of captive chimpanzees to clarify what kind of understanding of the tools and actions leads to the acquisition of this type of tool use in the presence of a skilled model. A human experimenter trained a male chimpanzee until he mastered the use of a hammer and anvil stone to crack open macadamia nuts. He was then put in a nut-cracking situation together with his group mates, who were naïve to this tool use; we did not have a control group without a model. The results showed that the process of acquisition could be broken down into several steps, including recognition of applying pressure to the nut, emergence of the use of a combination of three objects, emergence of the hitting action, use of a tool for hitting, and hitting the nut. The chimpanzees recognized these different components separately and practiced them one after another. They gradually united these factors in their behavior, leading to their first success. Their behavior did not clearly improve immediately after observing successful nut cracking by a peer, but observation of a skilled group member seemed to have a gradual, long-term influence on the acquisition of nut cracking by naïve chimpanzees. PMID:19727866

  14. IECON '87: Signal acquisition and processing; Proceedings of the 1987 International Conference on Industrial Electronics, Control, and Instrumentation, Cambridge, MA, Nov. 3, 4, 1987

    NASA Astrophysics Data System (ADS)

    Niederjohn, Russell J.

    1987-01-01

    Theoretical and applications aspects of signal processing are examined in reviews and reports. Topics discussed include speech processing methods, algorithms, and architectures; signal-processing applications in motor and power control; digital signal processing; signal acquisition and analysis; and processing algorithms and applications. Consideration is given to digital coding of speech algorithms, an algorithm for continuous-time processes in discrete-time measurement, quantization noise and filtering schemes for digital control systems, distributed data acquisition for biomechanics research, a microcomputer-based differential distance and velocity measurement system, velocity observations from discrete position encoders, a real-time hardware image preprocessor, and recognition of partially occluded objects by a knowledge-based system.

  15. Maintenance Process Strategic Analysis

    NASA Astrophysics Data System (ADS)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increased costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, stakeholder management and life-cycle assessment. From a practical point of view, this requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repair and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  16. Signal Processing, Analysis, & Display

    1986-06-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data, such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals, including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulse train, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options, including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
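    Two of the operations listed, window functions and the Fourier transform, can be illustrated with a minimal sketch; the Hamming window and direct DFT below are generic stand-ins, not SIG's implementations:

    ```python
    import cmath
    import math

    def hamming(n_samples):
        """Hamming window coefficients (reduces spectral leakage)."""
        return [0.54 - 0.46 * math.cos(2 * math.pi * i / (n_samples - 1))
                for i in range(n_samples)]

    def dft_magnitude(signal):
        """Magnitude spectrum via a direct DFT (first half of the bins)."""
        n = len(signal)
        return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) / n
                for k in range(n // 2)]

    n = 64
    sine = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]  # 8 cycles
    windowed = [s * w for s, w in zip(sine, hamming(n))]
    spectrum = dft_magnitude(windowed)
    peak_bin = max(range(len(spectrum)), key=spectrum.__getitem__)
    print(peak_bin)  # → 8 (the sine's energy concentrates in bin 8)
    ```

    Windowing before the transform is what tools like SIG do implicitly when a finite record of a longer signal is analyzed; without it, the truncation smears energy across neighbouring bins.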

  17. Problem solving in nursing practice: application, process, skill acquisition and measurement.

    PubMed

    Roberts, J D; While, A E; Fitzpatrick, J M

    1993-06-01

    This paper analyses the role of problem solving in nursing practice including the process, acquisition and measurement of problem-solving skills. It is argued that while problem-solving ability is acknowledged as critical if today's nurse practitioner is to maintain effective clinical practice, to date it retains a marginal place in nurse education curricula. Further, it has attracted limited empirical study. Such an omission, it is argued, requires urgent redress if the nursing profession is to meet effectively the challenges of the next decade and beyond.

  18. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    Engh, G.J. van den; Stokdijk, W.

    1992-09-22

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate. 17 figs.
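    The FIFO merge and ID check described in the patent abstract can be modelled in software; the sketch below is an illustrative simulation of the data flow, not the patented hardware, and all names and values are invented for the example.

    ```python
    from collections import deque

    def merge_event(fifos, expected_id):
        """Move the oldest entry from each channel FIFO onto a common 'bus'
        record, checking that every entry carries the same trigger-generated
        event ID (the error-detection step)."""
        entries = [fifo.popleft() for fifo in fifos]
        if {event_id for event_id, _ in entries} != {expected_id}:
            raise ValueError("event ID mismatch: channels out of sync")
        return [value for _, value in entries]

    # Three parallel channels with two digitised pulse events each, tagged by ID.
    fifos = [deque([(0, 1.2), (1, 0.7)]),
             deque([(0, 3.4), (1, 0.9)]),
             deque([(0, 0.5), (1, 2.2)])]
    print(merge_event(fifos, 0))  # → [1.2, 3.4, 0.5]
    print(merge_event(fifos, 1))  # → [0.7, 0.9, 2.2]
    ```

    Buffering each channel independently and merging by event ID is what lets the real system digitize at high speed while still detecting a channel that has dropped or duplicated a pulse.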

  19. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    van den Engh, Gerrit J.; Stokdijk, Willem

    1992-01-01

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate.

  20. Analysis of patient movement during 3D USCT data acquisition

    NASA Astrophysics Data System (ADS)

    Ruiter, N. V.; Hopp, T.; Zapf, M.; Kretzek, E.; Gemmeke, H.

    2016-04-01

    In our first clinical study with a full 3D Ultrasound Computer Tomography (USCT) system patient data was acquired in eight minutes for one breast. In this paper the patient movement during the acquisition was analyzed quantitatively and as far as possible corrected in the resulting images. The movement was tracked in ten successive reflectivity reconstructions of full breast volumes acquired during 10 s intervals at different aperture positions, which were separated by 41 s intervals. The mean distance between initial and final position was 2.2 mm (standard deviation (STD) +/- 0.9 mm, max. 4.1 mm, min. 0.8 mm) and the average sum of all moved distances was 4.9 mm (STD +/- 1.9 mm, max. 8.8 mm, min. 2.7 mm). The tracked movement was corrected by summing successive images, which were transformed according to the detected movement. The contrast of these images increased and additional image content became visible.
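
    The correction step described above (transform each successive volume by its tracked displacement, then sum) can be sketched as follows. This is an illustrative rigid, integer-shift version, not the authors' registration code:

```python
import numpy as np

def correct_and_sum(volumes, displacements):
    """Sum successive reconstructions after undoing the tracked motion.

    volumes:       list of equally-shaped arrays from successive acquisitions
    displacements: per-volume integer shifts (dy, dx) tracked against frame 0
    """
    ref = np.zeros_like(volumes[0], dtype=float)
    for vol, (dy, dx) in zip(volumes, displacements):
        # shift each volume back by its detected displacement before summing
        ref += np.roll(np.roll(vol, -dy, axis=0), -dx, axis=1)
    return ref / len(volumes)

# toy example: a bright feature that drifts one pixel per frame
frames, shifts = [], []
for i in range(3):
    f = np.zeros((5, 5))
    f[2 + i, 2] = 1.0          # feature moves down one row per frame
    frames.append(f)
    shifts.append((i, 0))      # tracked displacement relative to frame 0
avg = correct_and_sum(frames, shifts)
```

    With the shifts undone, the summed feature stays concentrated in one voxel instead of being smeared along the motion path, which is why the contrast of the combined images increases.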

  1. Liquid crystal materials and structures for image processing and 3D shape acquisition

    NASA Astrophysics Data System (ADS)

    Garbat, K.; Garbat, P.; Jaroszewicz, L.

    2012-03-01

    Image processing supported by liquid crystal devices has been used in numerous imaging applications, including polarization imaging, digital holography and programmable imaging. Liquid crystals have been extensively studied and are widely used in display and optical processing technology. We present here the main relevant parameters of liquid crystals for image processing and 3D shape acquisition, and we compare the main liquid crystal options that can be used, with their respective advantages. We compare the performance of several types of liquid crystal materials: nematic mixtures with high and medium optical and dielectric anisotropies and relatively low rotational viscosities, which may operate in TN mode in mono- and dual-frequency addressing systems.

  2. An Integrated Data Acquisition / User Request/ Processing / Delivery System for Airborne Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Chapman, B.; Chu, A.; Tung, W.

    2003-12-01

    Airborne science data has historically played an important role in developing the scientific underpinnings for spaceborne missions. When the science community determines the need for new types of spaceborne measurements, airborne campaigns are often crucial in risk mitigation for these future missions. However, full exploitation of the acquired data may be difficult due to its experimental and transitory nature. Outside the project, the most problematic issue (particularly for those not involved in requesting the data acquisitions) may be the difficulty of searching for, requesting, and receiving the data, or even knowing the data exist. This can result in a rather small, insular community of users for these data sets. Internally, the difficulty for the project is in maintaining a robust processing and archival system during periods of changing mission priorities and evolving technologies. The NASA/JPL Airborne Synthetic Aperture Radar (AIRSAR) has acquired data for a large and varied community of scientists and engineers for 15 years. AIRSAR is presently supporting current NASA Earth Science Enterprise experiments, such as the Soil Moisture EXperiment (SMEX) and the Cold Land Processes experiment (CLPX), as well as experiments conducted as many as 10 years ago. During that time, its processing, data ordering, and data delivery system has undergone evolutionary change as the cost and capability of resources have improved. AIRSAR now has a fully integrated data acquisition/user request/processing/delivery system through which most components of the data fulfillment process communicate via shared information within a database. The integration of these functions has reduced errors and increased throughput of processed data to customers.

  3. A Flexible Software Platform for Fast-Scan Cyclic Voltammetry Data Acquisition and Analysis

    PubMed Central

    Bucher, Elizabeth S.; Brooks, Kenneth; Verber, Matthew D.; Keithley, Richard B.; Owesson-White, Catarina; Carroll, Susan; Takmakov, Pavel; McKinney, Collin J.; Wightman, R. Mark

    2013-01-01

    Over the last several decades, fast-scan cyclic voltammetry (FSCV) has proved to be a valuable analytical tool for the real-time measurement of neurotransmitter dynamics in vitro and in vivo. Indeed, FSCV has found application in a wide variety of disciplines including electrochemistry, neurobiology and behavioral psychology. The maturation of FSCV as an in vivo technique led users to pose increasingly complex questions that require a more sophisticated experimental design. To accommodate recent and future advances in FSCV application, our lab has developed High Definition Cyclic Voltammetry (HDCV). HDCV is an electrochemical software suite that includes data acquisition and analysis programs. The data collection program delivers greater experimental flexibility and better user feedback through live displays. It supports experiments involving multiple electrodes with customized waveforms. It is compatible with TTL-based systems that are used for monitoring animal behavior and it enables simultaneous recording of electrochemical and electrophysiological data. HDCV analysis streamlines data processing with superior filtering options, seamlessly manages behavioral events, and integrates chemometric processing. Furthermore, analysis is capable of handling single files collected over extended periods of time, allowing the user to consider biological events on both sub-second and multi-minute time scales. Here we describe and demonstrate the utility of HDCV for in vivo experiments. PMID:24083898
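
    The abstract does not detail HDCV's internal algorithms. As a hedged illustration of the kind of background subtraction that FSCV analysis routinely performs (subtracting the average of pre-stimulation scans from the whole colour plot to isolate faradaic current changes), with all names and data invented:

```python
import numpy as np

def background_subtract(color_plot, bg_start=0, bg_len=10):
    """Subtract the mean of a pre-stimulation window of scans from every scan.

    color_plot: 2-D array, shape (n_scans, n_points_per_scan)
    """
    bg = color_plot[bg_start:bg_start + bg_len].mean(axis=0)
    return color_plot - bg

# toy colour plot: constant background with a transient signal appearing later
plot = np.ones((20, 5))
plot[12:16] += 0.5            # simulated current change on scans 12-15
delta = background_subtract(plot, bg_start=0, bg_len=10)
```

    Averaging several background scans rather than using a single one reduces the noise carried into every subtracted scan.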

  4. Flexible software platform for fast-scan cyclic voltammetry data acquisition and analysis.

    PubMed

    Bucher, Elizabeth S; Brooks, Kenneth; Verber, Matthew D; Keithley, Richard B; Owesson-White, Catarina; Carroll, Susan; Takmakov, Pavel; McKinney, Collin J; Wightman, R Mark

    2013-11-01

    Over the last several decades, fast-scan cyclic voltammetry (FSCV) has proved to be a valuable analytical tool for the real-time measurement of neurotransmitter dynamics in vitro and in vivo. Indeed, FSCV has found application in a wide variety of disciplines including electrochemistry, neurobiology, and behavioral psychology. The maturation of FSCV as an in vivo technique led users to pose increasingly complex questions that require a more sophisticated experimental design. To accommodate recent and future advances in FSCV application, our lab has developed High Definition Cyclic Voltammetry (HDCV). HDCV is an electrochemical software suite that includes data acquisition and analysis programs. The data collection program delivers greater experimental flexibility and better user feedback through live displays. It supports experiments involving multiple electrodes with customized waveforms. It is compatible with transistor-transistor logic-based systems that are used for monitoring animal behavior, and it enables simultaneous recording of electrochemical and electrophysiological data. HDCV analysis streamlines data processing with superior filtering options, seamlessly manages behavioral events, and integrates chemometric processing. Furthermore, analysis is capable of handling single files collected over extended periods of time, allowing the user to consider biological events on both subsecond and multiminute time scales. Here we describe and demonstrate the utility of HDCV for in vivo experiments. PMID:24083898

  5. Chemical Sensing in Process Analysis.

    ERIC Educational Resources Information Center

    Hirschfeld, T.; And Others

    1984-01-01

    Discusses: (1) rationale for chemical sensors in process analysis; (2) existing types of process chemical sensors; (3) sensor limitations, considering lessons of chemometrics; (4) trends in process control sensors; and (5) future prospects. (JN)

  6. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling subsystem for the Mars Science Laboratory is a highly-mechanized, Rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the Rover. SA/SPaH utilizes 17 actuated degrees-of-freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so some functionality is retained even with the loss of a degree-of-freedom. Intentional dynamic environments are created to move sample while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally-created dynamic environment and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.

  7. Visual Skills and Chinese Reading Acquisition: A Meta-Analysis of Correlation Evidence

    ERIC Educational Resources Information Center

    Yang, Ling-Yan; Guo, Jian-Peng; Richman, Lynn C.; Schmidt, Frank L.; Gerken, Kathryn C.; Ding, Yi

    2013-01-01

    This paper used meta-analysis to synthesize the relation between visual skills and Chinese reading acquisition based on the empirical results from 34 studies published from 1991 to 2011. We obtained 234 correlation coefficients from 64 independent samples, with a total of 5,395 participants. The meta-analysis revealed that visual skills as a…

  8. Space science technology: In-situ science. Sample Acquisition, Analysis, and Preservation Project summary

    NASA Technical Reports Server (NTRS)

    Aaron, Kim

    1991-01-01

    The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.

  9. A high speed data acquisition and analysis system for transonic velocity, density, and total temperature fluctuations

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1988-01-01

    The high speed Dynamic Data Acquisition System (DDAS) is described which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in the context of the wind tunnel environment. The DDAS replaces both a recording mechanism and a separate data processing system. The data acquisition and data reduction process has been combined within DDAS. DDAS receives input from hot wires and anemometers, amplifies and filters the signals with computer controlled modules, and converts the analog signals to digital with real-time simultaneous digitization followed by digital recording on disk or tape. Automatic acquisition (either from a computer link to an existing wind tunnel acquisition system, or from data acquisition facilities within DDAS) collects necessary calibration and environment data. The generation of hot wire sensitivities is done in DDAS, as is the application of sensitivities to the hot wire data to generate turbulence quantities. The presentation of the raw and processed data, in terms of root mean square values of velocity, density and temperature, and the processing of the spectral data is accomplished on demand in near-real-time with DDAS. A comprehensive description of the interface to the DDAS and of the internal mechanisms will be presented. A summary of operations relevant to the use of the DDAS will be provided.

  10. A review of breast tomosynthesis. Part I. The image acquisition process

    SciTech Connect

    Sechopoulos, Ioannis

    2013-01-15

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have been made possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to replace mammography for breast cancer screening in the future. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper reviews the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process.

  11. Squeezing through the Now-or-Never bottleneck: Reconnecting language processing, acquisition, change, and structure.

    PubMed

    Chater, Nick; Christiansen, Morten H

    2016-01-01

    If human language must be squeezed through a narrow cognitive bottleneck, what are the implications for language processing, acquisition, change, and structure? In our target article, we suggested that the implications are far-reaching and form the basis of an integrated account of many apparently unconnected aspects of language and language processing, as well as suggesting revision of many existing theoretical accounts. With some exceptions, commentators were generally supportive both of the existence of the bottleneck and its potential implications. Many commentators suggested additional theoretical and linguistic nuances and extensions, links with prior work, and relevant computational and neuroscientific considerations; some argued for related but distinct viewpoints; a few, though, felt traditional perspectives were being abandoned too readily. Our response attempts to build on the many suggestions raised by the commentators and to engage constructively with challenges to our approach.

  12. Squeezing through the Now-or-Never bottleneck: Reconnecting language processing, acquisition, change, and structure.

    PubMed

    Chater, Nick; Christiansen, Morten H

    2016-01-01

    If human language must be squeezed through a narrow cognitive bottleneck, what are the implications for language processing, acquisition, change, and structure? In our target article, we suggested that the implications are far-reaching and form the basis of an integrated account of many apparently unconnected aspects of language and language processing, as well as suggesting revision of many existing theoretical accounts. With some exceptions, commentators were generally supportive both of the existence of the bottleneck and its potential implications. Many commentators suggested additional theoretical and linguistic nuances and extensions, links with prior work, and relevant computational and neuroscientific considerations; some argued for related but distinct viewpoints; a few, though, felt traditional perspectives were being abandoned too readily. Our response attempts to build on the many suggestions raised by the commentators and to engage constructively with challenges to our approach. PMID:27561252

  13. Acquisition of material properties in production for sheet metal forming processes

    SciTech Connect

    Heingärtner, Jörg; Hora, Pavel; Neumann, Anja; Hortig, Dirk; Rencki, Yasar

    2013-12-16

    In past work a measurement system for the in-line acquisition of material properties was developed at IVP. This system is based on the non-destructive eddy-current principle. Using this system, a 100% control of material properties of the processed material is possible. The system can be used for ferromagnetic materials like standard steels as well as paramagnetic materials like Aluminum and stainless steel. Used as an in-line measurement system, it can be configured as a stand-alone system to control material properties and sort out inapplicable material or as part of a control system of the forming process. In both cases, the acquired data can be used as input data for numerical simulations, e.g. stochastic simulations based on real world data.

  14. Real-time multi-camera video acquisition and processing platform for ADAS

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio

    2016-04-01

    The paper presents the design of a real-time and low-cost embedded system for image acquisition and processing in Advanced Driver Assisted Systems (ADAS). The system adopts a multi-camera architecture to provide a panoramic view of the objects surrounding the vehicle. Fish-eye lenses are used to achieve a large Field of View (FOV). Since they introduce radial distortion of the images projected on the sensors, a real-time algorithm for their correction is also implemented in a pre-processor. An FPGA-based hardware implementation, re-using IP macrocells for several ADAS algorithms, allows for real-time processing of input streams from VGA automotive CMOS cameras.
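
    The radial-distortion correction mentioned above is typically implemented by inverse mapping: each output pixel is traced back through the lens model to a source pixel in the distorted image. A minimal nearest-neighbour sketch using a one-parameter radial model (illustrative only; the paper's FPGA pre-processor is not described in this abstract):

```python
import numpy as np

def undistort(img, k1):
    """Correct simple radial distortion by inverse mapping: each output
    pixel samples the input at r_d = r_u * (1 + k1 * r_u**2), with radii
    normalized to the image half-diagonal."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    ru = np.hypot(ys - cy, xs - cx) / np.hypot(cy, cx)
    scale = 1.0 + k1 * ru ** 2                      # distortion factor per pixel
    src_y = np.clip(np.round(cy + (ys - cy) * scale), 0, h - 1).astype(int)
    src_x = np.clip(np.round(cx + (xs - cx) * scale), 0, w - 1).astype(int)
    return img[src_y, src_x]                        # nearest-neighbour lookup

frame = np.arange(81, dtype=float).reshape(9, 9)
same = undistort(frame, k1=0.0)                     # k1 = 0 is the identity map
```

    Because the mapping is fixed for a given lens, a hardware implementation can precompute the (src_y, src_x) lookup table once and stream pixels through it in real time.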

  15. The Influence of Working Memory and Phonological Processing on English Language Learner Children's Bilingual Reading and Language Acquisition

    ERIC Educational Resources Information Center

    Swanson, H. Lee; Orosco, Michael J.; Lussier, Cathy M.; Gerber, Michael M.; Guzman-Orth, Danielle A.

    2011-01-01

    In this study, we explored whether the contribution of working memory (WM) to children's (N = 471) 2nd language (L2) reading and language acquisition was best accounted for by processing efficiency at a phonological level and/or by executive processes independent of phonological processing. Elementary school children (Grades 1, 2, & 3) whose 1st…

  16. Cognitive processes during fear acquisition and extinction in animals and humans: implications for exposure therapy of anxiety disorders.

    PubMed

    Hofmann, Stefan G

    2008-02-01

    Anxiety disorders are highly prevalent. Fear conditioning and extinction learning in animals often serve as simple models of fear acquisition and exposure therapy of anxiety disorders in humans. This article reviews the empirical and theoretical literature on cognitive processes in fear acquisition, extinction, and exposure therapy. It is concluded that exposure therapy is a form of cognitive intervention that specifically changes the expectancy of harm. Implications for therapy research are discussed.

  17. A Rational Analysis of the Acquisition of Multisensory Representations

    ERIC Educational Resources Information Center

    Yildirim, Ilker; Jacobs, Robert A.

    2012-01-01

    How do people learn multisensory, or amodal, representations, and what consequences do these representations have for perceptual performance? We address this question by performing a rational analysis of the problem of learning multisensory representations. This analysis makes use of a Bayesian nonparametric model that acquires latent multisensory…

  18. Accelerating Data Acquisition, Reduction, and Analysis at the Spallation Neutron Source

    SciTech Connect

    Campbell, Stuart I; Kohl, James Arthur; Granroth, Garrett E; Miller, Ross G; Doucet, Mathieu; Stansberry, Dale V; Proffen, Thomas E; Taylor, Russell J; Dillow, David

    2014-01-01

    ORNL operates the world's brightest neutron source, the Spallation Neutron Source (SNS). Funded by the US DOE Office of Basic Energy Science, this national user facility hosts hundreds of scientists from around the world, providing a platform to enable breakthrough research in materials science, sustainable energy, and basic science. While the SNS provides scientists with advanced experimental instruments, the deluge of data generated from these instruments represents both a big data challenge and a big data opportunity. For example, instruments at the SNS can now generate multiple millions of neutron events per second, providing unprecedented experiment fidelity but leaving the user with a dataset that cannot be processed and analyzed in a timely fashion using legacy techniques. To address this big data challenge, ORNL has developed a near real-time streaming data reduction and analysis infrastructure. The Accelerating Data Acquisition, Reduction, and Analysis (ADARA) system provides a live streaming data infrastructure based on a high-performance publish-subscribe system, in situ data reduction, visualization, and analysis tools, and integration with a high-performance computing and data storage infrastructure. ADARA allows users of the SNS instruments to analyze their experiment as it is run, make changes to the experiment in real-time, and visualize the results of these changes. In this paper we describe ADARA, provide a high-level architectural overview of the system, and present a set of use cases and real-world demonstrations of the technology.
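
    The publish-subscribe pattern at the heart of such a streaming pipeline can be sketched in a few lines: a reducer subscribes to the event stream and updates its histogram as each neutron event arrives, so results are available while the run is still in progress. This is an illustrative toy, not ADARA's actual protocol or API:

```python
from collections import defaultdict

class EventBus:
    """Minimal publish-subscribe bus standing in for the streaming layer."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, event):
        for cb in self.subscribers:
            cb(event)

class LiveReduction:
    """In-situ reduction: histogram neutron time-of-flight as events stream in."""
    def __init__(self, bin_width_us=100):
        self.bin_width = bin_width_us
        self.histogram = defaultdict(int)

    def on_event(self, event):
        self.histogram[event["tof_us"] // self.bin_width] += 1

bus = EventBus()
reducer = LiveReduction()
bus.subscribe(reducer.on_event)
for tof in (120, 150, 480, 510):          # simulated neutron events
    bus.publish({"tof_us": tof})
```

    Decoupling producers from consumers this way lets visualization, reduction, and archival subscribers be attached or detached without touching the acquisition side.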

  19. Motofit - integrating neutron reflectometry acquisition, reduction and analysis into one, easy to use, package

    NASA Astrophysics Data System (ADS)

    Nelson, Andrew

    2010-11-01

    The efficient use of complex neutron scattering instruments is often hindered by the complex nature of their operating software. This complexity exists at each experimental step: data acquisition, reduction and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful for automated acquisition, they often reduce accessibility for novice users and sometimes reduce efficiency for advanced users. One solution to this is the development of a graphical user interface which allows the user to operate the instrument by a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for analysis of multiple contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of an instrument is integrated into a single, easy to use, program, leading to efficient instrument usage.

  20. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from microchannel plate (MCP) based time-of-flight detectors and ionization based energy detectors such as silicon diodes and gas ionization chambers (GIC) is introduced. The system is based on commercial digitizers and custom software. It forms a part of a ToF-ERDA spectrometer, which has to be able to detect recoil atoms of many different species and energies. Compared to the currently used analogue electronics, the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software and digital pulse processing algorithms to suit this application are described in detail.
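
    Once pulses are digitized, timing is extracted in software. A common building block (sketched here with invented sample data; the paper's actual algorithms are not given in the abstract) is a linearly interpolated threshold crossing on each detector's waveform, with the time-of-flight taken as the difference between stop and start crossings:

```python
def crossing_time(samples, threshold, dt_ns):
    """Linearly interpolated threshold-crossing time of a digitized pulse."""
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
            return (i - 1 + frac) * dt_ns
    return None                        # no crossing found in this trace

# two digitized timing pulses sampled at 1 ns; ToF = stop - start crossing
start = [0, 0, 2, 8, 10, 6, 2, 0]
stop  = [0, 0, 0, 0, 1, 5, 9, 10]
t0 = crossing_time(start, 5, dt_ns=1.0)
t1 = crossing_time(stop, 5, dt_ns=1.0)
tof = t1 - t0
```

    Interpolating between samples is what recovers sub-sample timing resolution from a modest digitizer sampling rate, and keeping the raw traces is what allows the thresholds to be re-tuned after the measurement.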

  1. MSL's Widgets: Adding Robustness to Martian Sample Acquisition, Handling, and Processing

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Kennedy, Brett; Lin, Justin; DeGrosse, Patrick; Cady, Ian; Onufer, Nicholas; Sigel, Deborah; Jandura, Louise; Anderson, Robert; Katz, Ira; Slimko, Eric; Limonadi, Daniel

    2013-01-01

    Mars Science Laboratory's (MSL) Sample Acquisition Sample Processing and Handling (SA-SPaH) system is one of the most ambitious terrain interaction and manipulation systems ever built and successfully used beyond Earth. Mars has a ruthless environment that has surprised many who have tried to explore there. The robustness widget program was implemented by the MSL project to help ensure the SA-SPaH system would be robust to the surprises of this environment. The program was carried out under extreme schedule pressure, but was accomplished with resounding success. This paper gives a behind-the-scenes look at MSL's robustness widgets: the particle fun zone, the wind guards, and the portioner pokers.

  2. Horizon Acquisition for Attitude Determination Using Image Processing Algorithms- Results of HORACE on REXUS 16

    NASA Astrophysics Data System (ADS)

    Barf, J.; Rapp, T.; Bergmann, M.; Geiger, S.; Scharf, A.; Wolz, F.

    2015-09-01

    The aim of the Horizon Acquisition Experiment (HORACE) was to prove a new concept for a two-axis horizon sensor using algorithms processing ordinary images, which is also operable at the high spin rates occurring during emergencies. The difficulty of coping with image distortions, which is avoided by conventional horizon sensors, was introduced on purpose, as we envision a system capable of using any optical data. During the flight on REXUS 16, which provided a suitable platform similar to the future application scenario, a malfunction of the payload cameras caused severe degradation of the collected scientific data. Nevertheless, with the aid of simulations we could show that the concept is accurate (±0.6°), fast (~100 ms/frame) and robust enough for coarse attitude determination during emergencies, and also applicable to small satellites. Besides, technical knowledge regarding the design of REXUS experiments, including the detection of interference between SATA and GPS, was gained.
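
    The basic idea of a two-axis image-based horizon sensor can be sketched simply: threshold the frame to find the horizon edge in each column, fit a line through those edge points, and read roll off the line's slope and pitch off its height in the frame. This is an illustrative toy under the assumption of a dark sky above a bright Earth, not the HORACE algorithm itself:

```python
import math

def horizon_attitude(image, threshold=128):
    """Estimate roll (degrees) and a pitch proxy (mean horizon row) from a
    grayscale image given as a list of pixel rows."""
    xs, ys = [], []
    for x in range(len(image[0])):
        for y in range(len(image)):
            if image[y][x] >= threshold:       # first bright row = horizon edge
                xs.append(x)
                ys.append(y)
                break
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    # least-squares slope of the horizon line; roll is its angle
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.degrees(math.atan(slope)), my

# synthetic 8x8 frame whose horizon runs along the diagonal y = x
img = [[255 if y >= x else 0 for x in range(8)] for y in range(8)]
roll_deg, pitch_row = horizon_attitude(img)
```

    A least-squares fit over many edge points is what makes such a scheme tolerant of the pixel-level noise and distortion that a spinning, uncalibrated camera introduces.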

  3. Collecting Samples in Gale Crater, Mars; an Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling System

    NASA Astrophysics Data System (ADS)

    Anderson, R. C.; Jandura, L.; Okon, A. B.; Sunshine, D.; Roumeliotis, C.; Beegle, L. W.; Hurowitz, J.; Kennedy, B.; Limonadi, D.; McCloskey, S.; Robinson, M.; Seybold, C.; Brown, K.

    2012-09-01

    The Mars Science Laboratory Mission (MSL), scheduled to land on Mars in the summer of 2012, consists of a rover and a scientific payload designed to identify and assess the habitability, geological, and environmental histories of Gale crater. Unraveling the geologic history of the region and providing an assessment of present and past habitability requires an evaluation of the physical and chemical characteristics of the landing site; this includes providing an in-depth examination of the chemical and physical properties of Martian regolith and rocks. The MSL Sample Acquisition, Processing, and Handling (SA/SPaH) subsystem will be the first in-situ system designed to acquire interior rock and soil samples from Martian surface materials. These samples are processed and separated into fine particles and distributed to two onboard analytical science instruments, SAM (Sample Analysis at Mars Instrument Suite) and CheMin (Chemistry and Mineralogy), or to a sample analysis tray for visual inspection. The SA/SPaH subsystem is also responsible for the placement of the two contact instruments, the Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI), on rock and soil targets. Finally, there is a Dust Removal Tool (DRT) to remove dust particles from rock surfaces for subsequent analysis by the contact and/or mast-mounted instruments (e.g. the Mast Cameras (MastCam) and the Chemistry and Camera instrument (ChemCam)).

  4. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    SciTech Connect

    Castillo, S; Castillo, R; Castillo, E; Pan, T; Ibbott, G; Balter, P; Hobbs, B; Dai, J; Guerrero, T

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly by 27% and 28%, for the first cohort and second cohort respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross scan average at the 0.05 significance level. Artifact presence for both clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p-value<0.032 clinical, ρ=0.296, p-value<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with amount of data acquired, (ρ=-0.335, p-value<0.02) indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase
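
    Optimized sorting of oversampled images can be framed as a shortest-path problem: one candidate image is chosen per scan position, and adjacent choices are penalized by a mismatch (artifact) metric, so Dijkstra's algorithm finds the globally least-artifact sequence. The sketch below is a generic illustration of that framing with a toy mismatch function, not the authors' implementation or metric:

```python
import heapq

def best_image_sequence(candidates, mismatch):
    """Pick one candidate image per scan position minimizing the summed
    inter-position mismatch (e.g. 1 - correlation) via Dijkstra.

    candidates:     list of lists; candidates[p] are image IDs at position p
    mismatch(a, b): artifact metric between images at adjacent positions
    """
    # state: (cost so far, position, chosen index, path of chosen indices)
    heap = [(0.0, 0, i, (i,)) for i in range(len(candidates[0]))]
    settled = {}
    while heap:
        cost, pos, idx, path = heapq.heappop(heap)
        if (pos, idx) in settled:
            continue                       # already reached more cheaply
        settled[(pos, idx)] = cost
        if pos == len(candidates) - 1:
            return path, cost              # first goal pop is optimal
        for j in range(len(candidates[pos + 1])):
            step = mismatch(candidates[pos][idx], candidates[pos + 1][j])
            heapq.heappush(heap, (cost + step, pos + 1, j, path + (j,)))

# toy data: image "content" per position; mismatch = absolute difference
candidates = [[0, 10], [1, 12], [2, 11]]
path, cost = best_image_sequence(candidates, lambda a, b: abs(a - b))
```

    Choosing the whole sequence jointly, rather than the best image at each position independently, is what avoids the discontinuities between adjacent couch positions that appear as sorting artifacts.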

  5. Towards an Operant Analysis of the Acquisition of Conceptual Behavior.

    ERIC Educational Resources Information Center

    Brigham, Thomas A.

    A model for the analysis of simple human conceptual behavior, based on the apparent similarities of human conceptual behavior and that of infrahuman subjects, is developed. A minimum definition of conceptual behavior is given: A single response, verbal or nonverbal, under the discriminative control of a group of stimuli whose parameters are…

  6. A Stylistic Approach to Foreign Language Acquisition and Literary Analysis.

    ERIC Educational Resources Information Center

    Berg, William J.; Martin-Berg, Laurey K.

    This paper discusses an approach to teaching third college year "bridge" courses, showing that students in a course that focuses on language and culture as well as students in an introductory course on literary analysis can benefit from using a stylistic approach to literacy texts to understand both form and content. The paper suggests that a…

  7. How does the interaction between spelling and motor processes build up during writing acquisition?

    PubMed

    Kandel, Sonia; Perret, Cyril

    2015-03-01

    How do we recall a word's spelling? How do we produce the movements to form the letters of a word? Writing involves several processing levels. Surprisingly, researchers have focused either on spelling or motor production. However, these processes interact and cannot be studied separately. Spelling processes cascade into movement production. For example, in French, producing letters PAR in the orthographically irregular word PARFUM (perfume) delays motor production with respect to the same letters in the regular word PARDON (pardon). Orthographic regularity refers to the possibility of spelling a word correctly by applying the most frequent sound-letter conversion rules. The present study examined how the interaction between spelling and motor processing builds up during writing acquisition. French 8-10 year old children participated in the experiment. This is the age at which handwriting skills start to become automatic. The children wrote regular and irregular words that could be frequent or infrequent. They wrote on a digitizer so we could collect data on latency, movement duration and fluency. The results revealed that the interaction between spelling and motor processing was present already at age 8. It became more adult-like at ages 9 and 10. Before starting to write, processing irregular words took longer than regular words. This processing load spread into movement production. It increased writing duration and rendered the movements more dysfluent. Word frequency affected latencies and cascaded into production. It modulated writing duration but not movement fluency. Writing infrequent words took longer than frequent words. The data suggest that orthographic regularity has a stronger impact on writing than word frequency. The two factors do not cascade to the same extent.

  8. Time series analysis of knowledge of results effects during motor skill acquisition.

    PubMed

    Blackwell, J R; Simmons, R W; Spray, J A

    1991-03-01

    Time series analysis was used to investigate the hypothesis that during acquisition of a motor skill, knowledge of results (KR) information is used to generate a stable internal referent about which response errors are randomly distributed. Sixteen subjects completed 50 acquisition trials of each of three movements whose spatial-temporal characteristics differed. Acquisition trials were either blocked, with each movement being presented in series, or randomized, with the presentation of movements occurring in random order. Analysis of movement time data indicated that the contextual interference effect reported in previous studies was replicated in the present experiment. Time series analysis of the acquisition trial data revealed that the majority of individual subject response patterns during blocked trials were best described by a model with a temporarily stationary, internal reference of the criterion and systematic, trial-to-trial variation of response errors. During random trial conditions, response patterns were usually best described by a "white-noise" model. This model predicts a permanently stationary, internal reference associated with randomly distributed response errors that are unaffected by KR information. These results are not consistent with previous work using time series analysis to describe motor behavior (Spray & Newell, 1986). PMID:2028084
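
    The two competing models above differ in whether successive response errors are correlated from trial to trial. A minimal sketch of that distinction (not the authors' method: the data are synthetic, and the model fitting is reduced to a single lag-1 autocorrelation check):

```python
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation coefficient of a trial series."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((xi - mean) ** 2 for xi in x)
    return num / den

random.seed(42)
# White-noise model: errors scattered independently about a stationary referent.
white = [random.gauss(0.0, 1.0) for _ in range(2000)]
# Systematic trial-to-trial variation: each error carries over part of the last.
ar1 = [0.0]
for _ in range(1999):
    ar1.append(0.9 * ar1[-1] + random.gauss(0.0, 1.0))

print(lag1_autocorr(white))  # near zero
print(lag1_autocorr(ar1))    # strongly positive
```

    A near-zero coefficient is consistent with randomly distributed errors; a large positive one indicates the systematic variation the blocked-trial model describes.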

  9. Automated acquisition and analysis of small angle X-ray scattering data

    NASA Astrophysics Data System (ADS)

    Franke, Daniel; Kikhney, Alexey G.; Svergun, Dmitri I.

    2012-10-01

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  10. Troubleshooting digital macro photography for image acquisition and the analysis of biological samples.

    PubMed

    Liepinsh, Edgars; Kuka, Janis; Dambrova, Maija

    2013-01-01

    For years, image acquisition and analysis have been an important part of life science experiments to ensure the adequate and reliable presentation of research results. Since the development of digital photography and digital planimetric methods for image analysis approximately 20 years ago, new equipment and technologies have emerged, which have increased the quality of image acquisition and analysis. Different techniques are available to measure the size of stained tissue samples in experimental animal models of disease; however, the most accurate method is digital macro photography with software that is based on planimetric analysis. In this study, we described the methodology for the preparation of infarcted rat heart and brain tissue samples before image acquisition, digital macro photography techniques and planimetric image analysis. These methods are useful in the macro photography of biological samples and subsequent image analysis. In addition, the techniques that are described in this study include the automated analysis of digital photographs to minimize user input and exclude the risk of researcher-generated errors or bias during image analysis.
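
    At its core, the planimetric analysis described above reduces to counting pixels above a staining threshold. A toy sketch under assumed values (the 4x4 "image" and the 8-bit threshold of 128 are illustrative, not from the study):

```python
def area_fraction(image, threshold):
    """Fraction of pixels at or above a brightness threshold.

    In TTC-stained heart sections, for example, infarcted tissue
    photographs pale (bright) while viable tissue stays dark, so the
    bright-pixel fraction approximates the infarct area fraction.
    """
    flat = [px for row in image for px in row]
    return sum(1 for px in flat if px >= threshold) / len(flat)

# Toy 4x4 grayscale "photograph" (0-255): the top-left corner is pale.
section = [
    [200, 210, 40, 35],
    [205, 198, 38, 42],
    [50, 45, 30, 33],
    [48, 52, 36, 41],
]
print(area_fraction(section, 128))  # -> 0.25
```

    Automating exactly this kind of counting over calibrated photographs is what removes user input and the associated researcher bias.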

  11. The collection and analysis of transient test data using the mobile instrumentation data acquisition system (MIDAS)

    SciTech Connect

    Uncapher, W.L.; Arviso, M.

    1995-12-31

    Packages designed to transport radioactive materials are required to survive exposure to environments defined in the Code of Federal Regulations. Cask designers can investigate package designs through structural and thermal testing of full-scale packages, components, or representative models. The acquisition of reliable response data from instrumentation measurement devices is an essential part of this testing activity. Sandia National Laboratories, under the sponsorship of the US Department of Energy (DOE), has developed the Mobile Instrumentation Data Acquisition System (MIDAS) dedicated to the collection and processing of structural and thermal data from regulatory tests.

  12. Web-based data acquisition and management system for GOSAT validation Lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra N.; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2012-11-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis is developed. The system consists of a data acquisition subsystem (DAS) and a data management subsystem (DMS). DAS, written in Perl, acquires AMeDAS ground-level meteorological data, rawinsonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data.

  13. Acquisition and analysis of primate physiologic data for the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Eberhart, Russell C.; Hogrefe, Arthur F.; Radford, Wade E.; Sanders, Kermit H.; Dobbins, Roy W.

    1988-01-01

    This paper describes the design and prototypes of the Physiologic Acquisition and Telemetry System (PATS), a multichannel system designed for large primates that provides acquisition, telemetry, storage, and analysis of physiological data. PATS is expected to acquire data from units implanted in the abdominal cavities of rhesus monkeys that will be flown aboard the Spacelab. The system will telemeter both stored and real-time internal physiologic measurements to an external Flight Support Station (FSS) computer system. The implanted Data Acquisition and Telemetry Subsystem subunit will be externally activated, controlled and reprogrammed from the FSS.

  14. Acquisition and analysis of primate physiologic data for the Space Shuttle

    NASA Astrophysics Data System (ADS)

    Eberhart, Russell C.; Hogrefe, Arthur F.; Radford, Wade E.; Sanders, Kermit H.; Dobbins, Roy W.

    1988-03-01

    This paper describes the design and prototypes of the Physiologic Acquisition and Telemetry System (PATS), a multichannel system designed for large primates that provides acquisition, telemetry, storage, and analysis of physiological data. PATS is expected to acquire data from units implanted in the abdominal cavities of rhesus monkeys that will be flown aboard the Spacelab. The system will telemeter both stored and real-time internal physiologic measurements to an external Flight Support Station (FSS) computer system. The implanted Data Acquisition and Telemetry Subsystem subunit will be externally activated, controlled and reprogrammed from the FSS.

  15. Automated acquisition and analysis of airway surface liquid height by confocal microscopy

    PubMed Central

    Choi, Hyun-Chul; Kim, Christine Seul Ki

    2015-01-01

    The airway surface liquid (ASL) is a thin-liquid layer that lines the luminal side of airway epithelia. ASL contains many molecules that are involved in primary innate defense in the lung. Measurement of ASL height on primary airway cultures by confocal microscopy is a powerful tool that has enabled researchers to study ASL physiology and pharmacology. Previously, ASL image acquisition and analysis were performed manually. However, this process is time- and labor-intensive. To increase the throughput, we have developed an automatic ASL measurement technique that combines a fully automated confocal microscope with novel automatic image analysis software that was written with image processing techniques derived from the computer science field. We were able to acquire XZ ASL images at the rate of ∼1 image/s in a reproducible fashion. Our automatic analysis software was able to analyze images at the rate of ∼32 ms/image. As proofs of concept, we generated a time course for ASL absorption and a dose response in the presence of SPLUNC1, a known epithelial sodium channel inhibitor, on human bronchial epithelial cultures. Using this approach, we determined the IC50 for SPLUNC1 to be 6.53 μM. Furthermore, our technique successfully detected a difference in ASL height between normal and cystic fibrosis (CF) human bronchial epithelial cultures and detected changes in ATP-stimulated Cl−/ASL secretion. We conclude that our automatic ASL measurement technique can be applied for repeated ASL height measurements with high accuracy and consistency and increased throughput. PMID:26001773
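
    An IC50 like the 6.53 μM quoted above comes from fitting a dose-response curve to the measured heights. A hedged sketch of the idea only: the unit-slope inhibition model, the grid search, and the synthetic data (generated at an assumed IC50 of 6.5 μM) are all illustrative, not the paper's procedure:

```python
def ic50_fit(concs_uM, responses):
    """Least-squares grid search for IC50 under a unit-slope model.

    Model: response = 1 / (1 + c / IC50), i.e. full inhibition with a
    Hill coefficient of 1 (an assumption for this sketch).
    """
    grid = [10 ** (e / 50) for e in range(-100, 151)]  # 0.01 to 1000 uM
    best_ic50, best_sse = None, float("inf")
    for ic50 in grid:
        sse = sum((r - 1 / (1 + c / ic50)) ** 2
                  for c, r in zip(concs_uM, responses))
        if sse < best_sse:
            best_ic50, best_sse = ic50, sse
    return best_ic50

# Synthetic dose-response points generated at an assumed IC50 of 6.5 uM;
# the 6.53 uM in the abstract comes from real ASL height measurements.
concs = [0.1, 1.0, 3.0, 10.0, 30.0, 100.0]
obs = [1 / (1 + c / 6.5) for c in concs]
print(ic50_fit(concs, obs))
```

    A log-spaced grid is used because dose-response data span decades of concentration; a production fit would use nonlinear least squares with a free Hill slope instead.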

  16. Differentiating semantic categories during the acquisition of novel words: correspondence analysis applied to event-related potentials.

    PubMed

    Fargier, Raphaël; Ploux, Sabine; Cheylus, Anne; Reboul, Anne; Paulignan, Yves; Nazir, Tatjana A

    2014-11-01

    Growing evidence suggests that semantic knowledge is represented in distributed neural networks that include modality-specific structures. Here, we examined the processes underlying the acquisition of words from different semantic categories to determine whether the emergence of visual- and action-based categories could be tracked back to their acquisition. For this, we applied correspondence analysis (CA) to ERPs recorded at various moments during acquisition. CA is a multivariate statistical technique typically used to reveal distance relationships between words of a corpus. Applied to ERPs, it allows isolating factors that best explain variations in the data across time and electrodes. Participants were asked to learn new action and visual words by associating novel pseudowords with the execution of hand movements or the observation of visual images. Words were probed before and after training on two consecutive days. To capture processes that unfold during lexical access, CA was applied to the 100-400 msec post-word onset interval. CA isolated two factors that organized the data as a function of test sessions and word categories. Conventional ERP analyses further revealed a category-specific increase in the negativity of the ERPs to action and visual words at the frontal and occipital electrodes, respectively. The distinct neural processes underlying action and visual words can thus be tracked back to the acquisition of word-referent relationships and may have their origin in association learning. Given current evidence for the flexibility of language-induced sensory-motor activity, we argue that these associative links may serve functions beyond word understanding, that is, the elaboration of situation models.

  17. Touch sensing analysis using multi-modal acquisition system

    NASA Astrophysics Data System (ADS)

    King, Jeffrey S.; Pikula, Dragan; Baharav, Zachi

    2013-03-01

    Touch sensing is ubiquitous in many consumer electronic products. Users expect to be able to touch the surface of a display with their finger and interact with it. Yet the actual mechanics and physics of the touch process are little known, because they depend on many independent variables, ranging from the physics of the fingertip structure (composed of ridges, valleys, and pores) through a few layers of skin and flesh to the bone itself. Moreover, sweat glands and wetting are critical, as we will see. As for the mechanics, the pressure with which one touches the screen, and the manner in which the surface responds to this pressure, have a major impact on touch sensing. In addition, different touch sensing methods, such as capacitive or optical, have different dependencies. For example, the color of the finger might affect the latter, whereas the former is insensitive to it. In this paper we describe a system that captures multiple modalities of the touch event and synchronizes them in post-processing. This enables us to look for correlations between various effects and uncover their influence on the performance of touch sensing algorithms. Moreover, investigating these relations allows us to improve various sensing algorithms, as well as find areas where they complement each other. We conclude by pointing to possible future extensions and applications of this system.

  18. Spectral analysis for automated exploration and sample acquisition

    NASA Technical Reports Server (NTRS)

    Eberlein, Susan; Yates, Gigi

    1992-01-01

    Future space exploration missions will rely heavily on the use of complex instrument data for determining the geologic, chemical, and elemental character of planetary surfaces. One important instrument is the imaging spectrometer, which collects complete images in multiple discrete wavelengths in the visible and infrared regions of the spectrum. Extensive computational effort is required to extract information from such high-dimensional data. A hierarchical classification scheme allows multispectral data to be analyzed for purposes of mineral classification while limiting the overall computational requirements. The hierarchical classifier exploits the tunability of a new type of imaging spectrometer which is based on an acousto-optic tunable filter. This spectrometer collects a complete image in each wavelength passband without spatial scanning. It may be programmed to scan through a range of wavelengths or to collect only specific bands for data analysis. Spectral classification activities employ artificial neural networks, trained to recognize a number of mineral classes. Analysis of the trained networks has proven useful in determining which subsets of spectral bands should be employed at each step of the hierarchical classifier. The network classifiers are capable of recognizing all mineral types which were included in the training set. In addition, the major components of many mineral mixtures can also be recognized. This capability may prove useful for a system designed to evaluate data in a strange environment where details of the mineral composition are not known in advance.

  19. Spectral analysis for automated exploration and sample acquisition

    NASA Astrophysics Data System (ADS)

    Eberlein, Susan; Yates, Gigi

    1992-05-01

    Future space exploration missions will rely heavily on the use of complex instrument data for determining the geologic, chemical, and elemental character of planetary surfaces. One important instrument is the imaging spectrometer, which collects complete images in multiple discrete wavelengths in the visible and infrared regions of the spectrum. Extensive computational effort is required to extract information from such high-dimensional data. A hierarchical classification scheme allows multispectral data to be analyzed for purposes of mineral classification while limiting the overall computational requirements. The hierarchical classifier exploits the tunability of a new type of imaging spectrometer which is based on an acousto-optic tunable filter. This spectrometer collects a complete image in each wavelength passband without spatial scanning. It may be programmed to scan through a range of wavelengths or to collect only specific bands for data analysis. Spectral classification activities employ artificial neural networks, trained to recognize a number of mineral classes. Analysis of the trained networks has proven useful in determining which subsets of spectral bands should be employed at each step of the hierarchical classifier. The network classifiers are capable of recognizing all mineral types which were included in the training set. In addition, the major components of many mineral mixtures can also be recognized. This capability may prove useful for a system designed to evaluate data in a strange environment where details of the mineral composition are not known in advance.

  20. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  1. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    DOE PAGES

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but not repairable in case of malfunction (its modules are not manufactured anymore). The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  2. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    SciTech Connect

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but not repairable in case of malfunction (its modules are not manufactured anymore). The new system (since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  3. Dynamic analysis of process reactors

    SciTech Connect

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.
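
    In the simplest case, the gain parameters mentioned above are derivatives of a steady-state model output with respect to an input, evaluated at the operating point. A minimal sketch using a hypothetical nonlinear steady-state curve (not the PyGas™ models themselves):

```python
def steady_state_gain(model, u, du=1e-6):
    """Finite-difference process gain dY/dU about the operating point u."""
    return (model(u + du) - model(u - du)) / (2.0 * du)

# Hypothetical steady-state curve: reactor output vs. an input such as
# feed rate, with diminishing returns at high input.
model = lambda u: 4.0 * u - 0.5 * u ** 2

# The gain near u = 1.0 would feed the dynamic model as a linearized
# parameter (or, with dynamics attached, as a transfer-function gain).
print(steady_state_gain(model, 1.0))
```

    Re-evaluating the gain at several operating points shows how strongly the linearization, and hence the control behavior, depends on where the reactor is run.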

  4. How human resource organization can enhance space information acquisition and processing: the experience of the VENESAT-1 ground segment

    NASA Astrophysics Data System (ADS)

    Acevedo, Romina; Orihuela, Nuris; Blanco, Rafael; Varela, Francisco; Camacho, Enrique; Urbina, Marianela; Aponte, Luis Gabriel; Vallenilla, Leopoldo; Acuña, Liana; Becerra, Roberto; Tabare, Terepaima; Recaredo, Erica

    2009-12-01

    Built in cooperation with the P.R. of China, the Bolivarian Republic of Venezuela launched its first telecommunications satellite, the so-called VENESAT-1 (Simón Bolívar Satellite), on October 29, 2008. It operates in the C band (covering Central America, the Caribbean region and most of South America), the Ku band (Bolivia, Cuba, Dominican Republic, Haiti, Paraguay, Uruguay, Venezuela) and the Ka band (Venezuela). The launch of VENESAT-1 represents the starting point for Venezuela as an active player in the field of space science and technology. In order to fulfill mission requirements and to guarantee the satellite's health, local professionals must provide continuous monitoring, orbit calculation, maneuver preparation and execution, data preparation and processing, as well as database management at the VENESAT-1 Ground Segment, which includes both a primary and a backup site. In summary, data processing and real-time data management are part of the daily activities performed by the personnel at the ground segment. Using published and unpublished information, this paper presents how human resource organization can enhance space information acquisition and processing, by analyzing the proposed organizational structure for the VENESAT-1 Ground Segment. We have found that the proposed units within the organizational structure reflect 3 key issues for mission management: satellite operations, ground operations, and site maintenance. The proposed organization is simple (3 hierarchical levels and 7 units), and communication channels seem efficient in terms of facilitating information acquisition, processing, storage, flow and exchange. Furthermore, the proposal includes a manual containing the full description of personnel responsibilities and profiles, which efficiently allocates the management and operation of key software for satellite operation such as the Real-time Data Transaction Software (RDTS), the Data Management Software (DMS), and the Carrier Spectrum Monitoring Software (CSM).

  5. On the resolution of ECG acquisition systems for the reliable analysis of the P-wave.

    PubMed

    Censi, Federica; Calcagnini, Giovanni; Corazza, Ivan; Mattei, Eugenio; Triventi, Michele; Bartolini, Pietro; Boriani, Giuseppe

    2012-02-01

    The analysis of the P-wave on the surface ECG is widely used to assess the risk of atrial arrhythmias. In order to provide reliable results, the automatic analysis of the P-wave must be precise and reliable and must take technical aspects into account, one of which is the resolution of the acquisition system. The aim of this note is to investigate the effects of the amplitude resolution of ECG acquisition systems on P-wave analysis. Starting from ECGs recorded by an acquisition system with a least significant bit (LSB) of 31 nV (24 bit on an input range of 524 mVpp), we reproduced the ECG signal as acquired by systems with lower resolution (16, 15, 14, 13 and 12 bit). We found that, when the LSB is of the order of 128 µV (12 bit), a single P-wave is not recognizable on the ECG. However, when averaging is applied, a P-wave template can be extracted that is apparently suitable for P-wave analysis. Results obtained in terms of P-wave duration and morphology revealed that the analysis of ECG at the lowest resolutions (from 12 to 14 bit, LSB higher than 30 µV) could lead to misleading results. However, the resolution used nowadays in modern electrocardiographs (15 and 16 bit, LSB <10 µV) is sufficient for the reliable analysis of the P-wave.
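
    The resolution figures in this note follow directly from the ideal-ADC relation LSB = input range / 2^bits. A quick check that reproduces the values quoted above from the stated 524 mVpp input range:

```python
def lsb_voltage(input_range_v, bits):
    """Amplitude of one least significant bit for an ideal ADC."""
    return input_range_v / 2 ** bits

RANGE_V = 0.524  # 524 mVpp input range, as in the note
for bits in (24, 16, 12):
    print(bits, lsb_voltage(RANGE_V, bits))
# 24 bit -> ~31 nV, 16 bit -> ~8 uV, 12 bit -> ~128 uV
```

    This is why the 12-bit case lands at the ~128 µV LSB where a single P-wave (itself only tens to hundreds of µV) disappears into quantization.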

  6. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion-barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards of overthrust exploration. However, the seismic data recorded in many overthrust areas are disappointingly poor. Challenges such as rough topography, complex subsurface structure, the presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results showing various swath geometries will be presented. We will also show recent examples of processing methods which have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  7. Acquisition and analysis of angle-beam wavefield data

    SciTech Connect

    Dawson, Alexander J.; Michaels, Jennifer E.; Levine, Ross M.; Chen, Xin; Michaels, Thomas E.

    2014-02-18

    Angle-beam ultrasonic testing is a common practical technique used for nondestructive evaluation to detect, locate, and characterize a variety of material defects and damage. Greater understanding of both the incident wavefield produced by an angle-beam transducer and the subsequent scattering from a variety of defects and geometrical features is anticipated to increase the reliability of data interpretation. The focus of this paper is on acquiring and analyzing propagating waves from angle-beam transducers in simple, defect-free plates as a first step in the development of methods for flaw characterization. Unlike guided waves, which excite the plate throughout its thickness, angle-beam bulk waves bounce back and forth between the plate surfaces, resulting in the well-known multiple “skips” or “V-paths.” The experimental setup consists of a laser vibrometer mounted on an XYZ scanning stage, which is programmed to move point-to-point on a rectilinear grid to acquire waveform data. Although laser vibrometry is now routinely used to record guided waves for which the frequency content is below 1 MHz, it is more challenging to acquire higher frequency bulk waves in the 1–10 MHz range. Signals are recorded on the surface of an aluminum plate that were generated from a 5 MHz, 65° refracted angle, shear wave transducer-wedge combination. Data are analyzed directly in the x-t domain, via a slant stack Radon transform in the τ-p (offset time-slowness) domain, and via a 2-D Fourier transform in the ω-k domain, thereby enabling identification of specific arrivals and modes. Results compare well to those expected from a simple ray tracing analysis except for the unexpected presence of a strong Rayleigh wave.
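
    The "simple ray tracing analysis" referenced at the end is pure geometry: each leg of a V-path advances the ray by h·tan θ along the surface and lengthens the travel path by h/cos θ. A sketch under assumed values (the 6.35 mm plate thickness and ~3.1 mm/µs shear speed in aluminum are illustrative; the abstract specifies only the 5 MHz, 65° transducer):

```python
import math

def v_path_skips(thickness_mm, angle_deg, speed_mm_per_us, n_hits):
    """Surface-hit positions (mm) and arrival times (us) for an angle beam.

    The ray alternately strikes the bottom and top plate surfaces; the
    n-th hit lies n*h*tan(theta) from the entry point, after a total
    path length of n*h/cos(theta).
    """
    theta = math.radians(angle_deg)
    hits = []
    for n in range(1, n_hits + 1):
        x = n * thickness_mm * math.tan(theta)
        t = n * thickness_mm / math.cos(theta) / speed_mm_per_us
        hits.append((round(x, 2), round(t, 3)))
    return hits

for x_mm, t_us in v_path_skips(6.35, 65.0, 3.1, 4):
    print(x_mm, t_us)
```

    Comparing these predicted (position, time) pairs against arrivals picked from the x-t wavefield data is how specific skips are identified.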

  8. Processing of syllable stress is functionally different from phoneme processing and does not profit from literacy acquisition.

    PubMed

    Schild, Ulrike; Becker, Angelika B C; Friedrich, Claudia K

    2014-01-01

    Speech is characterized by phonemes and prosody. Neurocognitive evidence supports the separate processing of each type of information. Therefore, one might suggest individual development of both pathways. In this study, we examine literacy acquisition in middle childhood. Children become aware of the phonemes in speech at that time and refine phoneme processing when they acquire an alphabetic writing system. We test whether an enhanced sensitivity to phonemes in middle childhood extends to other aspects of the speech signal, such as prosody. To investigate prosodic processing, we used stress priming. Spoken stressed and unstressed syllables (primes) preceded spoken German words with stress on the first syllable (targets). We orthogonally varied stress overlap and phoneme overlap between the primes and onsets of the targets. Lexical decisions and Event-Related Potentials (ERPs) for the targets were obtained for pre-reading preschoolers, reading pupils and adults. The behavioral and ERP results were largely comparable across all groups. The fastest responses were observed when the first syllable of the target word shared stress and phonemes with the preceding prime. ERP stress priming and ERP phoneme priming started 200 ms after the target word onset. Bilateral ERP stress priming was characterized by enhanced ERP amplitudes for stress overlap. Left-lateralized ERP phoneme priming replicates previously observed reduced ERP amplitudes for phoneme overlap. Groups differed in the strength of the behavioral phoneme priming and in the late ERP phoneme priming effect. The present results show that enhanced phonological processing in middle childhood is restricted to phonemes and does not extend to prosody. These results are indicative of two parallel processing systems for phonemes and prosody that might follow different developmental trajectories in middle childhood as a function of alphabetic literacy. PMID:24917838

  9. A generic model for data acquisition: Connectionist methods of information processing

    NASA Astrophysics Data System (ADS)

    Ehrlich, Jacques

    1993-06-01

    EDDAKS (Event Driven Data Acquisition Kernel System), for the quality control of products created in industrial production processes, is proposed. It is capable of acquiring information about discrete event systems by synchronizing to them via the events. EDDAKS consists of EdObjects, forming a hierarchy, which react to EdEvents and perform processing operations on messages. The hierarchy of EdObjects consists (from bottom up) of the Sensor, the Phase, the Extracter, the Dynamic Spreadsheet, and EDDAKS itself. The first three levels contribute to building the internal representation: a state vector characterizing a product in the course of production. The Dynamic Spreadsheet is a parameterizable processing structure used to perform calculations on a set of internal representations in order to deliver the external representation to the user. A system intended for quality control of the products delivered by a concrete production plant was generated by EDDAKS and used to validate the approach. Processing methods using the multilayer perceptron model were also considered. Two contributions aimed at improving the performance of this network are proposed. One consists of implementing a conjugate gradient method; the effectiveness of this method depends on the determination of an optimum gradient step, which is efficiently calculated by a linear search using a secant algorithm. The other is intended to reduce the connectivity of the network by adapting it to the problem to be solved: it consists of identifying links having little or no activity and destroying them. This activity is determined by evaluating the covariance between each of the inputs of a cell and its output. An experiment in which nonlinear prediction is applied to a civil engineering problem is described.
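
    The covariance-based pruning step can be sketched as follows; the activation traces and the pruning threshold are illustrative assumptions, not values from the paper:

```python
def link_activity(link_inputs, cell_outputs):
    """Covariance between a cell input and its output: the pruning score."""
    n = len(link_inputs)
    mi = sum(link_inputs) / n
    mo = sum(cell_outputs) / n
    return sum((a - mi) * (b - mo)
               for a, b in zip(link_inputs, cell_outputs)) / n

# Toy activation traces over six presentations: one input drives the
# output, the other is constant and therefore carries no information.
out = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85]
active = [0.0, 1.0, 0.1, 0.9, 0.05, 0.95]   # tracks the output
dead = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5]       # zero covariance with it

THRESHOLD = 0.01  # assumed pruning threshold
keep_active = abs(link_activity(active, out)) > THRESHOLD
keep_dead = abs(link_activity(dead, out)) > THRESHOLD
print(keep_active, keep_dead)  # the dead link is destroyed
```

    Links whose score falls below the threshold contribute little to the output and can be removed, shrinking the network to fit the problem.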

  10. Standardization of infrared breast thermogram acquisition protocols and abnormality analysis of breast thermograms

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Gogoi, Usha Rani; Das, Kakali; Ghosh, Anjan Kumar; Bhattacharjee, Debotosh; Majumdar, Gautam

    2016-05-01

    Non-invasive, painless, radiation-free and cost-effective infrared breast thermography (IBT) can make a significant contribution to improving the survival rate of breast cancer patients by detecting the disease early. This paper presents a set of standard breast thermogram acquisition protocols to improve the reliability and accuracy of infrared breast thermograms in early breast cancer detection. Following these protocols, an infrared breast thermogram acquisition setup has been established at the Regional Cancer Centre (RCC) of Government Medical College (AGMC), Tripura, India. Acquisition of a breast thermogram is followed by its interpretation, to identify the presence of any abnormality. However, due to the presence of complex vascular patterns, accurate interpretation of breast thermograms is a very challenging task. The bilateral symmetry of the thermal patterns in each breast thermogram is quantitatively computed by statistical feature analysis. A series of statistical features is extracted from a set of 20 thermograms of both healthy and unhealthy subjects, and the extracted features are analyzed for breast abnormality detection. The key contributions of this paper are: (a) the design of a standard protocol suite for accurate acquisition of breast thermograms, (b) the creation of a new breast thermogram dataset acquired under this protocol suite, and (c) statistical analysis of the thermograms for abnormality detection. This work can minimize the rate of false findings in breast thermograms and thus increase their utility in early breast cancer detection.
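
    The bilateral statistical comparison described above can be sketched as follows. This is a minimal illustration assuming simple first-order statistics (mean, standard deviation, skewness, kurtosis); the paper's actual feature set and decision procedure are not reproduced here:

```python
import numpy as np

def asymmetry_features(left, right):
    """Absolute differences of basic statistics between the two breast
    regions of a thermogram; large values hint at broken bilateral symmetry.

    left, right : 2-D arrays of pixel temperatures (or intensities).
    """
    def stats(region):
        x = region.ravel().astype(float)
        mu, sigma = x.mean(), x.std()
        z = (x - mu) / sigma
        # mean, standard deviation, skewness, kurtosis
        return np.array([mu, sigma, (z ** 3).mean(), (z ** 4).mean()])

    return np.abs(stats(left) - stats(right))
```

    For a healthy subject the two thermal distributions are expected to be similar, so all four differences stay close to zero.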

  11. Proposed military handbook for dynamic data acquisition and analysis - An invitation to review

    NASA Technical Reports Server (NTRS)

    Himelblau, Harry; Wise, James H.; Piersol, Allan G.; Grundvig, Max R.

    1990-01-01

    A draft Military Handbook prepared under the sponsorship of the USAF Space Division is presently being distributed throughout the U.S. for review by the aerospace community. This comprehensive document provides recommended guidelines for the acquisition and analysis of structural dynamics and aeroacoustic data, and is intended to reduce the errors and variability commonly found in flight, ground and laboratory dynamic test measurements. In addition to the usual variety of measurement problems encountered in the definition of dynamic loads, the development of design and test criteria, and the analysis of failures, special emphasis is given to certain state-of-the-art topics, such as pyroshock data acquisition and nonstationary random data analysis.

  12. Identification of phosphorus deficiency responsive proteins in a high phosphorus acquisition soybean (Glycine max) cultivar through proteomic analysis.

    PubMed

    Sha, Aihua; Li, Ming; Yang, Pingfang

    2016-05-01

    As one of the major oil crops, soybean may be seriously affected by phosphorus deficiency in both yield and quality. Understanding the molecular basis of phosphorus uptake and utilization in soybean may help to develop phosphorus (P) efficient cultivars. To this end, we conducted a comparative proteomic analysis of a high P acquisition soybean cultivar, BX10, under low and high P conditions. A total of 61 unique proteins were identified as putative P deficiency responsive proteins. These proteins are involved in carbohydrate metabolism, protein biosynthesis/processing, energy metabolism, cellular processes, environmental defense/interaction, nucleotide metabolism, signal transduction, secondary metabolism and other metabolism related processes. Several proteins involved in energy metabolism, cellular processes, and protein biosynthesis and processing were found to be up-regulated in both shoots and roots, whereas proteins involved in carbohydrate metabolism appeared to be down-regulated. These proteins are potential candidates for improving P acquisition. The findings provide a useful starting point for further research toward a more comprehensive understanding of the molecular mechanisms through which soybean adapts to P deficiency.

  13. Data acquisition and analysis of the UNCOSS underwater explosive neutron sensor

    SciTech Connect

    Carasco, Cedric; Eleon, Cyrille; Perot, Bertrand; Boudergui, Karim; Kondrasovs, Vladimir; Corre, Gwenole; Normand, Stephane; Sannie, Guillaume; Woo, Romuald; Bourbotte, Jean-Michel

    2012-08-15

    The purpose of the FP7 UNCOSS project (Underwater Coastal Sea Surveyor, http://www.uncoss-project.org) is to develop a neutron-based underwater explosive sensor to detect unexploded ordnance lying on the sea bottom. The Associated Particle Technique is used to focus the inspection on a suspicious object located by optical and electromagnetic sensors and to determine whether there is an explosive charge inside. This paper presents the data acquisition electronics and data analysis software developed for the project. A field programmable gate array that digitizes and processes the signal makes it possible to perform precise time-of-flight and gamma-ray energy measurements. The gamma-ray spectra are unfolded into pure elemental count proportions, mainly C, N, O, Fe, Al, Si, and Ca. The C, N, and O count fractions are converted into chemical proportions, taking into account the gamma-ray production cross sections, as well as neutron and photon attenuation in the different shields between the ROV (Remotely Operated Vehicle) and the explosive, such as the explosive's iron shell, seawater, and the ROV envelope. A two-dimensional (2D) barycentric representation of the C, N, and O proportions is built from their chemical ratios, and a 2D likelihood map is built from the associated statistical and systematic uncertainties. The threat level is evaluated from the best matching materials of a database including explosives. (authors)
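
    The 2D barycentric representation of the C, N, and O count proportions can be illustrated with a standard ternary-plot mapping. This is a sketch under assumed triangle geometry; the project's exact orientation and scaling may differ:

```python
import numpy as np

# Corners of the ternary diagram assigned to carbon, nitrogen and oxygen.
CORNERS = np.array([[0.0, 0.0],             # C
                    [1.0, 0.0],             # N
                    [0.5, np.sqrt(3) / 2]]) # O

def cno_to_xy(c, n, o):
    """Map C/N/O proportions onto a 2-D point inside the triangle."""
    fractions = np.array([c, n, o], dtype=float)
    fractions /= fractions.sum()   # normalize so the point stays inside
    return fractions @ CORNERS
```

    Each candidate material in the explosives database occupies a region of this triangle, and the measured point, together with its likelihood map, is matched against those regions.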

  14. Process analysis using ion mobility spectrometry.

    PubMed

    Baumbach, J I

    2006-03-01

    Ion mobility spectrometry, originally used to detect chemical warfare agents, explosives and illegal drugs, is now frequently applied in the field of process analytics. The method combines high sensitivity (detection limits down to the ng to pg per liter and ppb(v)/ppt(v) ranges) and relatively low technical expenditure with high-speed data acquisition. In this paper, the working principles of IMS are summarized with respect to the advantages and disadvantages of the technique. Different ionization techniques, sample introduction methods and preseparation methods are considered. Proven applications of different types of ion mobility spectrometer (IMS) used at ISAS are discussed in detail: monitoring of gas insulated substations, contamination in water, odorization of natural gas, human breath composition and metabolites of bacteria. The example applications discussed relate to purity (gas insulated substations), ecology (contamination of water resources), plant and personnel safety (odorization of natural gas), food quality control (molds and bacteria) and human health (breath analysis). PMID:16132133

  16. Developmental trends in auditory processing can provide early predictions of language acquisition in young infants.

    PubMed

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R; Shao, Jie; Lozoff, Betsy

    2013-03-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with both Auditory Brainstem Response (ABR) and language assessments. At 6 weeks and/or 9 months of age, the infants underwent ABR testing using both a standard hearing screening protocol with 30 dB clicks and a second protocol using click pairs separated by 8, 16, and 64-ms intervals presented at 80 dB. We evaluated the effects of interval duration on ABR latency and amplitude elicited by the second click. At 9 months, language development was assessed via parent report on the Chinese Communicative Development Inventory - Putonghua version (CCDI-P). Wave V latency z-scores of the 64-ms condition at 6 weeks showed strong direct relationships with Wave V latency in the same condition at 9 months. More importantly, shorter Wave V latencies at 9 months showed strong relationships with the CCDI-P composite consisting of phrases understood, gestures, and words produced. Likewise, infants who had greater decreases in Wave V latencies from 6 weeks to 9 months had higher CCDI-P composite scores. Females had higher language development scores and shorter Wave V latencies at both ages than males. Interestingly, when the ABR Wave V latencies at both ages were taken into account, the direct effects of gender on language disappeared. In conclusion, these results support the importance of low-level auditory processing capabilities for early language acquisition in a population of typically developing young infants. Moreover, the auditory brainstem response in this paradigm shows promise as an electrophysiological marker to predict individual differences in language development in young children. PMID:23432827

  17. An underground tale: contribution of microbial activity to plant iron acquisition via ecological processes

    PubMed Central

    Jin, Chong Wei; Ye, Yi Quan; Zheng, Shao Jian

    2014-01-01

    Background Iron (Fe) deficiency in crops is a worldwide agricultural problem. Plants have evolved several strategies to enhance Fe acquisition, but increasing evidence has shown that the intrinsic plant-based strategies alone are insufficient to avoid Fe deficiency in Fe-limited soils. Soil micro-organisms also play a critical role in plant Fe acquisition; however, the mechanisms behind their promotion of Fe acquisition remain largely unknown. Scope This review focuses on the possible mechanisms underlying the promotion of plant Fe acquisition by soil micro-organisms. Conclusions Fe-deficiency-induced root exudates alter the microbial community in the rhizosphere by modifying the physicochemical properties of soil, and/or by their antimicrobial and/or growth-promoting effects. The altered microbial community may in turn benefit plant Fe acquisition via production of siderophores and protons, both of which improve Fe bioavailability in soil, and via hormone generation that triggers the enhancement of Fe uptake capacity in plants. In addition, symbiotic interactions between micro-organisms and host plants could also enhance plant Fe acquisition, possibly including: rhizobium nodulation enhancing plant Fe uptake capacity and mycorrhizal fungal infection enhancing root length and the nutrient acquisition area of the root system, as well as increasing the production of Fe3+ chelators and protons. PMID:24265348

  18. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  19. The Effects of Word Exposure Frequency and Elaboration of Word Processing on Incidental L2 Vocabulary Acquisition through Reading

    ERIC Educational Resources Information Center

    Eckerth, Johannes; Tavakoli, Parveneh

    2012-01-01

    Research on incidental second language (L2) vocabulary acquisition through reading has claimed that repeated encounters with unfamiliar words and the relative elaboration of processing these words facilitate word learning. However, so far both variables have been investigated in isolation. To help close this research gap, the current study…

  20. The Comparative Effects of Processing Instruction and Dictogloss on the Acquisition of the English Passive by Speakers of Turkish

    ERIC Educational Resources Information Center

    Uludag, Onur; Vanpatten, Bill

    2012-01-01

    The current study presents the results of an experiment investigating the effects of processing instruction (PI) and dictogloss (DG) on the acquisition of the English passive voice. Sixty speakers of Turkish studying English at university level were assigned to three groups: one receiving PI, the other receiving DG and the third serving as a…

  1. Combining Contextual and Morphemic Cues Is Beneficial during Incidental Vocabulary Acquisition: Semantic Transparency in Novel Compound Word Processing

    ERIC Educational Resources Information Center

    Brusnighan, Stephen M.; Folk, Jocelyn R.

    2012-01-01

    In two studies, we investigated how skilled readers use contextual and morphemic information in the process of incidental vocabulary acquisition during reading. In Experiment 1, we monitored skilled readers' eye movements while they silently read sentence pairs containing novel and known English compound words that were either semantically…

  2. The Acceleration of Spoken-Word Processing in Children's Native-Language Acquisition: An ERP Cohort Study

    ERIC Educational Resources Information Center

    Ojima, Shiro; Matsuba-Kurita, Hiroko; Nakamura, Naoko; Hagiwara, Hiroko

    2011-01-01

    Healthy adults can identify spoken words at a remarkable speed, by incrementally analyzing word-onset information. It is currently unknown how this adult-level speed of spoken-word processing emerges during children's native-language acquisition. In a picture-word mismatch paradigm, we manipulated the semantic congruency between picture contexts…

  3. Immunological processes underlying the slow acquisition of humoral immunity to malaria.

    PubMed

    Ryg-Cornejo, Victoria; Ly, Ann; Hansen, Diana S

    2016-02-01

    Malaria is one of the most serious infectious diseases, with ~250 million clinical cases annually. Most cases of severe disease are caused by Plasmodium falciparum. The blood stage of the Plasmodium parasite is entirely responsible for malaria-associated pathology. Disease syndromes range from fever to more severe complications, including respiratory distress, metabolic acidosis, renal failure, pulmonary oedema and cerebral malaria. The population most susceptible to severe malaria is children under the age of 5, who have low levels of immunity. It is only after many years of repeated exposure that individuals living in endemic areas develop clinical immunity. This form of protection does not result in sterilizing immunity but prevents clinical episodes by substantially reducing parasite burden. Naturally acquired immunity predominantly targets blood-stage parasites and is known to require antibody responses. A large body of epidemiological evidence suggests that antibodies to Plasmodium antigens are inefficiently generated and rapidly lost in the absence of ongoing exposure, which points to a defect in the development of B cell immunological memory. This review summarizes the main findings to date contributing to our understanding of the cellular processes underlying the slow acquisition of humoral immunity to malaria. Some of the key outstanding questions in the field are discussed.

  4. Sensor Acquisition for Water Utilities: Survey, Down Selection Process, and Technology List

    SciTech Connect

    Alai, M; Glascoe, L; Love, A; Johnson, M; Einfeld, W

    2005-06-29

    The early detection of the biological and chemical contamination of water distribution systems is a necessary capability for securing the nation's water supply. Current and emerging early-detection technology capabilities and shortcomings need to be identified and assessed to provide government agencies and water utilities with an improved methodology for assessing the value of installing these technologies. The Department of Homeland Security (DHS) has tasked a multi-laboratory team to evaluate current and future needs to protect the nation's water distribution infrastructure by supporting an objective evaluation of current and new technologies. The LLNL deliverable from this Operational Technology Demonstration (OTD) was to assist the development of a technology acquisition process for a water distribution early warning system. The technology survey includes a review of previous sensor surveys and current test programs and a compiled database of relevant technologies. In the survey paper we discuss previous efforts by governmental agencies, research organizations, and private companies. We provide a survey of previous sensor studies with regard to the use of Early Warning Systems (EWS) that includes earlier surveys, testing programs, and response studies. The list of sensor technologies was ultimately developed to assist in the recommendation of candidate technologies for laboratory and field testing. A set of recommendations for future sensor selection efforts has been appended to this document, as has a down selection example for a hypothetical water utility.

  5. Parameter identification of process simulation models as a means for knowledge acquisition and technology transfer

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Ifanti, Konstantina

    2012-12-01

    Process simulation models are usually empirical; they therefore have inherent difficulty serving as carriers for knowledge acquisition and technology transfer, since their parameters have no physical meaning that would facilitate verification of their dependence on production conditions. In such cases, a 'black box' regression model or a neural network might be used simply to connect input-output characteristics. In several cases, scientific/mechanistic models may be proved valid, in which case parameter identification is required to find out the independent/explanatory variables and parameters on which each parameter depends. This is a difficult task, since the phenomenological level at which each parameter is defined differs. In this paper, we have developed a methodological framework, in the form of an algorithmic procedure, to solve this problem. The main parts of this procedure are: (i) stratification of relevant knowledge in discrete layers immediately adjacent to the layer to which the initial model under investigation belongs, (ii) design of the ontology corresponding to these layers, (iii) elimination of the less relevant parts of the ontology by thinning, (iv) retrieval of the stronger interrelations between the remaining nodes within the revised ontological network, and (v) parameter identification taking into account the most influential interrelations revealed in (iv). The functionality of this methodology is demonstrated with two representative case examples on wastewater treatment.

  6. Data acquisition and analysis procedures for high-resolution atomic force microscopy in three dimensions.

    PubMed

    Albers, Boris J; Schwendemann, Todd C; Baykara, Mehmet Z; Pilet, Nicolas; Liebmann, Marcus; Altman, Eric I; Schwarz, Udo D

    2009-07-01

    Data acquisition and analysis procedures for noncontact atomic force microscopy that allow the recording of dense three-dimensional (3D) surface force and energy fields with atomic resolution are presented. The main obstacles to producing high-quality 3D force maps are long acquisition times, which lead to data sets being distorted by drift, and tip changes. Both problems are reduced but not eliminated by low-temperature operation. The procedures presented here employ an image-by-image data acquisition scheme that cuts measurement times by avoiding repeated recording of redundant information, while allowing post-acquisition drift correction. All steps are detailed with the example of measurements performed on highly oriented pyrolytic graphite in ultrahigh vacuum at a temperature of 6 K. The area covered spans several unit cells laterally and, vertically, extends from the attractive region to where no force could be measured. The resulting fine data mesh maps piconewton forces with <7 pm lateral and <2 pm vertical resolution. From this 3D data set, two-dimensional cuts along any plane can be plotted. Cuts in a plane parallel to the sample surface show atomic resolution, while cuts along the surface normal visualize how the attractive atomic force fields extend into vacuum. At the same time, maps of the tip-sample potential energy, the lateral tip-sample forces, and the energy dissipated during cantilever oscillation can be produced with identical resolution.
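
    The conversion from the measured force field to the tip-sample potential energy follows from F_z = -dU/dz: integrating the force along the surface normal, with zero energy far from the surface, yields U at every grid point. A minimal sketch under that assumption (the published processing additionally handles drift correction and calibration):

```python
import numpy as np

def force_to_energy(force_grid, z):
    """Potential energy U(x, y, z) from a 3-D vertical-force map F_z(x, y, z).

    force_grid : (nx, ny, nz) vertical force values
    z          : (nz,) tip heights, increasing away from the surface;
                 the force is assumed to have decayed to ~0 at z[-1].
    """
    dz = np.diff(z)
    # Trapezoid areas between consecutive heights.
    seg = 0.5 * (force_grid[..., 1:] + force_grid[..., :-1]) * dz
    U = np.zeros_like(force_grid)
    # U(z_i) = integral from z_i to z_max of F dz', so that F = -dU/dz.
    U[..., :-1] = np.cumsum(seg[..., ::-1], axis=-1)[..., ::-1]
    return U
```

    A two-dimensional cut parallel to the surface is then just a slice of the result, e.g. `U[:, :, k]`.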

  7. In-Depth Analysis of Computer Memory Acquisition Software for Forensic Purposes.

    PubMed

    McDown, Robert J; Varol, Cihan; Carvajal, Leonardo; Chen, Lei

    2016-01-01

    Comparison studies on random access memory (RAM) acquisition tools are either limited in metrics or cover tools designed to be executed on older operating systems. Therefore, this study evaluates seven widely used shareware or freeware/open-source RAM acquisition forensic tools that are compatible with the latest 64-bit Windows operating systems. The tools' user interface capabilities, platform limitations, reporting capabilities, total execution time, shared and proprietary DLLs, modified registry keys, and files invoked during processing were compared. We observed that Windows Memory Reader and Belkasoft's Live RAM Capturer leave the fewest fingerprints in memory when loaded. On the other hand, ProDiscover and FTK Imager perform poorly in memory usage, processing time, DLL usage, and unwanted artifacts introduced to the system. While Belkasoft's Live RAM Capturer is the fastest to obtain an image of the memory, ProDiscover takes the longest time to do the same job. PMID:27405017

  9. Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.

    PubMed

    Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar

    2016-02-01

    In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [(18)F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable, with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real-time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (~500 ps) and energy resolution (~12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5× higher for the LS- and ML-based CPEEA.
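
    A center-of-gravity (COG) position estimate of the kind named above can be sketched on a toy pixelated detector. The function below is a simplified illustration; the actual CPEEA involves calibration, validation thresholds, and crystal lookup not shown here:

```python
import numpy as np

def cog_position(counts, x_coords, y_coords):
    """Estimate the scintillation position (and a crude energy value)
    from the photon counts registered on a pixel array.

    counts   : (ny, nx) photons detected per pixel
    x_coords : (nx,) pixel x positions
    y_coords : (ny,) pixel y positions
    """
    total = counts.sum()                              # energy ~ total light
    x = (counts.sum(axis=0) * x_coords).sum() / total # weighted mean column
    y = (counts.sum(axis=1) * y_coords).sum() / total # weighted mean row
    return x, y, total
```

    COG is the cheapest of the three estimators; the LS and ML approaches instead fit the measured light distribution against expected detector responses, which costs throughput but improves sensitivity, as the abstract reports.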

  10. Wearable system for acquisition, processing and storage of the signal from amperometric glucose sensors.

    PubMed

    Fabietti, P G; Massi Benedetti, M; Bronzo, F; Reboldi, G P; Sarti, E; Brunetti, P

    1991-03-01

    A wearable device for the acquisition, processing and storage of the signal from needle-type glucose sensors has been designed and developed as part of a project aimed at developing a portable artificial pancreas. The device is essential for assessing the operational characteristics of miniaturized sensors in vivo. It can be connected to sensors operating at a constant potential of 0.65 V and generating currents on the order of 10^(-9) A. It is screened and equipped with filters that permit data recording and processing even in the presence of electrical noise, and it can operate with sensors of different characteristics (1-200 nA full scale). The device has been designed to be worn by patients, so its weight and size have been kept to a minimum (250 g; 8.5 x 14.5 x 3.5 cm). It is powered by rechargeable Ni/Cd batteries allowing continuous operation for 72 h. The electronics consist of an analog card with operational amplifiers, and a digital card with a microprocessor (Intel 80C196, MCS-96 family, with internal 16-bit CPU supporting programs written in either C or assembler), a 32 Kb EPROM, and an 8 Kb RAM where the data are stored. The microprocessor can run at either 5 or 10 MHz and features on-chip peripherals: an analog/digital (A/D) converter, a serial port (used to transfer data to a personal computer at the end of the 72 h), high-speed input-output (I/O) units, and two timers. The device is programmed and prepared for operation by means of a second hand-held unit equipped with an LCD display and a 16-key numeric pad. (ABSTRACT TRUNCATED AT 250 WORDS)
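
    The analog-to-digital signal chain described above can be illustrated with a simple scaling function. The parameters are hypothetical: the abstract gives the full-scale range (up to 200 nA) but not the ADC resolution or reference voltage assumed below:

```python
def current_to_counts(i_na, full_scale_na=200.0, adc_bits=10, v_ref=5.0):
    """Map a sensor current in nA to an ADC reading, assuming the analog
    card amplifies linearly so the full-scale current reaches the ADC
    reference voltage.
    """
    v = (i_na / full_scale_na) * v_ref              # amplified voltage
    counts = round((v / v_ref) * (2 ** adc_bits - 1))
    return min(max(counts, 0), 2 ** adc_bits - 1)   # clamp to ADC range
```

    With these assumed parameters a full-scale 200 nA current saturates a 10-bit converter at 1023 counts, and anything above full scale is clamped.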

  11. Computer programs for absolute neutron activation analysis on the nuclear data 6620 data acquisition system

    SciTech Connect

    Wade, J.W.; Emery, J.F.

    1982-03-01

    Five computer programs that provide multielement neutron activation analysis are discussed. The software package was designed for use on the Nuclear Data 6620 Data Acquisition System and interacts with existing Nuclear Data Corporation software. The programs were developed to make use of the capabilities of the 6620 system to analyze large numbers of samples and assist in a large sample workload that had begun in the neutron activation analysis facility of the Oak Ridge Research Reactor. Nuclear Data neutron activation software is unable to perform absolute activation analysis and therefore was inefficient and inadequate for our applications.

  12. Multifunctional data acquisition and analysis and optical sensors: a Bonneville Power Administration (BPA) update

    NASA Astrophysics Data System (ADS)

    Erickson, Dennis C.; Donnelly, Matt K.

    1995-04-01

    The authors present a design concept describing a multifunctional data acquisition and analysis architecture for advanced power system monitoring. The system is tailored to take advantage of the salient features of low energy sensors, particularly optical types. The discussion of the system concept and optical sensors is based on research at BPA and PNL and on progress made at existing BPA installations and other sites in the western power system.

  13. Explaining the "Natural Order of L2 Morpheme Acquisition" in English: A Meta-Analysis of Multiple Determinants

    ERIC Educational Resources Information Center

    Goldschneider, Jennifer M.; DeKeyser, Robert M.

    2005-01-01

    This meta-analysis pools data from 25 years of research on the order of acquisition of English grammatical morphemes by students of English as a second language (ESL). Some researchers have posited a "natural" order of acquisition common to all ESL learners, but no single cause has been shown for this phenomenon. Our study investigated whether a…

  14. The Earthscope USArray Array Network Facility (ANF): Evolution of Data Acquisition, Processing, and Storage Systems

    NASA Astrophysics Data System (ADS)

    Davis, G. A.; Battistuz, B.; Foley, S.; Vernon, F. L.; Eakins, J. A.

    2009-12-01

    Since April 2004 the Earthscope USArray Transportable Array (TA) network has grown to over 400 broadband seismic stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. In total, over 1.7 terabytes per year of 24-bit, 40 samples-per-second seismic and state-of-health data is recorded from the stations. The ANF provides analysts access to real-time and archived data, as well as state-of-health data, metadata, and interactive tools for station engineers and the public via a website. Additional processing and recovery of missing data from on-site recorders (balers) at the stations is performed before the final data is transmitted to the IRIS Data Management Center (DMC). Assembly of the final data set requires additional storage and processing capabilities to combine the real-time data with baler data. The infrastructure supporting these diverse computational and storage needs currently consists of twelve virtualized Sun Solaris Zones executing on nine physical server systems. The servers are protected against failure by redundant power, storage, and networking connections. Storage needs are provided by a hybrid iSCSI and Fiber Channel Storage Area Network (SAN) with access to over 40 terabytes of RAID 5 and 6 storage. Processing tasks are assigned to systems based on parallelization and floating-point calculation needs. On-site buffering at the data-loggers provides protection in case of short-term network or hardware problems, while backup acquisition systems at the San Diego Supercomputer Center and the DMC protect against catastrophic failure of the primary site. Configuration management and monitoring of these systems is accomplished with open-source (Cfengine, Nagios, Solaris Community Software) and commercial tools (Intermapper). In the evolution from a single server to multiple virtualized server instances, Sun Cluster software was evaluated and found to be unstable in our environment. Shared filesystem

  15. Proceedings of the XIIIth IAGA Workshop on Geomagnetic Observatory Instruments, Data Acquisition, and Processing

    USGS Publications Warehouse

    Love, Jeffrey J.

    2009-01-01

    The thirteenth biennial International Association of Geomagnetism and Aeronomy (IAGA) Workshop on Geomagnetic Observatory Instruments, Data Acquisition and Processing was held in the United States for the first time on June 9-18, 2008. Hosted by the U.S. Geological Survey's (USGS) Geomagnetism Program, the workshop's measurement session was held at the Boulder Observatory and the scientific session was held on the campus of the Colorado School of Mines in Golden, Colorado. More than 100 participants came from 36 countries and 6 continents. Preparation for the workshop began when the USGS Geomagnetism Program agreed, at the close of the twelfth workshop in Belsk, Poland, in 2006, to host the next workshop. Working under the leadership of Alan Berarducci, who served as the chairman of the local organizing committee, and Tim White, who served as co-chairman, preparations began in 2007. The Boulder Observatory was extensively renovated and additional observation piers were installed. Meeting space on the Colorado School of Mines campus was arranged, and considerable planning was devoted to managing the many large and small issues that accompany an international meeting. Without the devoted efforts of both Alan and Tim, other Geomagnetism Program staff, and our partners at the Colorado School of Mines, the workshop simply would not have occurred. We express our thanks to Jill McCarthy, the USGS Central Region Geologic Hazards Team Chief Scientist; Carol A. Finn, the Group Leader of the USGS Geomagnetism Program; the USGS International Office; and Melody Francisco of the Office of Special Programs and Continuing Education of the Colorado School of Mines. We also thank the student employees that the Geomagnetism Program has had over the years leading up to the workshop. For preparation of the proceedings, thanks go to Eddie and Tim. And, finally, we thank our sponsors, the USGS, IAGA, and the Colorado School of Mines.

  16. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    SciTech Connect

    Schickert, Martin

    2015-03-31

    Ultrasonic testing systems using transducer arrays and SAFT (Synthetic Aperture Focusing Technique) reconstruction allow imaging of the internal structure of concrete elements. With one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting detection and localization of objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized, which differ in terms of the measuring and computational effort and the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm, implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (FMC/TFM), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained on test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS, which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups, is used for the measurements. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results for image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of the application characteristics of the SAFT variants.
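    The single-channel SAFT reconstruction described above is, at its core, a delay-and-sum over a synthetic aperture. The following sketch illustrates the principle on a synthetic 1-D aperture with a single point scatterer; the sound speed, sampling rate, and geometry are illustrative assumptions, not the FLEXUS system's actual parameters.

```python
import numpy as np

# Illustrative parameters (not the FLEXUS system's values)
c = 2500.0                        # assumed sound speed in concrete, m/s
fs = 1e6                          # sampling rate, Hz
xs = np.linspace(0.0, 0.4, 48)    # 48 transducer positions along the surface, m
target = (0.2, 0.15)              # point scatterer at x = 0.2 m, depth 0.15 m

# Synthetic A-scans: a unit pulse at the round-trip travel time per position
n_samples = 600
ascans = np.zeros((len(xs), n_samples))
for i, x in enumerate(xs):
    t = 2.0 * np.hypot(x - target[0], target[1]) / c   # round-trip time
    ascans[i, int(round(t * fs))] = 1.0

# Single-channel SAFT: delay-and-sum over the synthetic aperture
gx = np.linspace(0.0, 0.4, 81)    # image grid, 5 mm spacing
gz = np.linspace(0.01, 0.3, 59)
image = np.zeros((len(gz), len(gx)))
for i, x in enumerate(xs):
    for iz, z in enumerate(gz):
        for ix, px in enumerate(gx):
            t = 2.0 * np.hypot(px - x, z) / c
            k = int(round(t * fs))
            if k < n_samples:
                image[iz, ix] += ascans[i, k]

# The reconstruction peaks at the true scatterer position (0.2, 0.15)
iz, ix = np.unravel_index(np.argmax(image), image.shape)
print(gx[ix], gz[iz])
```

    At the correct grid point every A-scan contributes coherently, so the image amplitude equals the number of aperture positions; everywhere else the contributions disperse.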

  17. Using predictive uncertainty analysis to optimise tracer test design and data acquisition

    NASA Astrophysics Data System (ADS)

    Wallis, Ilka; Moore, Catherine; Post, Vincent; Wolf, Leif; Martens, Evelien; Prommer, Henning

    2014-07-01

    processes, followed by methane. Temperature data was assessed as the least informative of the solute tracers. However, taking the costs of data acquisition into account, it could be shown that temperature data, when used in conjunction with other tracers, was a valuable and cost-effective marker species due to temperature's low cost-to-worth ratio. In contrast, the high cost of acquiring methane data compared to its muted worth highlighted methane's unfavourable return on investment. Optimal monitoring bore positions, as well as optimal numbers of bores, were also established for the investigated injection site. The proposed tracer test optimisation applies commonly used groundwater flow and transport models in conjunction with publicly available tools for predictive uncertainty analysis to provide modellers and practitioners with a powerful yet efficient and cost-effective tool which is generally applicable and easily transferable from the present study to many applications beyond the case study of injection of treated CSG produced water.

  18. A real time dynamic data acquisition and processing system for velocity, density, and total temperature fluctuation measurements

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1991-01-01

    The real-time Dynamic Data Acquisition and Processing System (DDAPS) is described, which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in the context of the wind tunnel environment. The DDAPS replaces both a recording mechanism and a separate data processing system. DDAPS receives input from hot wire anemometers. Amplifiers and filters condition the signals with computer-controlled modules. The analog signals are simultaneously digitized and digitally recorded on disk. Automatic acquisition collects the necessary calibration and environment data. Hot wire sensitivities are generated and applied to the hot wire data to compute fluctuations. Raw and processed data are presented on demand. The interface to DDAPS is described, along with the internal mechanisms of DDAPS. A summary of operations relevant to the use of DDAPS is also provided.

  19. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  20. Video-task acquisition in rhesus monkeys (Macaca mulatta) and chimpanzees (Pan troglodytes): a comparative analysis.

    PubMed

    Hopkins, W D; Washburn, D A; Hyatt, C W

    1996-04-01

    This study describes video-task acquisition in two nonhuman primate species. The subjects were seven rhesus monkeys (Macaca mulatta) and seven chimpanzees (Pan troglodytes). All subjects were trained to manipulate a joystick which controlled a cursor displayed on a computer monitor. Two criterion levels were used: one based on conceptual knowledge of the task and one based on motor performance. Chimpanzees and rhesus monkeys attained criterion in a comparable number of trials using a conceptually based criterion. However, using a criterion based on motor performance, chimpanzees reached criterion significantly faster than rhesus monkeys. Analysis of error patterns and latency indicated that the rhesus monkeys had a larger asymmetry in response bias and were significantly slower in responding than the chimpanzees. The results are discussed in terms of the relation between object manipulation skills and video-task acquisition.

  1. Video-task acquisition in rhesus monkeys (Macaca mulatta) and chimpanzees (Pan troglodytes): a comparative analysis

    NASA Technical Reports Server (NTRS)

    Hopkins, W. D.; Washburn, D. A.; Hyatt, C. W.; Rumbaugh, D. M. (Principal Investigator)

    1996-01-01

    This study describes video-task acquisition in two nonhuman primate species. The subjects were seven rhesus monkeys (Macaca mulatta) and seven chimpanzees (Pan troglodytes). All subjects were trained to manipulate a joystick which controlled a cursor displayed on a computer monitor. Two criterion levels were used: one based on conceptual knowledge of the task and one based on motor performance. Chimpanzees and rhesus monkeys attained criterion in a comparable number of trials using a conceptually based criterion. However, using a criterion based on motor performance, chimpanzees reached criterion significantly faster than rhesus monkeys. Analysis of error patterns and latency indicated that the rhesus monkeys had a larger asymmetry in response bias and were significantly slower in responding than the chimpanzees. The results are discussed in terms of the relation between object manipulation skills and video-task acquisition.

  2. A Fast VME Data Acquisition System for Spill Analysis and Beam Loss Measurement

    NASA Astrophysics Data System (ADS)

    Hoffmann, T.; Liakin, D. A.; Forck, P.

    2002-12-01

    Particle counters monitor beam loss and slowly extracted currents at the heavy-ion synchrotron (SIS) at GSI. For these devices a new data acquisition system has been developed with the main intention of combining beam loss measurement, spill analysis, spill structure measurement, and matrix switching functionality in a single assembly. To provide a reasonable digital selection of counters at significant locations, a modular VME setup based on the GSI data acquisition software MBS (Multi Branch System) was chosen. An overview of the design regarding the digital electronics and the infrastructure is given. In addition to the high performance of the hardware used, of main interest is the development of a user-friendly software interface for hardware control, data evaluation, and presentation to the operator.

  3. Hospital integration and vertical consolidation: an analysis of acquisitions in New York State.

    PubMed

    Huckman, Robert S

    2006-01-01

    While prior studies tend to view hospital integration through the lens of horizontal consolidation, I provide an analysis of its vertical aspects. I examine the effect of hospital acquisitions in New York State on the distribution of market share for major cardiac procedures across providers in target markets. I find evidence of benefits to acquirers via business stealing, with the resulting redistribution of volume across providers having small effects, if any, on total welfare with respect to cardiac care. The results of this analysis -- along with similar assessments for other services -- can be incorporated into future studies of hospital consolidation.

  4. A software surety analysis process

    SciTech Connect

    Trauth, S.; Tempel, P.

    1995-11-01

    As part of the High Consequence System Surety project, this work was undertaken to explore one approach to conducting a surety theme analysis for a software-driven system. Originally, plans were to develop a theoretical approach to the analysis, and then to validate and refine this process by applying it to the software being developed for the Weight and Leak Check System (WALS), an automated nuclear weapon component handling system. As with the development of the higher-level High Consequence System Surety Process, this work was not completed due to changes in funding levels. This document describes the software analysis process, discusses its application in a software environment, and outlines next steps that could be taken to further develop and apply the approach to other projects.

  5. Multispectral integral imaging acquisition and processing using a monochrome camera and a liquid crystal tunable filter.

    PubMed

    Latorre-Carmona, Pedro; Sánchez-Ortiga, Emilio; Xiao, Xiao; Pla, Filiberto; Martínez-Corral, Manuel; Navarro, Héctor; Saavedra, Genaro; Javidi, Bahram

    2012-11-01

    This paper presents an acquisition system and a procedure to capture 3D scenes in different spectral bands. The acquisition system is formed by a monochrome camera and a Liquid Crystal Tunable Filter (LCTF) that allows images to be acquired at different spectral bands in the [480, 680] nm wavelength interval. The Synthetic Aperture Integral Imaging acquisition technique is used to obtain the elemental images for each wavelength. These elemental images are used to computationally obtain the reconstruction planes of the 3D scene at different depth planes. The 3D profile of the acquired scene is also obtained by minimizing the variance of the contributions of the elemental images at each image pixel. Experimental results show the viability of recovering the 3D multispectral information of the scene. Integration of 3D and multispectral information could have important benefits in different areas, including skin cancer detection, remote sensing, and pattern recognition, among others.
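    The depth-selection rule mentioned in the abstract, picking the depth plane that minimizes the variance of the elemental-image contributions at each pixel, can be illustrated with a toy 1-D example. The scene, camera count, and disparity below are invented for the sketch and are not the paper's actual setup.

```python
import numpy as np

# Synthetic setup (illustrative): a 1-D "scene" imaged from K laterally
# shifted camera positions, with a disparity of 4 pixels per camera step
# at the true depth plane.
K = 9
true_disparity = 4
width = 64
scene = np.zeros(width)
scene[30:34] = 1.0   # a small bright object

# Elemental images: each camera sees the scene shifted by k * disparity
elementals = np.stack([np.roll(scene, k * true_disparity) for k in range(K)])

def contributions(disparity):
    """Back-shift each elemental image by the disparity assumed for a
    candidate depth plane and stack the per-camera contributions."""
    return np.stack([np.roll(elementals[k], -k * disparity) for k in range(K)])

# At the correct disparity (correct depth plane) all contributions align,
# so their per-pixel variance vanishes -- the selection rule of the paper.
scores = {d: contributions(d).var(axis=0).sum() for d in range(0, 8)}
best = min(scores, key=scores.get)
print(best)   # 4
```

    The reconstruction of a plane is then simply the mean over the aligned contributions at the winning disparity.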

  6. Analysis and decision document in support of acquisition of steam supply for the Hanford 200 Area

    SciTech Connect

    Brown, D.R.; Daellenbach, K.K.; Hendrickson, P.L.; Kavanaugh, D.C.; Reilly, R.W.; Shankle, D.L.; Smith, S.A.; Weakley, S.A.; Williams, T.A.; Grant, T.F.

    1992-02-01

    The US Department of Energy (DOE) is now evaluating its facility requirements in support of its cleanup mission at Hanford. One of the early findings is that the 200-Area steam plants, constructed in 1943, will not meet future space heating and process needs. Because the 200 Area will serve as the primary area for waste treatment and long-term storage, a reliable steam supply is a critical element of Hanford operations. This Analysis and Decision Document (ADD) is a preliminary review of the steam supply options available to the DOE. The ADD contains a comprehensive evaluation of the two major acquisition options: line-term versus privatization. It addresses the life-cycle costs associated with each alternative, as well as factors such as contracting requirements and the impact of market, safety, security, and regulatory issues. Specifically, this ADD documents current and future steam requirements for the 200 Area, describes alternatives available to DOE for meeting these requirements, and compares the alternatives across a number of decision criteria, including life-cycle cost. DOE has currently limited the ADD evaluation alternatives to replacing central steam plants rather than expanding the study to include alternative heat sources, such as a distributed network of boilers or heat pumps. Thirteen project alternatives were analyzed in the ADD. One of the alternatives was the rehabilitation of the existing 200-East coal-fired facility. The other twelve alternatives are combinations of (1) coal- or gas-fueled plants, (2) steam-only or cogeneration facilities, (3) primary or secondary cogeneration of electricity, and (4) public or private ownership.

  7. Data acquisition and analysis of the UNCOSS underwater explosive neutron sensor

    SciTech Connect

    Carasco, C.; Eleon, C.; Perot, B.; Boudergui, K.; Kondrasovs, V.; Corre, G.; Normand, S.; Sannie, G.; Woo, R.; Bourbotte, J. M.

    2011-07-01

    The purpose of the FP7 UNCOSS project (Underwater Coastal Sea Surveyor, http://www.uncoss-project.org) is to develop a neutron-based underwater explosive sensor to detect unexploded ordnance lying on the sea bottom. The Associated Particle Technique is used to focus the inspection on a suspicious object located by optical and electromagnetic sensors and to determine if there is an explosive charge inside. This paper presents the data acquisition electronics and data analysis software which have been developed for this project. The electronics digitize and process the signal in real-time based on a field programmable gate array structure to perform precise time-of-flight and gamma-ray energy measurements. UNCOSS software offers the basic tools to analyze the time-of-flight and energy spectra of the interrogated object. It allows the gamma-ray spectrum to be unfolded into pure elemental count proportions, mainly C, N, O, Fe, Al, Si, and Ca. The C, N, and O count fractions are converted into chemical proportions by taking into account the gamma-ray production cross sections, as well as neutron and photon attenuation in the different shields between the ROV (Remotely Operated Vehicle) and the explosive, such as the explosive's iron shell, seawater, and the ROV envelope. These chemical ratios are plotted in a two-dimensional (2D) barycentric representation to position the measured point with respect to common explosives. The systematic uncertainty due to the above attenuation effects and the counting statistical fluctuations are combined with a Monte Carlo method to provide a 3D uncertainty area in a barycentric plot, which allows the most probable detected materials to be determined in order to decide on the presence of explosive. (authors)
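    The conversion of C, N, O count fractions into chemical proportions and their placement in a 2D barycentric plot can be sketched as follows. The count fractions and the sensitivity weights (standing in for the gamma-ray production cross sections and attenuation corrections) are assumed values for illustration, not UNCOSS data.

```python
import numpy as np

# Hypothetical count fractions for C, N, O from an unfolded gamma spectrum,
# and illustrative sensitivity weights standing in for the production cross
# sections and attenuation corrections (assumed values).
counts = np.array([0.50, 0.15, 0.35])       # C, N, O count fractions
sensitivity = np.array([1.0, 0.6, 0.8])     # counts per atom (assumed)

atoms = counts / sensitivity                # relative atomic abundances
chem = atoms / atoms.sum()                  # chemical (atomic) proportions

# Map the (C, N, O) fractions to 2-D coordinates of a barycentric triangle
# with vertices C = (0, 0), N = (1, 0), O = (0.5, sqrt(3)/2).
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
point = chem @ verts
print(chem.round(3), point.round(3))
```

    A measured point can then be compared against the barycentric positions of known explosives, with the Monte Carlo uncertainty area drawn around it.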

  8. The history of imitation in learning theory: the language acquisition process.

    PubMed

    Kymissis, E; Poulson, C L

    1990-09-01

    The concept of imitation has undergone different analyses in the hands of different learning theorists throughout the history of psychology. From Thorndike's connectionism to Pavlov's classical conditioning, Hull's monistic theory, Mowrer's two-factor theory, and Skinner's operant theory, there have been several divergent accounts of the conditions that produce imitation and the conditions under which imitation itself may facilitate language acquisition. In tracing the roots of the concept of imitation in the history of learning theory, the authors conclude that generalized imitation, as defined and analyzed by operant learning theorists, is a sufficiently robust formulation of learned imitation to facilitate a behavior-analytic account of first-language acquisition.

  9. Dynamic analysis of process reactors

    SciTech Connect

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1996-12-31

    The process design for integration of advanced gasifiers into combined-cycle facilities requires a dynamic analysis tool for predicting gasifier performance and stability. Such a tool provides an understanding of both process reactions and the interaction of process components. To illustrate the utility of such a tool, a Gasifier Dynamic Model (GDM) was developed at the Morgantown Energy Technology Center (METC) to investigate alternative designs and operational scenarios during process design development. Empirical data and first principles were combined into steady-state process models to develop sensitivity parameters around a nominal process design condition. These gain factors were then coupled with time-dependent functions for process mass and energy inventories to develop the dynamic model (GDM). Engineering calculations performed in the GDM were used to predict process responses such as gas make, flow, pressure, and temperature. Small research facilities were constructed and operated to validate both the steady-state and dynamic models. GDM predictions provided engineers with insights into the design integrity and operational safety of the reactions, components, and control elements.
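    The gain-factor-plus-inventory structure described above can be illustrated with a minimal lumped model: a linearized outflow gain around a nominal operating point coupled to a time-dependent gas mass inventory. All parameter values below are invented for the sketch and are not METC model values.

```python
# A minimal sketch of the gain-factor-plus-inventory approach: a steady-state
# sensitivity (gain) around a nominal operating point, coupled with a mass
# inventory integrated in time to give a dynamic pressure response.
dt = 0.1                    # s, Euler integration step
V = 10.0                    # m^3, freeboard volume (assumed)
RT_OVER_M = 1.0e5           # J/kg, lumped R*T/M for the gas (assumed)
K_OUT = 2.0e-5              # kg/(s*Pa), linearized outflow gain (assumed)
P_NOM = 1.0e6               # Pa, nominal operating pressure

m = P_NOM * V / RT_OVER_M   # initial gas mass inventory, kg
feed = K_OUT * P_NOM        # steady inflow matching the nominal outflow

history = []
for step in range(2000):
    if step == 500:
        feed *= 1.10                    # +10% feed disturbance at t = 50 s
    P = m * RT_OVER_M / V               # ideal-gas pressure from inventory
    m += (feed - K_OUT * P) * dt        # mass balance
    history.append(P)

# The pressure settles at a new steady state ~10% above nominal,
# with a time constant V / (K_OUT * RT_OVER_M) = 5 s.
print(history[0], history[-1])
```

    The same pattern, a steady-state gain plus an inventory integrator, extends to energy balances and multiple coupled components.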

  10. Logistics Process Analysis Tool

    SciTech Connect

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine, an application written in the ANL Repast Simphony framework, was used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  12. Optionality in Second Language Acquisition: A Generative, Processing-Oriented Account

    ERIC Educational Resources Information Center

    Truscott, John

    2006-01-01

    The simultaneous presence in a learner's grammar of two features that should be mutually exclusive (optionality) typifies second language acquisition. But generative approaches have no good means of accommodating the phenomenon. The paper proposes one approach, based on Truscott and Sharwood Smith's (2004) MOGUL framework. In this framework,…

  13. A Tale of Two Career Paths: The Process of Status Acquisition by a New Organizational Unit.

    ERIC Educational Resources Information Center

    Briody, Elizabeth K.; And Others

    1995-01-01

    Interviews with 39 sales/service employees of General Motors' new Telemarketing Assistance Group identified factors influencing the acquisition of status in organizations: reorganization, managerial decision making, employee interpretations and reactions, and community consensus. The status of organizational units was related to career mobility.…

  14. The Representation and Processing of Familiar Faces in Dyslexia: Differences in Age of Acquisition Effects

    ERIC Educational Resources Information Center

    Smith-Spark, James H.; Moore, Viv

    2009-01-01

    Two under-explored areas of developmental dyslexia research, face naming and age of acquisition (AoA), were investigated. Eighteen dyslexic and 18 non-dyslexic university students named the faces of 50 well-known celebrities, matched for facial distinctiveness and familiarity. Twenty-five of the famous people were learned early in life, while the…

  15. Directed Blogging with Community College ESL Students: Its Effects on Awareness of Language Acquisition Processes

    ERIC Educational Resources Information Center

    Johnson, Cathy

    2012-01-01

    English as a Second Language (ESL) students often have problems progressing in their acquisition of the language and frequently do not know how to solve this dilemma. Many of them think of their second language studies as just another school subject that they must pass in order to move on to the next level, so few of them realize the metacognitive…

  16. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data, and GOSAT validation lidar data. DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe some improvements for higher performance and higher data usability. DAS now automatically calculates molecule number density profiles from GPS radiosonde upper-air meteorological data and the U.S. Standard Atmosphere model. Predicted ozone density profile images above Saga city are also calculated using the Meteorological Research Institute (MRI) chemistry-climate model version 2 for comparison with actual ozone DIAL data.
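    The molecule number density calculation mentioned above reduces, at each pressure level, to the ideal gas law n = P / (k_B T). A minimal sketch (the sea-level values are the U.S. Standard Atmosphere reference conditions; the function name is ours):

```python
# Molecular number density from pressure and temperature via the ideal gas
# law, n = P / (k_B * T) -- the kind of profile derived from radiosonde
# data or the U.S. Standard Atmosphere.
K_B = 1.380649e-23   # J/K, Boltzmann constant

def number_density(p_pa: float, t_k: float) -> float:
    """Molecules per m^3 at pressure p_pa (Pa) and temperature t_k (K)."""
    return p_pa / (K_B * t_k)

# Sea level, U.S. Standard Atmosphere: 101325 Pa, 288.15 K
n0 = number_density(101325.0, 288.15)
print(f"{n0:.3e}")   # ~2.547e+25 m^-3
```

    Applied level by level to a radiosonde sounding, this yields the number density profile used in the lidar analysis.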

  17. The influence of the microscope lamp filament colour temperature on the process of digital images of histological slides acquisition standardization

    PubMed Central

    2014-01-01

    Background The aim of this study is to compare digital images of tissue biopsies captured with an optical microscope using the bright-field technique under various light conditions. The range of colour variation in tissue samples immunohistochemically stained with 3,3'-Diaminobenzidine and Haematoxylin is immense and comes from various sources. One of them is inadequate setting of the camera's white balance to the microscope's light colour temperature. Although this type of error can easily be handled during image acquisition, it can also be eliminated afterwards with colour adjustment algorithms. The examination of the dependence of colour variation on the microscope's light temperature and the settings of the camera was performed as introductory research for the process of automatic colour standardization. Methods Six fields of view with empty space among the tissue samples were selected for analysis. Each field of view was acquired 225 times with various microscope light temperature and camera white balance settings. Fourteen randomly chosen images were corrected and compared with the reference image by the following methods: Mean Square Error, Structural SIMilarity, and visual assessment by the viewer. Results For two types of backgrounds and two types of objects, the statistical image descriptors (range, median, mean and its standard deviation of chromaticity on the a and b channels from the CIELab colour space, and luminance L) and the local colour variability for the objects' specific areas were calculated. The results were averaged over the 6 images acquired with the same light conditions and camera settings for each sample.
    Conclusions The analysis of the results leads to the following conclusions: (1) the images collected with the white balance setting adjusted to the light colour temperature cluster in a certain area of the chromatic space, (2) the process of white balance correction for images collected with white balance camera settings not matched to the light temperature
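    The two objective comparison measures named in the abstract, Mean Square Error and Structural SIMilarity, can be sketched as follows. The SSIM here is the simplified global form with the standard constants for 8-bit images, not the windowed implementation a real evaluation would use, and the test images are synthetic stand-ins for the histological slides.

```python
import numpy as np

def mse(a, b):
    """Mean Square Error between two images."""
    return np.mean((a.astype(float) - b.astype(float)) ** 2)

def ssim_global(a, b, data_range=255.0):
    """Simplified global SSIM (single window covering the whole image)."""
    a = a.astype(float)
    b = b.astype(float)
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    ma, mb = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - ma) * (b - mb)).mean()
    return ((2 * ma * mb + c1) * (2 * cov + c2)) / \
           ((ma ** 2 + mb ** 2 + c1) * (va + vb + c2))

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64))          # stand-in reference image
warm = np.clip(ref + 20, 0, 255)              # a uniform colour-cast-like shift

print(mse(ref, ref), round(ssim_global(ref, ref), 3))   # 0.0 1.0
print(mse(ref, warm), round(ssim_global(ref, warm), 3))
```

    A colour-cast correction would be judged successful when it drives the MSE toward zero and the SSIM toward one against the reference acquisition.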

  18. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
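    The "familiar Laplacian pyramid" mentioned above can be sketched in one dimension: each level stores the difference between the signal and an upsampled coarse version, so summing the expanded levels restores the original. The nearest-neighbour up/downsampling used here is a deliberate simplification of the usual Gaussian filtering, chosen so the reconstruction is exact.

```python
import numpy as np

def build_pyramid(x, levels):
    """Laplacian pyramid with nearest-neighbour 2x decimation/repetition."""
    pyr = []
    for _ in range(levels):
        coarse = x[::2]
        up = np.repeat(coarse, 2)[: len(x)]
        pyr.append(x - up)        # Laplacian (detail) level
        x = coarse
    pyr.append(x)                 # residual low-pass level
    return pyr

def reconstruct(pyr):
    """Invert the pyramid: expand the residual and add back each detail."""
    x = pyr[-1]
    for detail in reversed(pyr[:-1]):
        x = np.repeat(x, 2)[: len(detail)] + detail
    return x

sig = np.sin(np.linspace(0, 4 * np.pi, 64))
pyr = build_pyramid(sig, 3)
print(np.allclose(reconstruct(pyr), sig))   # True
```

    The detail levels are sparse for smooth signals, which is what makes the pyramid attractive as a focal-plane coding scheme.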

  19. The Probabilistic Analysis of Language Acquisition: Theoretical, Computational, and Experimental Analysis

    ERIC Educational Resources Information Center

    Hsu, Anne S.; Chater, Nick; Vitanyi, Paul M. B.

    2011-01-01

    There is much debate over the degree to which language learning is governed by innate language-specific biases, or acquired through cognition-general principles. Here we examine the probabilistic language acquisition hypothesis on three levels: We outline a novel theoretical result showing that it is possible to learn the exact "generative model"…

  20. Data acquisition, preprocessing and analysis for the Virginia Tech OLYMPUS experiment

    NASA Technical Reports Server (NTRS)

    Remaklus, P. Will

    1991-01-01

    Virginia Tech is conducting a slant path propagation experiment using the 12, 20, and 30 GHz OLYMPUS beacons. Beacon signal measurements are made using separate terminals for each frequency. In addition, short baseline diversity measurements are collected through a mobile 20 GHz terminal. Data collection is performed with a custom data acquisition and control system. Raw data are preprocessed to remove equipment biases and discontinuities prior to analysis. Preprocessed data are then statistically analyzed to investigate parameters such as frequency scaling, fade slope and duration, and scintillation intensity.
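    Two of the statistics listed, fade slope and fade duration, can be sketched from a synthetic beacon attenuation time series. The threshold, sample rate, and fade shape below are assumed values for illustration, not the OLYMPUS data.

```python
import numpy as np

# Synthetic beacon attenuation: a ramp fade between t = 20 s and 35 s,
# capped at 5 dB (illustrative values only).
fs = 1.0                                  # samples per second (assumed)
t = np.arange(0, 60, 1 / fs)
atten = np.where((t >= 20) & (t < 35), 0.4 * (t - 20), 0.0)   # dB
atten = np.minimum(atten, 5.0)

threshold = 3.0                           # dB fade threshold (assumed)
above = atten >= threshold
fade_duration = above.sum() / fs          # total time above threshold, s
fade_slope = np.diff(atten) * fs          # dB/s between successive samples

print(fade_duration, fade_slope.max())
```

    In a real analysis these statistics would be accumulated over many events and reported as distributions per frequency and fade depth.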

  1. Proteomic Analysis of Embryogenesis and the Acquisition of Seed Dormancy in Norway Maple (Acer platanoides L.)

    PubMed Central

    Staszak, Aleksandra Maria; Pawłowski, Tomasz Andrzej

    2014-01-01

    The proteome of zygotic embryos of Acer platanoides L. was analyzed via high-resolution 2D-SDS-PAGE and MS/MS in order to: (1) identify significant physiological processes associated with embryo development; and (2) identify changes in the proteome of the embryo associated with the acquisition of seed dormancy. Seventeen spots were identified as associated with morphogenesis at 10 to 13 weeks after flowering (WAF). Thirty-three spots were associated with maturation of the embryo at 14 to 22 WAF. The greatest changes in protein abundance occurred at 22 WAF, when seeds become fully mature. Overall, the stage of morphogenesis was characterized by changes in the abundance of proteins (tubulins and actin) associated with the growth and development of the embryo. Enzymes related to energy supply were especially elevated, most likely due to the energy demand associated with rapid growth and cell division. The stage of maturation is crucial to the establishment of seed dormancy and is associated with a higher abundance of proteins involved in genetic information processing, energy and carbon metabolism and cellular and antioxidant processes. Results indicated that a glycine-rich RNA-binding protein and proteasome proteins may be directly involved in dormancy acquisition control, and future studies are warranted to verify this association. PMID:24941250

  3. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight to overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.

  4. Technical drilling data acquisition and processing with an integrated computer system

    SciTech Connect

    Chevallier, J.J.; Quetier, F.P.; Marshall, D.W.

    1986-04-01

    Sedco Forex has developed an integrated computer system to enhance the technical performance of the company at various operational levels and to increase the understanding and knowledge of the drill crews. This paper describes the system and how it is used for recording and processing drilling data at the rig site, for associated technical analyses, and for well design, planning, and drilling performance studies at the operational centers. Some capabilities related to the statistical analysis of the company's operational records are also described, and future development of rig computing systems for drilling applications and management tasks is discussed.

  5. DIADEM--a system for the interactive data acquisition and processing in an analytical laboratory.

    PubMed

    Peters, F; Teschner, W

    1979-09-01

A conversational program for the acquisition of experimental data in a multi-user, multi-instrument computer system is described. It assists the researcher when recording on-line data. Due to the simple structure of the dialogue, no special knowledge of computer handling is required of the experimenter. Although the experimental methods are versatile, a uniform concept of the dialogue and the file structure is realized. PMID:487779

  6. The health hazard assessment process in support of joint weapon system acquisitions.

    PubMed

    Kluchinsky, Timothy A; Jokel, Charles R; Cambre, John V; Goddard, Donald E; Batts, Robert W

    2013-01-01

    Since 1981, the Army's HHA Program has provided an invaluable service to combat developers and materiel program managers by providing recommendations designed to eliminate or control health hazards associated with materiel and weapon systems. The program has consistently strived to improve its services by providing more meaningful and efficient assistance to the acquisition community. In the uncertain fiscal times ahead, the Army's HHA Program will continue to provide valuable and cost-effective solutions to mitigate the health risks of weapons systems.

  7. Hippocampal Context Processing during Acquisition of a Predictive Learning Task Is Associated with Renewal in Extinction Recall.

    PubMed

    Lissek, Silke; Glaubitz, Benjamin; Schmidt-Wilcke, Tobias; Tegenthoff, Martin

    2016-05-01

    Renewal is defined as the recovery of an extinguished response if extinction and retrieval contexts differ. The context dependency of extinction, as demonstrated by renewal, has important implications for extinction-based therapies. Persons showing renewal (REN) exhibit higher hippocampal activation during extinction in associative learning than those without renewal (NOREN), demonstrating hippocampal context processing, and recruit ventromedial pFC in retrieval. Apart from these findings, brain processes generating renewal remain largely unknown. Conceivably, processing differences in task-relevant brain regions that ultimately lead to renewal may occur already in initial acquisition of associations. Therefore, in two fMRI studies, we investigated overall brain activation and hippocampal activation in REN and NOREN during acquisition of an associative learning task in response to presentation of a context alone or combined with a cue. Results of two studies demonstrated significant activation differences between the groups: In Study 1, a support vector machine classifier correctly assigned participants' brain activation patterns to REN and NOREN groups, respectively. In Study 2, REN and NOREN showed similar hippocampal involvement during context-only presentation, suggesting processing of novelty, whereas overall hippocampal activation to the context-cue compound, suggesting compound encoding, was higher in REN. Positive correlations between hippocampal activation and renewal level indicated more prominent hippocampal processing in REN. Results suggest that hippocampal processing of the context-cue compound rather than of context only during initial learning is related to a subsequent renewal effect. Presumably, REN participants use distinct encoding strategies during acquisition of context-related tasks, which reflect in their brain activation patterns and contribute to a renewal effect. PMID:26807840
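The abstract does not specify how the support vector machine classifier in Study 1 was implemented; the sketch below shows the general form of such a group classification on synthetic "activation pattern" feature vectors, using scikit-learn with leave-one-out cross-validation. All data and parameters here are hypothetical:

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic activation patterns: 20 participants x 50 voxels, with a
# small mean shift between the two groups (0 = NOREN, 1 = REN).
X = np.vstack([rng.normal(0.0, 1.0, (10, 50)),
               rng.normal(0.8, 1.0, (10, 50))])
y = np.array([0] * 10 + [1] * 10)

# Linear SVM scored by leave-one-out cross-validation, as is common
# for small-sample neuroimaging group classification.
clf = SVC(kernel="linear", C=1.0)
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
```

In practice the feature vectors would be voxel-wise activation estimates from the fMRI acquisition-phase contrasts rather than random draws.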

  8. Data acquisition and analysis for the Fermilab Collider RunII

    SciTech Connect

    Paul L. G. Lebrun et al.

    2004-07-07

Operating and improving the understanding of the Fermilab Accelerator Complex for the colliding beam experiments requires advanced software methods and tools. The Shot Data Acquisition and Analysis (SDA) system has been developed to fulfill this need. The SDA takes a standard set of critical data at relevant stages during the complex series of beam manipulations leading to √s ≈ 2 TeV collisions. Data is stored in a relational database, and is served to programs and users via Web based tools. Summary tables are systematically generated during and after a store. Written entirely in Java, SDA supports both interactive tools and application interfaces used for in-depth analysis. In this talk, we present the architecture and describe some of our analysis tools. We also present some results on the recent Tevatron performance as illustrations of the capabilities of SDA.

  9. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables of the command and control process and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  10. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  11. Design and demonstrate the performance of cryogenic components representative of space vehicles: Start basket liquid acquisition device performance analysis

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The objective was to design, fabricate and test an integrated cryogenic test article incorporating both fluid and thermal propellant management subsystems. A 2.2 m (87 in) diameter aluminum test tank was outfitted with multilayer insulation, helium purge system, low-conductive tank supports, thermodynamic vent system, liquid acquisition device and immersed outflow pump. Tests and analysis performed on the start basket liquid acquisition device and studies of the liquid retention characteristics of fine mesh screens are discussed.

  12. DigiFract: A software and data model implementation for flexible acquisition and processing of fracture data from outcrops

    NASA Astrophysics Data System (ADS)

    Hardebol, N. J.; Bertotti, G.

    2013-04-01

This paper presents the development and use of our new DigiFract software designed for acquiring fracture data from outcrops more efficiently and more completely than done with other methods. Fracture surveys often aim at measuring spatial information (such as spacing) directly in the field. Instead, DigiFract focuses on collecting geometries and attributes and derives spatial information through subsequent analyses. Our primary development goal was to support field acquisition in a systematic digital format and optimized for a varied range of (spatial) analyses. DigiFract is developed using the programming interface of the Quantum Geographic Information System (GIS) with versatile functionality for spatial raster and vector data handling. Among other features, this includes spatial referencing of outcrop photos, and tools for digitizing geometries and assigning attribute information through a graphical user interface. While a GIS typically operates in map-view, DigiFract collects features on a surface of arbitrary orientation in 3D space. This surface is overlain with an outcrop photo and serves as reference frame for digitizing geologic features. Data is managed through a data model and stored in shapefiles or in a spatial database system. Fracture attributes, such as spacing or length, are intrinsic to the digitized geometry and become explicit through follow-up data processing. Orientation statistics, scan-line or scan-window analyses can be performed from the graphical user interface or can be obtained through flexible Python scripts that directly access the fractdatamodel and analysisLib core modules of DigiFract. This workflow has been applied in various studies and enabled a faster collection of larger and more accurate fracture datasets. The studies delivered a better characterization of fractured reservoir analogues in terms of fracture orientation and intensity distributions. Furthermore, the data organisation and analyses provided more
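A scan-line analysis of the kind described, deriving spacing and intensity from digitized fracture geometries rather than field measurement, can be sketched in plain NumPy. This is an illustration of the principle only; the function name and data layout are assumptions and do not reflect the actual DigiFract analysisLib API:

```python
import numpy as np

def scanline_stats(fractures, y0, x_min, x_max):
    """Spacing and P10 intensity of fracture traces crossing a horizontal scan-line.

    fractures: list of ((x1, y1), (x2, y2)) digitized trace segments.
    Returns (spacings, p10), with p10 in fractures per unit scan-line length.
    """
    xs = []
    for (x1, y1), (x2, y2) in fractures:
        # A segment crosses the line y = y0 if its endpoints straddle it.
        if (y1 - y0) * (y2 - y0) <= 0 and y1 != y2:
            t = (y0 - y1) / (y2 - y1)
            x = x1 + t * (x2 - x1)
            if x_min <= x <= x_max:
                xs.append(x)
    xs = np.sort(xs)
    spacings = np.diff(xs)          # distances between successive intersections
    p10 = len(xs) / (x_max - x_min) # linear fracture intensity
    return spacings, p10
```

Scan-window analyses extend the same idea to counts and spacings within a 2D window instead of along a line.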

  13. User's guide to noise data acquisition and analysis programs for HP9845: Nicolet analyzers

    NASA Technical Reports Server (NTRS)

    Mcgary, M. C.

    1982-01-01

A software interface package was written for use with a desktop computer and two models of single channel Fast Fourier analyzers. This software features a portable measurement and analysis system with several options. Two types of interface hardware can alternately be used in conjunction with the software: either an IEEE-488 Bus interface or a 16-bit parallel system. Two types of storage medium, either tape cartridge or floppy disc, can be used with the software. Five types of data may be stored, plotted, and/or printed. The data types include time histories, narrow band power spectra, and narrow band, one-third octave band, or octave band sound pressure levels. The data acquisition programming includes a front panel remote control option for the FFT analyzers. Data analysis options include choice of line type and pen color for plotting.
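The relation between two of the data types listed above, a narrow band power spectrum and octave band sound pressure levels, is a simple band summation. The sketch below shows the standard computation in modern Python; the function name and the assumption of a uniformly spaced pressure PSD are illustrative, not part of the HP9845 software:

```python
import numpy as np

def octave_band_spl(freqs, psd,
                    centers=(63, 125, 250, 500, 1000, 2000, 4000, 8000),
                    p_ref=20e-6):
    """Sum a narrow-band pressure PSD (Pa^2/Hz, uniform spacing) into
    octave-band SPLs in dB re 20 uPa."""
    df = freqs[1] - freqs[0]
    spl = {}
    for fc in centers:
        # Octave band edges: fc / sqrt(2) to fc * sqrt(2).
        lo, hi = fc / np.sqrt(2), fc * np.sqrt(2)
        band = (freqs >= lo) & (freqs < hi)
        p2 = psd[band].sum() * df            # mean-square pressure in the band
        spl[fc] = 10 * np.log10(p2 / p_ref**2) if p2 > 0 else -np.inf
    return spl
```

One-third octave bands follow the same pattern with edges at fc / 2^(1/6) and fc * 2^(1/6).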

  14. Resource Prospector Instrumentation for Lunar Volatiles Prospecting, Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Captain, J.; Elphic, R.; Colaprete, A.; Zacny, Kris; Paz, A.

    2016-01-01

Data gathered from lunar missions within the last two decades have significantly enhanced our understanding of the volatile resources available on the lunar surface, specifically focusing on the polar regions. Several orbiting missions such as Clementine and Lunar Prospector have suggested the presence of volatile ices and enhanced hydrogen concentrations in the permanently shadowed regions of the moon. The Lunar Crater Observation and Sensing Satellite (LCROSS) mission was the first to provide direct measurement of water ice in a permanently shadowed region. These missions with other orbiting assets have laid the groundwork for the next step in the exploration of the lunar surface: providing ground truth data of the volatiles by mapping the distribution and processing lunar regolith for resource extraction. This next step is the robotic mission Resource Prospector (RP). Resource Prospector is a lunar mission to investigate 'strategic knowledge gaps' (SKGs) for in-situ resource utilization (ISRU). The mission is proposed to land in the lunar south pole near a permanently shadowed crater. The landing site will be determined by the science team, with input from the broader international community, as being near traversable landscape that has a high potential of containing elevated concentrations of volatiles such as water while maximizing mission duration. A rover will host the Regolith & Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) payload for resource mapping and processing. The science instruments on the payload include a 1-meter drill, neutron spectrometer, a near infrared spectrometer, an operations camera, and a reactor with a gas chromatograph-mass spectrometer for volatile analysis. After the RP lander safely delivers the rover to the lunar surface, the science team will guide the rover team on the first traverse plan. The neutron spectrometer (NS) and near infrared (NIR) spectrometer instruments will be used as prospecting tools to guide

  15. Instrumenting the Intelligence Analysis Process

    SciTech Connect

    Hampson, Ernest; Cowley, Paula J.

    2005-05-02

    The Advanced Research and Development Activity initiated the Novel Intelligence from Massive Data (NIMD) program to develop advanced analytic technologies and methodologies. In order to support this objective, researchers and developers need to understand what analysts do and how they do it. In the past, this knowledge generally was acquired through subjective feedback from analysts. NIMD established the innovative Glass Box Analysis (GBA) Project to instrument a live intelligence mission and unobtrusively capture and objectively study the analysis process. Instrumenting the analysis process requires tailor-made software hooks that grab data from a myriad of disparate application operations and feed into a complex relational database and hierarchical file store to collect, store, retrieve, and distribute analytic data in a manner that maximizes researchers’ understanding. A key to success is determining the correct data to collect and aggregate low-level data into meaningful analytic events. This paper will examine how the GBA team solved some of these challenges, continues to address others, and supports a growing user community in establishing their own GBA environments and/or studying the data generated by GBA analysts working in the Glass Box.

  16. Development and application of a model for the analysis of trades between space launch system operations and acquisition costs

    NASA Astrophysics Data System (ADS)

    Nix, Michael B.

    2005-12-01

    Early design decisions in the development of space launch systems determine the costs to acquire and operate launch systems. Some sources indicate that as much as 90% of life cycle costs are fixed by the end of the critical design review phase. System characteristics determined by these early decisions are major factors in the acquisition cost of flight hardware elements and facilities and influence operations costs through the amount of maintenance and support labor required to sustain system function. Operations costs are also dependent on post-development management decisions regarding how much labor will be deployed to meet requirements of market demand and ownership profit. The ability to perform early trade-offs between these costs is vital to the development of systems that have the necessary capacity to provide service and are profitable to operate. An Excel-based prototype model was developed for making early analyses of trade-offs between the costs to operate a space launch system and to acquire the necessary assets to meet a given set of operational requirements. The model, integrating input from existing models and adding missing capability, allows the user to make such trade-offs across a range of operations concepts (required flight rates, staffing levels, shifts per workday, workdays per week and per year, unreliability, wearout and depot maintenance) and the number, type and capability of assets (flight hardware elements, processing and supporting facilities and infrastructure). The costs and capabilities of hypothetical launch systems can be modeled as a function of interrelated turnaround times and labor resource levels, and asset loss and retirement. The number of flight components and facilities required can be calculated and the operations and acquisition costs compared for a specified scenario. Findings, based on the analysis of a hypothetical two stage to orbit, reusable, unmanned launch system, indicate that the model is suitable for the
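The core trade described above, relating flight rate and turnaround time to the number of flight hardware elements required, and asset loss and retirement to replacement demand, reduces to simple throughput arithmetic. The sketch below illustrates that relationship under stated assumptions; it is not the Excel model from the study, and all parameter names and values are hypothetical:

```python
import math

def vehicles_required(flights_per_year, turnaround_days,
                      attrition_per_flight=0.0, service_life_flights=100):
    """Fleet size needed to sustain a flight rate given per-vehicle turnaround,
    plus annual replacement demand from losses and wear-out (all inputs hypothetical)."""
    # Vehicles simultaneously in the processing flow (Little's law: N = rate x time).
    in_flow = math.ceil(flights_per_year * turnaround_days / 365.0)
    # Airframes consumed per year by attrition and retirement at end of service life.
    replacements = flights_per_year * (attrition_per_flight + 1.0 / service_life_flights)
    return in_flow, replacements
```

Shortening turnaround (an operations-cost driver, via labor and shifts) directly reduces the fleet size (an acquisition-cost driver), which is the trade-off the model exists to quantify.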

  17. Hormonal Contraception and the Risk of HIV Acquisition: An Individual Participant Data Meta-analysis

    PubMed Central

    Morrison, Charles S.; Chen, Pai-Lien; Kwok, Cynthia; Baeten, Jared M.; Brown, Joelle; Crook, Angela M.; Van Damme, Lut; Delany-Moretlwe, Sinead; Francis, Suzanna C.; Friedland, Barbara A.; Hayes, Richard J.; Heffron, Renee; Kapiga, Saidi; Karim, Quarraisha Abdool; Karpoff, Stephanie; Kaul, Rupert; McClelland, R. Scott; McCormack, Sheena; McGrath, Nuala; Myer, Landon; Rees, Helen; van der Straten, Ariane; Watson-Jones, Deborah; van de Wijgert, Janneke H. H. M.; Stalter, Randy; Low, Nicola

    2015-01-01

    Background Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with women not using HC. Methods and Findings Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15–49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies, including 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24–1.83) for DMPA use, 1.24 (95% CI 0.84–1.82) for NET-EN use, and 1.03 (95% CI 0.88–1.20) for COC use. Between-study heterogeneity was mild (I2 < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23–1.67) and NET-EN use (aHR 1.32, 95% CI 1.08–1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99–1.50; for NET-EN use 0.67, 95% CI 0.47–0.96; and for COC use 0.91, 95% CI 0.73–1.41) compared to those at higher risk of bias (pinteraction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC–HIV relationship. Conclusions This IPD meta-analysis found no evidence that COC or NET-EN use increases women’s risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe
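The second stage of the two-stage random-effects meta-analysis described above pools study-level estimates; a standard way to do this is the DerSimonian-Laird estimator. The sketch below is a generic illustration that recovers standard errors from reported 95% CIs, not the analysis code or data of this study:

```python
import math

def dl_pool(hrs, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of hazard ratios with 95% CIs."""
    y = [math.log(h) for h in hrs]                      # per-study log hazard ratios
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96)    # SE recovered from the 95% CI
          for lo, hi in zip(ci_los, ci_his)]
    w = [1 / s**2 for s in se]                          # inverse-variance (fixed-effect) weights
    y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fe)**2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)             # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]              # random-effects weights
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return math.exp(y_re), (math.exp(y_re - 1.96 * se_re),
                            math.exp(y_re + 1.96 * se_re))
```

In the study itself the first stage fits covariate-adjusted Cox models within each study to obtain the aHRs that enter this pooling step.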

  18. Fast nearly ML estimation of Doppler frequency in GNSS signal acquisition process.

    PubMed

    Tang, Xinhua; Falletti, Emanuela; Lo Presti, Letizia

    2013-04-29

    It is known that signal acquisition in Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on the simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario where there is no noise and other nuisances, the frequency estimation error can be theoretically reduced to zero. On the other hand, in the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB) which is derived as benchmark. For comparison, a least-square (LS) method is proposed. It is shown that the proposed solution achieves the same performance of LS, but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both time and frequency domains.
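To make the grid-refinement idea concrete, the sketch below refines the Doppler estimate from the coarse peak search using generic three-point parabolic interpolation. Note this is an illustration of the problem setting only: the paper's algorithms are instead derived from the closed-form CAF expression, which is what lets them approach the CRLB:

```python
import numpy as np

def refine_doppler(f_grid, caf_mag, k=None):
    """Refine a grid-based Doppler estimate by fitting a parabola through the
    CAF magnitude at the peak bin and its two neighbours (generic interpolation,
    not the CAF-based algorithms of the paper)."""
    if k is None:
        k = int(np.argmax(caf_mag))
    k = min(max(k, 1), len(caf_mag) - 2)            # need both neighbours
    y0, y1, y2 = caf_mag[k - 1], caf_mag[k], caf_mag[k + 1]
    # Vertex offset of the interpolating parabola, in units of grid bins.
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    step = f_grid[1] - f_grid[0]
    return f_grid[k] + delta * step
```

For any method of this family, the residual frequency error and the computational burden both scale with the grid resolution, which is why the paper analyzes that resolution in both time and frequency domains.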

  20. Structural analysis of vibroacoustical processes

    NASA Technical Reports Server (NTRS)

    Gromov, A. P.; Myasnikov, L. L.; Myasnikova, Y. N.; Finagin, B. A.

    1973-01-01

The method of automatic identification of acoustical signals by means of segmentation was used to investigate noises and vibrations in machines and mechanisms for cybernetic diagnostics. The structural analysis consists of presenting a noise or vibroacoustical signal as a sequence of segments, determined by time quantization, in which each segment is characterized by specific spectral characteristics. The structural spectrum is plotted as a histogram of the segments, that is, as the probability density of appearance of a segment as a function of segment type. It is assumed that the conditions of ergodic processes are maintained.
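The segmentation-and-histogram procedure described can be sketched as follows: quantize the record into fixed-length segments, label each by its dominant spectral band, and histogram the labels. This is a modern illustration of the principle under assumed conventions (window, band splitting), not the original implementation:

```python
import numpy as np

def segment_spectrum_histogram(signal, seg_len, n_bands=8):
    """Split a vibroacoustic record into fixed-length segments, label each by
    its dominant spectral band, and return the normalized histogram of
    segment types (the 'structural spectrum')."""
    n_seg = len(signal) // seg_len
    labels = []
    for i in range(n_seg):
        seg = signal[i * seg_len:(i + 1) * seg_len]
        # Windowed power spectrum of the segment.
        spec = np.abs(np.fft.rfft(seg * np.hanning(seg_len))) ** 2
        # Drop DC, split the spectrum into n_bands equal bands,
        # and label the segment by the most energetic band.
        bands = np.array_split(spec[1:], n_bands)
        labels.append(int(np.argmax([b.sum() for b in bands])))
    hist = np.bincount(labels, minlength=n_bands).astype(float)
    return hist / hist.sum()   # probability density over segment types
```

Under the ergodicity assumption stated in the abstract, this histogram estimated from one long record characterizes the process itself, which is what makes it usable for machine diagnostics.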

  1. Radar data processing and analysis

    NASA Technical Reports Server (NTRS)

    Ausherman, D.; Larson, R.; Liskow, C.

    1976-01-01

    Digitized four-channel radar images corresponding to particular areas from the Phoenix and Huntington test sites were generated in conjunction with prior experiments performed to collect X- and L-band synthetic aperture radar imagery of these two areas. The methods for generating this imagery are documented. A secondary objective was the investigation of digital processing techniques for extraction of information from the multiband radar image data. Following the digitization, the remaining resources permitted a preliminary machine analysis to be performed on portions of the radar image data. The results, although necessarily limited, are reported.

  2. Metabolome analysis of Arabidopsis thaliana roots identifies a key metabolic pathway for iron acquisition.

    PubMed

    Schmidt, Holger; Günther, Carmen; Weber, Michael; Spörlein, Cornelia; Loscher, Sebastian; Böttcher, Christoph; Schobert, Rainer; Clemens, Stephan

    2014-01-01

    Fe deficiency compromises both human health and plant productivity. Thus, it is important to understand plant Fe acquisition strategies for the development of crop plants which are more Fe-efficient under Fe-limited conditions, such as alkaline soils, and have higher Fe density in their edible tissues. Root secretion of phenolic compounds has long been hypothesized to be a component of the reduction strategy of Fe acquisition in non-graminaceous plants. We therefore subjected roots of Arabidopsis thaliana plants grown under Fe-replete and Fe-deplete conditions to comprehensive metabolome analysis by gas chromatography-mass spectrometry and ultra-pressure liquid chromatography electrospray ionization quadrupole time-of-flight mass spectrometry. Scopoletin and other coumarins were found among the metabolites showing the strongest response to two different Fe-limited conditions, the cultivation in Fe-free medium and in medium with an alkaline pH. A coumarin biosynthesis mutant defective in ortho-hydroxylation of cinnamic acids was unable to grow on alkaline soil in the absence of Fe fertilization. Co-cultivation with wild-type plants partially rescued the Fe deficiency phenotype indicating a contribution of extracellular coumarins to Fe solubilization. Indeed, coumarins were detected in root exudates of wild-type plants. Direct infusion mass spectrometry as well as UV/vis spectroscopy indicated that coumarins are acting both as reductants of Fe(III) and as ligands of Fe(II). PMID:25058345

  3. Metabolome Analysis of Arabidopsis thaliana Roots Identifies a Key Metabolic Pathway for Iron Acquisition

    PubMed Central

    Schmidt, Holger; Günther, Carmen; Weber, Michael; Spörlein, Cornelia; Loscher, Sebastian; Böttcher, Christoph; Schobert, Rainer; Clemens, Stephan

    2014-01-01

    Fe deficiency compromises both human health and plant productivity. Thus, it is important to understand plant Fe acquisition strategies for the development of crop plants which are more Fe-efficient under Fe-limited conditions, such as alkaline soils, and have higher Fe density in their edible tissues. Root secretion of phenolic compounds has long been hypothesized to be a component of the reduction strategy of Fe acquisition in non-graminaceous plants. We therefore subjected roots of Arabidopsis thaliana plants grown under Fe-replete and Fe-deplete conditions to comprehensive metabolome analysis by gas chromatography-mass spectrometry and ultra-pressure liquid chromatography electrospray ionization quadrupole time-of-flight mass spectrometry. Scopoletin and other coumarins were found among the metabolites showing the strongest response to two different Fe-limited conditions, the cultivation in Fe-free medium and in medium with an alkaline pH. A coumarin biosynthesis mutant defective in ortho-hydroxylation of cinnamic acids was unable to grow on alkaline soil in the absence of Fe fertilization. Co-cultivation with wild-type plants partially rescued the Fe deficiency phenotype indicating a contribution of extracellular coumarins to Fe solubilization. Indeed, coumarins were detected in root exudates of wild-type plants. Direct infusion mass spectrometry as well as UV/vis spectroscopy indicated that coumarins are acting both as reductants of Fe(III) and as ligands of Fe(II). PMID:25058345

  4. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    NASA Astrophysics Data System (ADS)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post-conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  5. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. II. Analysis and applications

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We use an analytical theory of noisy Poisson processes, developed in the preceding companion publication, to compare coincidence and covariance measurement approaches in photoelectron and -ion spectroscopy. For non-unit detection efficiencies, coincidence data acquisition (DAQ) suffers from false coincidences. The rate of false coincidences grows quadratically with the rate of elementary ionization events. To minimize false coincidences for rare event outcomes, very low event rates may hence be required. Coincidence measurements exhibit high tolerance to noise introduced by unstable experimental conditions. Covariance DAQ on the other hand is free of systematic errors as long as stable experimental conditions are maintained. In the presence of noise, all channels in a covariance measurement become correlated. Under favourable conditions, covariance DAQ may allow orders of magnitude reduction in measurement times. Finally, we use experimental data for strong-field ionization of 1,3-butadiene to illustrate how fluctuations in experimental conditions can contaminate a covariance measurement, and how such contamination can be detected.
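The claim that false coincidences grow quadratically with the elementary event rate can be illustrated with a toy Monte Carlo (a sketch under assumed conditions: Poisson-distributed ionization events per shot and an arbitrary detection efficiency of 0.5; none of these figures come from the paper):

```python
import math
import random

def sample_poisson(mu, rng):
    # Knuth's method: multiply uniforms until the product drops below e^-mu
    limit = math.exp(-mu)
    k, prod = 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def false_coincidences(mu, eff, shots, rng):
    """Count shots that record exactly one electron and one ion
    originating from *different* ionization events (false pairs)."""
    false = 0
    for _ in range(shots):
        n = sample_poisson(mu, rng)  # elementary ionization events this shot
        electrons = [i for i in range(n) if rng.random() < eff]
        ions = [i for i in range(n) if rng.random() < eff]
        if len(electrons) == 1 and len(ions) == 1 and electrons[0] != ions[0]:
            false += 1
    return false

rng = random.Random(42)
shots = 200_000
f1 = false_coincidences(0.1, 0.5, shots, rng)
f2 = false_coincidences(0.2, 0.5, shots, rng)
# doubling the event rate roughly quadruples the false-coincidence count
print(f1, f2, f2 / f1)
```

Running this shows the ratio f2/f1 clustering near 4, the quadratic scaling the abstract describes, which is why rare-channel coincidence experiments must run at very low event rates.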

  6. The rate of acquisition of formal operational schemata in adolescence: A secondary analysis

    NASA Astrophysics Data System (ADS)

    Eckstein, Shulamith G.; Shemesh, Michal

A theoretical model of cognitive development is applied to the study of the acquisition of formal operational schemata by adolescents. The model predicts that the proportion of adolescents who have not yet acquired the ability to perform a specific Piagetian-like task is an exponentially decreasing function of age. The model has been used to analyze the data of two large-scale studies performed in the United States and in Israel. The functional dependence upon age was found to be the same in both countries for tasks which are used to assess the following formal operations: proportional reasoning, probabilistic reasoning, correlations, and combinatorial analysis. Different functional dependence was found for tasks assessing conservation, control of variables, and propositional logic. These results give support to the unity hypothesis of cognitive development, that is, the hypothesis that the various schemata of formal thought appear simultaneously.
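A minimal worked example of the model's central relation, exponential decay of the not-yet-competent proportion with age, using hypothetical proportions rather than data from either study:

```python
import math

# Hypothetical proportions of adolescents who have NOT yet acquired
# a given task: 60% at age 12 and 30% at age 14 (illustrative only).
p12, p14 = 0.60, 0.30

# Model: p(t) = p(t0) * exp(-lam * (t - t0)).
# Two observations determine the rate constant lam:
lam = math.log(p12 / p14) / (14 - 12)

# Extrapolate the same exponential to age 16:
p16 = p12 * math.exp(-lam * (16 - 12))
print(round(lam, 4), round(p16, 4))  # → 0.3466 0.15
```

With these numbers the proportion halves every two years, so the model predicts 15% still lacking the schema at age 16; fitting such curves per task is what lets the two studies compare acquisition rates across countries.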

  7. The use of an optical data acquisition system for bladed disk vibration analysis

    NASA Technical Reports Server (NTRS)

    Lawrence, C.; Meyn, E. H.

    1984-01-01

A new concept in instrumentation was developed by engineers at NASA Lewis Research Center to collect vibration data from multi-bladed rotors. This new concept, known as the optical data acquisition system, uses optical transducers to measure bladed tip deflections by reflection of light beams off the tips of the blades as they pass in front of the optical transducer. By using an array of transducers around the perimeter of the rotor, detailed vibration signals can be obtained. In this study, resonant frequencies and mode shapes were determined for a 56 bladed rotor using the optical system. Frequency data from the optical system was also compared to data obtained from strain gauge measurements and finite element analysis and was found to be in good agreement.

  8. The use of an optical data acquisition system for bladed disk vibration analysis

    NASA Technical Reports Server (NTRS)

    Lawrence, C.; Meyn, E. H.

    1985-01-01

    A new concept in instrumentation was developed by engineers at NASA Lewis Research Center to collect vibration data from multi-bladed rotors. This new concept, known as the optical data acquisition system, uses optical transducers to measure bladed tip deflections by reflection of light beams off the tips of the blades as they pass in front of the optical transducer. By using an array of transducers around the perimeter of the rotor, detailed vibration signals can be obtained. In this study, resonant frequencies and mode shapes were determined for a 56 bladed rotor using the optical system. Frequency data from the optical system was also compared to data obtained from strain gauge measurements and finite element analysis and was found to be in good agreement.
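The measurement principle above is what is now commonly called blade-tip timing: a vibrating blade arrives at the optical probe slightly early or late, and the tangential tip deflection follows from the arrival-time shift. A minimal sketch of that conversion, with an assumed rotor radius and speed (illustrative values only, not from the study):

```python
import math

# Assumed geometry and speed (hypothetical, for illustration)
radius_m = 0.25                        # rotor tip radius
rpm = 8000.0
omega = rpm * 2.0 * math.pi / 60.0     # shaft speed in rad/s

def tip_deflection(t_expected_s, t_measured_s):
    """Convert a blade's arrival-time shift at a stationary probe into a
    tangential tip deflection: d = R * omega * (t_measured - t_expected)."""
    return radius_m * omega * (t_measured_s - t_expected_s)

# A blade arriving 2 microseconds late corresponds to:
d = tip_deflection(0.0, 2e-6)
print(f"{d * 1e3:.3f} mm")  # → 0.419 mm
```

Repeating this per blade and per probe around the rotor perimeter yields the deflection-versus-revolution signals from which resonant frequencies and mode shapes are extracted.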

  9. Financial analysis of technology acquisition using fractionated lasers as a model.

    PubMed

    Jutkowitz, Eric; Carniol, Paul J; Carniol, Alan R

    2010-08-01

    Ablative fractional lasers are among the most advanced and costly devices on the market. Yet, there is a dearth of published literature on the cost and potential return on investment (ROI) of such devices. The objective of this study was to provide a methodological framework for physicians to evaluate ROI. To facilitate this analysis, we conducted a case study on the potential ROI of eight ablative fractional lasers. In the base case analysis, a 5-year lease and a 3-year lease were assumed as the purchase option with a $0 down payment and 3-month payment deferral. In addition to lease payments, service contracts, labor cost, and disposables were included in the total cost estimate. Revenue was estimated as price per procedure multiplied by total number of procedures in a year. Sensitivity analyses were performed to account for variability in model assumptions. Based on the assumptions of the model, all lasers had higher ROI under the 5-year lease agreement compared with that for the 3-year lease agreement. When comparing results between lasers, those with lower operating and purchase cost delivered a higher ROI. Sensitivity analysis indicates the model is most sensitive to purchase method. If physicians opt to purchase the device rather than lease, they can significantly enhance ROI. ROI analysis is an important tool for physicians who are considering making an expensive device acquisition. However, physicians should not rely solely on ROI and must also consider the clinical benefits of a laser.
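The revenue and cost structure the abstract describes can be sketched as a small ROI function; all dollar figures below are hypothetical placeholders, not values from the study:

```python
# Minimal annual-ROI sketch following the abstract's structure:
# revenue = price per procedure x procedures per year;
# cost = lease payments + service contract + labor + disposables.
def annual_roi(price_per_procedure, procedures_per_year,
               monthly_lease, annual_service, annual_labor,
               disposable_cost_per_procedure):
    revenue = price_per_procedure * procedures_per_year
    cost = (12 * monthly_lease + annual_service + annual_labor
            + disposable_cost_per_procedure * procedures_per_year)
    return (revenue - cost) / cost

# Hypothetical inputs: $1,000/procedure, 150 procedures/yr,
# $2,500/mo lease, $10k service, $20k labor, $50 disposables.
roi = annual_roi(1000, 150, 2500, 10_000, 20_000, 50)
print(f"{roi:.2%}")  # → 122.22%
```

Varying the lease term, payment deferral, or procedure volume in such a function is the kind of sensitivity analysis the authors perform; note how the fixed lease payment dominates, which is why purchase method moves ROI the most.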

  10. 77 FR 2682 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-19

    ... Wide Area WorkFlow to process vouchers. DATES: Comments on the proposed rule should be submitted in... updates DoD's internal voucher processing procedures and better accommodates the use of Wide Area...

  11. PRECLOSURE CRITICALITY ANALYSIS PROCESS REPORT

    SciTech Connect

    A.E. Danise

    2004-10-25

This report describes a process for performing preclosure criticality analyses for a repository at Yucca Mountain, Nevada. These analyses will be performed from the time of receipt of fissile material until permanent closure of the repository (preclosure period). The process describes how criticality safety analyses will be performed for various configurations of waste in or out of waste packages that could occur during preclosure as a result of normal operations or event sequences. The criticality safety analysis considers those event sequences resulting in unanticipated moderation, loss of neutron absorber, geometric changes, or administrative errors in waste form placement (loading) of the waste package. The report proposes a criticality analysis process for preclosure to allow a consistent transition from preclosure to postclosure, thereby possibly reducing potential cost increases and delays in licensing of Yucca Mountain. The proposed approach provides the advantage of using a parallel regulatory framework for evaluation of preclosure and postclosure performance and is consistent with the U.S. Nuclear Regulatory Commission's approach of supporting risk-informed, performance-based regulation for fuel cycle facilities, the "Yucca Mountain Review Plan, Final Report", and 10 CFR Part 63. The criticality-related criteria for ensuring subcriticality are also described, as well as the guidance documents that will be used. Preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the U.S. Nuclear Regulatory Commission; therefore, the design approach for preclosure criticality safety will be dictated by existing regulatory requirements while using a risk-informed approach with burnup credit for in-package operations.

  12. The Role of Unconscious Information Processing in the Acquisition and Learning of Instructional Messages

    ERIC Educational Resources Information Center

    Kuldas, Seffetullah; Bakar, Zainudin Abu; Ismail, Hairul Nizam

    2012-01-01

    This review investigates how the unconscious information processing can create satisfactory learning outcomes, and can be used to ameliorate the challenges of teaching students to regulate their learning processes. The search for the ideal model of human information processing as regards achievement of teaching and learning objectives is a…

  13. An Independent Workstation For CT Image Processing And Analysis

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Sewchand, Wilfred

    1988-06-01

    This manuscript describes an independent workstation which consists of a data acquisition and transfer system, a host computer, and a display and record system. The main tasks of the workstation include the collecting and managing of a vast amount of data, creating and processing 2-D and 3-D images, conducting quantitative data analysis, and recording and exchanging information. This workstation not only meets the requirements for routine clinical applications, but it is also used extensively for research purposes. It is stand-alone and works as a physician's workstation; moreover, it can be easily linked into a computer-network and serve as a component of PACS (Picture Archiving and Communication System).

  14. Lexical Processing and Organization in Bilingual First Language Acquisition: Guiding Future Research

    PubMed Central

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-01-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between two languages in the early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. PMID:26866430

  15. Learning a generative probabilistic grammar of experience: a process-level model of language acquisition.

    PubMed

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-03-01

We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach.
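As a loose illustration of the general idea of a grammar as a directed weighted graph over sequence elements, here is a toy bigram sketch (emphatically not the authors' model, which learns recursive hierarchical patterns rather than flat transitions):

```python
import random
from collections import defaultdict

# Tiny illustrative corpus; the graph's edge weights are transition counts.
corpus = ["the cat sat on the mat", "the dog sat on the rug"]

graph = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for a, b in zip(tokens, tokens[1:]):
        graph[a][b] += 1          # directed weighted edge a -> b

def generate(rng, max_len=10):
    """Generate a new sequence by a weighted random walk over the graph."""
    node, out = "<s>", []
    while node != "</s>" and len(out) < max_len:
        words, weights = zip(*graph[node].items())
        node = rng.choices(words, weights=weights)[0]
        if node != "</s>":
            out.append(node)
    return " ".join(out)

print(generate(random.Random(7)))
```

Even this flat version shows the parse/generate duality the abstract mentions: the same weighted graph that scores observed sequences can be walked to produce new ones.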

  16. Lexical processing and organization in bilingual first language acquisition: Guiding future research.

    PubMed

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-06-01

A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between 2 languages in early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory.

  17. Acquisition process of typing skill using hierarchical materials in the Japanese language.

    PubMed

    Ashitaka, Yuki; Shimada, Hiroyuki

    2014-08-01

In the present study, using a new keyboard layout with only eight keys, we conducted typing training for unskilled typists. In this task, Japanese college students received training in typing words consisting of a pair of hiragana characters with four keystrokes, using the alphabetic input method, while keeping the association between the keys and typists' finger movements; the task was constructed so that chunking was readily available. We manipulated the association between the hiragana characters and alphabet letters (hierarchical materials: overlapped and nonoverlapped mappings). Our alphabet letter materials corresponded to the regular order within each hiragana word (within the four letters, the first and third referred to consonants, and the second and fourth referred to vowels). Only the interkeystroke intervals involved in the initiation of typing vowel letters showed an overlapping effect, which revealed that the effect was markedly large only during the early period of skill development (the effect for the overlapped mapping being larger than that for the nonoverlapped mapping), but that it had diminished by the time of late training. Conversely, the response time and the third interkeystroke interval, which are both involved in the latency of typing a consonant letter, did not reveal an overlapping effect, suggesting that chunking might be useful with hiragana characters rather than hiragana words. These results are discussed in terms of the fan effect and skill acquisition. Furthermore, we discuss whether there is a need for further research on unskilled and skilled Japanese typists.

  18. Individual and social learning processes involved in the acquisition and generalization of tool use in macaques

    PubMed Central

    Macellini, S.; Maranesi, M.; Bonini, L.; Simone, L.; Rozzi, S.; Ferrari, P. F.; Fogassi, L.

    2012-01-01

    Macaques can efficiently use several tools, but their capacity to discriminate the relevant physical features of a tool and the social factors contributing to their acquisition are still poorly explored. In a series of studies, we investigated macaques' ability to generalize the use of a stick as a tool to new objects having different physical features (study 1), or to new contexts, requiring them to adapt the previously learned motor strategy (study 2). We then assessed whether the observation of a skilled model might facilitate tool-use learning by naive observer monkeys (study 3). Results of study 1 and study 2 showed that monkeys trained to use a tool generalize this ability to tools of different shape and length, and learn to adapt their motor strategy to a new task. Study 3 demonstrated that observing a skilled model increases the observers' manipulations of a stick, thus facilitating the individual discovery of the relevant properties of this object as a tool. These findings support the view that in macaques, the motor system can be modified through tool use and that it has a limited capacity to adjust the learnt motor skills to a new context. Social factors, although important to facilitate the interaction with tools, are not crucial for tool-use learning. PMID:22106424

  19. Circumcision Status and Risk of HIV Acquisition during Heterosexual Intercourse for Both Males and Females: A Meta-Analysis

    PubMed Central

    WEI, Qiang; YANG, Lu; SONG, Tu run; YUAN, Hai chao; LV, Xiao; HAN, Ping

    2015-01-01

In this study, we evaluated if male circumcision was associated with lower HIV acquisition for HIV(−) males and HIV(−) females during normal sexual behavior. We performed a systematic literature search of PubMed, EMBASE, and Cochrane Central Register of Controlled Trials (CENTRAL) databases to identify studies that compared HIV acquisition for the circumcised and uncircumcised groups. The reference lists of the included and excluded studies were also screened. Fifteen studies (4 RCTs and 11 prospective cohort studies) were included, and the related data were extracted and analyzed in a meta-analysis. Our study revealed strong evidence that male circumcision was associated with reduced HIV acquisition for HIV(−) males during sexual intercourse with females [pooled adjusted risk ratio (RR): 0.30, 95% CI 0.24–0.38, P < 0.00001] and provided a 70% protective effect. In contrast, no difference was detected in HIV acquisition for HIV(−) females between the circumcised and uncircumcised groups (pooled adjusted RR after sensitivity analysis: 0.68, 95% CI 0.40–1.15, P = 0.15). In conclusion, male circumcision could significantly protect males but not females from HIV acquisition at the population level. Male circumcision may serve as an additional approach toward HIV control, in conjunction with other strategies such as HIV counseling and testing, condom promotion, and so on. PMID:25942703
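Pooled risk ratios like those above are conventionally obtained by inverse-variance weighting on the log-RR scale; here is a generic fixed-effect sketch with two hypothetical studies (the numbers are illustrative, not data from this meta-analysis):

```python
import math

# (risk ratio, standard error of log RR) for two hypothetical studies
studies = [(0.30, 0.10), (0.35, 0.15)]

num = den = 0.0
for rr, se in studies:
    w = 1.0 / se**2          # inverse-variance weight
    num += w * math.log(rr)  # pool on the log scale
    den += w
pooled_rr = math.exp(num / den)

# An RR below 1 implies protection; e.g. RR 0.30 => 70% protective effect.
protection = 1.0 - pooled_rr
print(round(pooled_rr, 3), f"{protection:.0%}")
```

This is the fixed-effect variant; published meta-analyses of this kind often use a random-effects model instead when between-study heterogeneity is present, but the weighting idea is the same.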

  20. Transcriptomic analysis highlights reciprocal interactions of urea and nitrate for nitrogen acquisition by maize roots.

    PubMed

    Zanin, Laura; Zamboni, Anita; Monte, Rossella; Tomasi, Nicola; Varanini, Zeno; Cesco, Stefano; Pinton, Roberto

    2015-03-01

Even though urea and nitrate are the two major nitrogen (N) forms applied as fertilizers in agriculture and occur concomitantly in soils, the reciprocal influence of these two N sources on the mechanisms of their acquisition is poorly understood. Therefore, molecular and physiological aspects of urea and nitrate uptake were investigated in maize (Zea mays), a crop plant consuming high amounts of N. In roots, urea uptake was stimulated by the presence of urea in the external solution, indicating the presence of an inducible transport system. On the other hand, the presence of nitrate depressed the induction of urea uptake and, at the same time, the induction of nitrate uptake was depressed by the presence of urea. The expression of about 60,000 transcripts of maize in roots was monitored by microarray analyses and the transcriptional patterns of those genes involved in nitrogen acquisition were analyzed by real-time reverse transcription-PCR (RT-PCR). In comparison with the treatment without added N, the exposure of maize roots to urea modulated the expression of only very few genes, such as asparagine synthase. On the other hand, the concomitant presence of urea and nitrate enhanced the overexpression of genes involved in nitrate transport (NRT2) and assimilation (nitrate and nitrite reductase, glutamine synthetase 2), and a specific response of 41 transcripts was determined, including glutamine synthetase 1-5, glutamine oxoglutarate aminotransferase, shikimate kinase and arogenate dehydrogenase. Also based on the real-time RT-PCR analysis, the transcriptional modulation induced by both sources might determine an increase in N metabolism, promoting a more efficient assimilation of the N that is taken up. PMID:25524070

  1. Transcriptomic analysis highlights reciprocal interactions of urea and nitrate for nitrogen acquisition by maize roots.

    PubMed

    Zanin, Laura; Zamboni, Anita; Monte, Rossella; Tomasi, Nicola; Varanini, Zeno; Cesco, Stefano; Pinton, Roberto

    2015-03-01

Even though urea and nitrate are the two major nitrogen (N) forms applied as fertilizers in agriculture and occur concomitantly in soils, the reciprocal influence of these two N sources on the mechanisms of their acquisition is poorly understood. Therefore, molecular and physiological aspects of urea and nitrate uptake were investigated in maize (Zea mays), a crop plant consuming high amounts of N. In roots, urea uptake was stimulated by the presence of urea in the external solution, indicating the presence of an inducible transport system. On the other hand, the presence of nitrate depressed the induction of urea uptake and, at the same time, the induction of nitrate uptake was depressed by the presence of urea. The expression of about 60,000 transcripts of maize in roots was monitored by microarray analyses and the transcriptional patterns of those genes involved in nitrogen acquisition were analyzed by real-time reverse transcription-PCR (RT-PCR). In comparison with the treatment without added N, the exposure of maize roots to urea modulated the expression of only very few genes, such as asparagine synthase. On the other hand, the concomitant presence of urea and nitrate enhanced the overexpression of genes involved in nitrate transport (NRT2) and assimilation (nitrate and nitrite reductase, glutamine synthetase 2), and a specific response of 41 transcripts was determined, including glutamine synthetase 1-5, glutamine oxoglutarate aminotransferase, shikimate kinase and arogenate dehydrogenase. Also based on the real-time RT-PCR analysis, the transcriptional modulation induced by both sources might determine an increase in N metabolism, promoting a more efficient assimilation of the N that is taken up.

  2. Systematic derivatization, mass fragmentation and acquisition studies in the analysis of chlorophenols, as their silyl derivatives by gas chromatography-mass spectrometry.

    PubMed

    Faludi, T; Andrási, N; Vasanits-Zsigrai, A; Záray, Gy; Molnár-Perl, I

    2013-08-01

An exhaustive GC-MS sample preparation, derivatization, mass fragmentation and acquisition study was performed for the simultaneous analysis of chlorophenols (CPs). Selected species were 2-CP, 3-CP, 4-CP, 3,5-dichlorophenol (diCP), 2,5-diCP, 2,6-diCP, 2,4-diCP, 2,3-diCP, 3,4-diCP, 2,4,6-trichlorophenol (triCP), 2,4,5-triCP, 2,3,4-triCP, 2,3,4,6-tetrachlorophenol (tetraCP) and pentachlorophenol (pentaCP), in total 14 compounds. As novelties in the field, systematic derivatization, mass fragmentation and acquisition methods were optimized for the trimethylsilyl (TMS) ether derivatives of CPs. The reactivity of chlorophenols with silylating agents had not previously been systematically analyzed; here, we studied the reactivity of 14 chlorophenols with five silylating reagents. Three acquisition techniques, the full scan (FS), the multiple ion monitoring (MIM), and the currently optimized multiple reaction monitoring (MRM) methods, were compared. We developed a new analytical approach, simultaneously monitoring the fragmentation pattern of the (35)Cl and the (37)Cl containing fragment ions both as precursor and as product ions. This principle resulted in remarkable specificity and sensitivity of detection and quantification, particularly in the cases of the tetraCP and pentaCP derivatives containing the (35)Cl and the (37)Cl fragment ions at an approximate ratio of <1:1. Detailed documentation of the loss of HCl via fragmentation processes, without decomposition of the benzene ring, was attributed to the "ring-walk" mechanism described first for monochlorophenol. Critical evaluation of the derivatization and acquisition protocols was collated and validated with the same characteristics. Data of six-point calibration along with the corresponding relative standard deviation percentage (RSD%) values, in the line of FS, MIM and MRM methods (r(2): 0.9987, 0.9992, 0.9989; RSD%: 8.7, 5.6, 8.1), proved to be independent of the acquisition process.
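The (35)Cl/(37)Cl pattern exploited here follows a binomial distribution over the chlorine atoms, which is why the M:M+2 ratio falls below 1:1 once an ion retains four or more chlorines. A short sketch using the standard natural isotopic abundances:

```python
from math import comb

P35, P37 = 0.7577, 0.2423  # natural abundances of 35Cl and 37Cl

def cl_pattern(n_cl):
    """Relative intensities of the M, M+2, M+4, ... isotopologue peaks for
    an ion containing n_cl chlorines (binomial over 37Cl substitutions)."""
    return [comb(n_cl, k) * P35 ** (n_cl - k) * P37 ** k
            for k in range(n_cl + 1)]

for n in (1, 3, 4, 5):
    m, m2 = cl_pattern(n)[:2]
    print(n, round(m / m2, 2))  # M : M+2 intensity ratio
```

For one chlorine the M:M+2 ratio is about 3.1:1, but for tetraCP- and pentaCP-derived fragments it drops to roughly 0.78:1 and 0.63:1, consistent with the "<1:1" ratio the abstract highlights for those derivatives.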

  3. Developmental Trends in Auditory Processing Can Provide Early Predictions of Language Acquisition in Young Infants

    ERIC Educational Resources Information Center

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R.; Shao, Jie; Lozoff, Betsy

    2013-01-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with…

  4. A real-time satellite data acquisition, analysis and display system - A practical application of the GOES network

    NASA Technical Reports Server (NTRS)

    Sutherland, R. A.; Langford, J. L.; Bartholic, J. F.; Bill, R. G., Jr.

    1979-01-01

    A real-time satellite data acquisition, analysis and display system is described which uses analog data transmitted by telephone line over the GOES network. Results are displayed on the system color video monitor as 'thermal' images which originated from infrared surface radiation sensed by the Geostationary Operational Environmental Satellite (GOES).

  5. 77 FR 26009 - CoStar Group, Inc., Lonestar Acquisition Sub, Inc., and LoopNet, Inc.; Analysis of Agreement...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-02

...Star Group, Inc., Lonestar Acquisition Sub, Inc., and LoopNet, Inc.; Analysis of Agreement Containing... "CoStar LoopNet, File No. 111 0172" on your comment, and file your comment online at https... April 16, 2012. Write "CoStar LoopNet, File No. 111 0172" on your comment. Your comment...

  6. Acquisition of spatial knowledge in different urban areas: evidence from a survey analysis of adolescents.

    PubMed

    Burgmanis, Ģirts; Krišjāne, Zaiga; Šķilters, Jurģis

    2014-08-01

We herein explore the perception of the geographic environment and analyse the mechanisms that constrain the cognitive processing of spatial information in general. Our guiding theoretical background assumption is that the structure of the spatial environment is a cognitively robust and mutually constrained threefold system relating (1) cognitive topology (comprised of a path and place structure of spatial information and constrained by reference frame-based factors), (2) experience-based functional knowledge (including the effects of socio-economic factors, frequency and familiarity) and (3) linguistic representations (primarily encoded in the prepositional system of a natural language). Here, we focus on (2), i.e. the effect of functional knowledge on the process of acquiring spatial knowledge. We empirically tested adolescents aged 12–17 years to explore the interaction between frequency, familiarity and functional knowledge from a developmental point of view. The social factors we explore are precisely defined and parameterized in our results (exposure to a particular urban area, place of residence, gender, age and factors relating to the environmental and social quality of the local area). Our research shows that, from a developmental perspective on spatial knowledge acquisition, there are divergences between the so-called objective topology and the cognitive topology of the urban environment, and that these are significantly constrained by the intensity of interactions with the environment, the number of functionally significant places within a particular area, and age.

  7. Optimal Diphthongs: An OT Analysis of the Acquisition of Spanish Diphthongs

    ERIC Educational Resources Information Center

    Krause, Alice

    2013-01-01

    This dissertation investigates the acquisition of Spanish diphthongs by adult native speakers of English. The following research questions will be addressed: 1) How do adult native speakers of English pronounce sequences of two vowels in their L2 Spanish at different levels of acquisition? 2) Can OT learnability models, specifically the GLA,…

  8. Learned Attention in Adult Language Acquisition: A Replication and Generalization Study and Meta-Analysis

    ERIC Educational Resources Information Center

    Ellis, Nick C.; Sagarra, Nuria

    2011-01-01

    This study investigates associative learning explanations of the limited attainment of adult compared to child language acquisition in terms of learned attention to cues. It replicates and extends Ellis and Sagarra (2010) in demonstrating short- and long-term learned attention in the acquisition of temporal reference in Latin. In Experiment 1,…

  9. Library Catalog Log Analysis in E-Book Patron-Driven Acquisitions (PDA): A Case Study

    ERIC Educational Resources Information Center

    Urbano, Cristóbal; Zhang, Yin; Downey, Kay; Klingler, Thomas

    2015-01-01

    Patron-Driven Acquisitions (PDA) is a new model used for e-book acquisition by academic libraries. A key component of this model is to make records of ebooks available in a library catalog and let actual patron usage decide whether or not an item is purchased. However, there has been a lack of research examining the role of the library catalog as…

  10. FTMP data acquisition environment

    NASA Technical Reports Server (NTRS)

    Padilla, Peter A.

    1988-01-01

    The Fault-Tolerant Multi-Processing (FTMP) test-bed data acquisition environment is described. The performance of two data acquisition devices available in the test environment is estimated and compared. These estimated data rates are used as measures of the devices' capabilities. A new data acquisition device was developed and added to the FTMP environment. This path increases the available data rate by approximately a factor of 8, to 379 KW/s, while simplifying the experiment development process.

  11. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  12. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  13. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  14. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  15. 48 CFR 1318.270 - Emergency acquisition flexibilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... flexibilities. 1318.270 Section 1318.270 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CONTRACTING METHODS AND CONTRACT TYPES EMERGENCY ACQUISITIONS Emergency Acquisition Flexibilities 1318.270 Emergency acquisition flexibilities. (a) Authorizing emergency acquisition flexibilities. The process...

  16. A versatile and low-cost 3D acquisition and processing pipeline for collecting mass of archaeological findings on the field

    NASA Astrophysics Data System (ADS)

    Gattet, E.; Devogelaere, J.; Raffin, R.; Bergerot, L.; Daniel, M.; Jockey, Ph.; De Luca, L.

    2015-02-01

    In recent years, advances in the fields of photogrammetry and computer vision have produced several solutions for generating 3D reconstructions starting from simple images. Even if the potential of the image-based 3D reconstruction approach is nowadays very well known in terms of reliability, accuracy and flexibility, there is still a lack of low-cost, open-source and automated solutions for collecting masses of archaeological findings, especially if one considers the real (and not merely theoretical) contextual aspects of a digitization campaign in the field (number of objects to acquire, available time, lighting conditions, equipment transport, budget, etc.) as well as the accuracy requirements for in-depth shape analysis and classification. In this paper we present a prototype system (integrating hardware and software) for the 3D acquisition, geometric reconstruction, documentation and archiving of large collections of archaeological findings. All aspects of our approach are based on high-end image-based modeling techniques and designed on the basis of an accurate analysis of the typical field conditions of an archaeological campaign, as well as of the specific requirements of archaeological finding documentation and analysis. This paper presents all the aspects integrated into the prototype: - a hardware development of a transportable photobooth for automated image acquisition, consisting of a turntable and three DSLRs controlled by a microcontroller; - an automatic image-processing pipeline (based on Apero/MicMac) including mask generation, tie-point extraction, bundle adjustment, multi-view stereo correlation, point-cloud generation and surface reconstruction; - a versatile (off-line/on-line) portable database for associating descriptive attributes (archaeological descriptions) to the 3D digitizations on site; - a platform for gathering, archiving and sharing collections of 3D digitizations on the Web. The presentation and the assessment of this

  17. Skill acquisition with text-entry interfaces: particularly older users benefit from minimized information-processing demands.

    PubMed

    Jahn, Georg; Krems, Josef F

    2013-08-01

    Operating information technology challenges older users if it requires executive control, which generally declines with age. Especially for novel and occasional tasks, cognitive demands can be high. We demonstrate how interface design can reduce cognitive demands by studying skill acquisition with the destination entry interfaces of two customary route guidance systems. Young, middle-aged, and older adults performed manual destination entry either with a system operated with multiple buttons in a dialogue encompassing spelling and list selection, or with a system operated by a single rotary encoder, in which an intelligent speller constrained destination entry to a single line of action. Each participant performed 100 training trials. A retention test after at least 10 weeks encompassed 20 trials. The same task was performed faster and more accurately, and produced much smaller age-related performance differences, especially at the beginning of training, if the interface design reduced demands on executive control, perceptual processing, and motor control. PMID:25474764

  18. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  19. Hardware acceleration of lucky-region fusion (LRF) algorithm for image acquisition and processing

    NASA Astrophysics Data System (ADS)

    Maignan, William; Koeplinger, David; Carhart, Gary W.; Aubailly, Mathieu; Kiamilev, Fouad; Liu, J. Jiang

    2013-05-01

    "Lucky-region fusion" (LRF) is an image processing technique that has proven successful in enhancing the quality of images distorted by atmospheric turbulence. The LRF algorithm extracts sharp regions of an image obtained from a series of short exposure frames, and "fuses" them into a final image with improved quality. In previous research, the LRF algorithm had been implemented on a PC using a compiled programming language. However, the PC usually does not have sufficient processing power to handle real-time extraction, processing and reduction required when the LRF algorithm is applied not to single picture images but rather to real-time video from fast, high-resolution image sensors. This paper describes a hardware implementation of the LRF algorithm on a Virtex 6 field programmable gate array (FPGA) to achieve real-time video processing. The novelty in our approach is the creation of a "black box" LRF video processing system with a standard camera link input, a user controller interface, and a standard camera link output.
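The tile-based selection at the heart of LRF can be illustrated in a few lines. This is a minimal NumPy sketch, not the paper's FPGA pipeline: the tile size, the sharpness metric (variance of a discrete Laplacian), and the hard tile boundaries are simplifying assumptions of this illustration.

```python
import numpy as np

def sharpness(tile):
    """Local sharpness metric: variance of a discrete Laplacian."""
    lap = (-4 * tile[1:-1, 1:-1]
           + tile[:-2, 1:-1] + tile[2:, 1:-1]
           + tile[1:-1, :-2] + tile[1:-1, 2:])
    return lap.var()

def lucky_region_fusion(frames, tile=8):
    """Fuse a stack of equally sized short-exposure frames by picking,
    for each tile position, the tile from the frame with the highest
    local sharpness. Frame dimensions are assumed to be multiples of
    the tile size."""
    h, w = frames[0].shape
    fused = np.zeros((h, w))
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            candidates = [f[y:y+tile, x:x+tile] for f in frames]
            fused[y:y+tile, x:x+tile] = max(candidates, key=sharpness)
    return fused
```

A real-time implementation would additionally blend tile borders and stream tiles through the FPGA rather than buffer whole frames, but the selection logic is the same.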

  20. Thermal infrared pushbroom imagery acquisition and processing [of NASA's Advanced Land Observing System]

    NASA Technical Reports Server (NTRS)

    Brown, T. J.; Corbett, F. J.; Spera, T. J.; Andrada, T.

    1982-01-01

    A 9-element focal-plane detector array and its signal-processing electronics were developed and delivered in December 1977, and integrated into a thermal infrared imaging system using LSI microprocessor image processing and a CRT display. After three years of laboratory operation, the focal plane has demonstrated high reliability and performance. On the basis of the 9-channel breadboard, the 90-element Aircraft Pushbroom IR/CCD Focal Plane Development Program was funded in October 1977. A follow-on program was awarded in July 1979 for the construction of a field test instrument and image processing facility. The objective of this project was to demonstrate thermal infrared pushbroom hard-copy imagery. It is pointed out that the successful development of the 9-element and 90-element thermal infrared hybrid imaging systems using photoconductive (Hg,Cd)Te has verified the operational concept of 8 to 14 micrometer pushbroom scanners.

  1. Fast multi-dimensional NMR acquisition and processing using the sparse FFT.

    PubMed

    Hassanieh, Haitham; Mayzel, Maxim; Shi, Lixin; Katabi, Dina; Orekhov, Vladislav Yu

    2015-09-01

    Increasing the dimensionality of NMR experiments strongly enhances the spectral resolution and provides invaluable direct information about atomic interactions. However, the price tag is high: long measurement times and heavy requirements on the computation power and data storage. We introduce sparse fast Fourier transform as a new method of NMR signal collection and processing, which is capable of reconstructing high quality spectra of large size and dimensionality with short measurement times, faster computations than the fast Fourier transform, and minimal storage for processing and handling of sparse spectra. The new algorithm is described and demonstrated for a 4D BEST-HNCOCA spectrum. PMID:26123316
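The premise the method exploits can be illustrated with a plain FFT: a spectrum containing only k significant peaks needs only k (index, value) pairs for storage and handling. The sketch below shows only this sparsity premise, not the sub-linear sFFT bucketization algorithm itself; the signal and threshold are invented for illustration.

```python
import numpy as np

# A time-domain signal whose spectrum contains only two peaks.
n = 1024
t = np.arange(n)
signal = (np.exp(2j * np.pi * 50 * t / n)
          + 0.5 * np.exp(2j * np.pi * 200 * t / n))

spectrum = np.fft.fft(signal)

# Keep only coefficients above a threshold: for a k-sparse spectrum,
# storage drops from n complex values to k (index, value) pairs.
threshold = 0.1 * np.abs(spectrum).max()
support = np.flatnonzero(np.abs(spectrum) > threshold)
sparse = {i: spectrum[i] for i in support}
```

The sFFT achieves the same end without ever computing the dense spectrum, which is what makes the computation faster than the FFT for large, sparse multi-dimensional spectra.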

  2. Data acquisition and processing history for the Explorer 33 (AIMP-D) satellite

    NASA Technical Reports Server (NTRS)

    Karras, T. J.

    1972-01-01

    The quality control monitoring system, using accounting and quality control data bases, made it possible to perform an in-depth analysis. Results show that the percentage of usable data files for experimenter analysis was 97.7%; only 0.4% of the data sequences supplied to the experimenters exhibited missing data. The 50th-percentile probability delay values (referenced to station record data) indicate that the analog tapes arrived within 11 days, the data were digitized within 4.2 weeks, and the experimenter tapes were delivered in 8.95 weeks or less.

  3. The effect of age of acquisition, socioeducational status, and proficiency on the neural processing of second language speech sounds.

    PubMed

    Archila-Suerte, Pilar; Zevin, Jason; Hernandez, Arturo E

    2015-02-01

    This study investigates the role of age of acquisition (AoA), socioeducational status (SES), and second language (L2) proficiency on the neural processing of L2 speech sounds. In a task of pre-attentive listening and passive viewing, Spanish-English bilinguals and a control group of English monolinguals listened to English syllables while watching a film of natural scenery. Eight regions of interest were selected from brain areas involved in speech perception and executive processes. The regions of interest were examined in two separate two-way ANOVAs (AoA×SES; AoA×L2 proficiency). The results showed that AoA was the main variable affecting the neural response in L2 speech processing. Direct comparisons between AoA groups of equivalent SES and proficiency level enhanced the intensity and magnitude of the results. These results suggest that AoA, more than SES and proficiency level, determines which brain regions are recruited for the processing of second language speech sounds. PMID:25528287

  4. The acceleration of spoken-word processing in children's native-language acquisition: an ERP cohort study.

    PubMed

    Ojima, Shiro; Matsuba-Kurita, Hiroko; Nakamura, Naoko; Hagiwara, Hiroko

    2011-04-01

    Healthy adults can identify spoken words at a remarkable speed, by incrementally analyzing word-onset information. It is currently unknown how this adult-level speed of spoken-word processing emerges during children's native-language acquisition. In a picture-word mismatch paradigm, we manipulated the semantic congruency between picture contexts and spoken words, and recorded event-related potential (ERP) responses to the words. Previous similar studies focused on the N400 response, but we focused instead on the onsets of semantic congruency effects (N200 or Phonological Mismatch Negativity), which contain critical information for incremental spoken-word processing. We analyzed ERPs obtained longitudinally from two age cohorts of 40 primary-school children (total n=80) in a 3-year period. Children first tested at 7 years of age showed earlier onsets of congruency effects (by approximately 70 ms) when tested 2 years later (i.e., at age 9). Children first tested at 9 years of age did not show such shortening of onset latencies 2 years later (i.e., at age 11). Overall, children's onset latencies at age 9 appeared similar to those of adults. These data challenge the previous hypothesis that word processing is well established at age 7. Instead they support the view that the acceleration of spoken-word processing continues beyond age 7.

  6. IWTU Process Sample Analysis Report

    SciTech Connect

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June–August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process. None of these samples were radioactive. The samples were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  7. Skills Acquisition in Plantain Flour Processing Enterprises: A Validation of Training Modules for Senior Secondary Schools

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi; Nlebem, Bernard S.

    2013-01-01

    This study was to validate training modules that can help provide requisite skills for Senior Secondary school students in plantain flour processing enterprises for self-employment and to enable them pass their examination. The study covered Rivers State. Purposive sampling technique was used to select a sample size of 205. Two sets of structured…

  8. The Effectiveness of Processing Instruction in L2 Grammar Acquisition: A Narrative Review

    ERIC Educational Resources Information Center

    Dekeyser, Robert; Botana, Goretti Prieto

    2015-01-01

    The past two decades have seen ample debate about processing instruction (PI) and its various components. In this article, we first describe what PI consists of and then address three questions: about the role of explicit information (EI) in PI, the difference between PI and teaching that incorporates production-based (PB) practice, and various…

  9. Using Processing Instruction for the Acquisition of English Present Perfect of Filipinos

    ERIC Educational Resources Information Center

    Erfe, Jonathan P.; Lintao, Rachelle B.

    2012-01-01

    This is an experimental study on the relative effects of VanPatten's Processing Instruction (PI) (1996, 2002), a "psycholinguistically motivated" intervention in teaching second-language (L2) grammar, on young-adult Filipino learners of English. A growing body of research on this methodological alternative, which establishes…

  10. The RFP Process: Effective Management of the Acquisition of Library Materials.

    ERIC Educational Resources Information Center

    Wilkinson, Frances C.; Thorson, Connie Capers

    Many librarians view procurement, with its myriad forms, procedures, and other organizational requirements, as a tedious or daunting challenge. This book simplifies the process, showing librarians how to successfully prepare a Request for Proposal (RFP) and make informed decisions when determining which vendors to use for purchasing library…

  11. Using Eye-Tracking to Investigate Topics in L2 Acquisition and L2 Processing

    ERIC Educational Resources Information Center

    Roberts, Leah; Siyanova-Chanturia, Anna

    2013-01-01

    Second language (L2) researchers are becoming more interested in both L2 learners' knowledge of the target language and how that knowledge is put to use during real-time language processing. Researchers are therefore beginning to see the importance of combining traditional L2 research methods with those that capture the moment-by-moment…

  12. Production and Processing Asymmetries in the Acquisition of Tense Morphology by Sequential Bilingual Children

    ERIC Educational Resources Information Center

    Chondrogianni, Vasiliki; Marinis, Theodoros

    2012-01-01

    This study investigates the production and online processing of English tense morphemes by sequential bilingual (L2) Turkish-speaking children with more than three years of exposure to English. Thirty-nine six- to nine-year-old L2 children and twenty-eight typically developing age-matched monolingual (L1) children were administered the production…

  13. Analyzing Preschoolers' Overgeneralizations of Object Labeling in the Process of Mother-Tongue Acquisition in Turkey

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2006-01-01

    Language, as is known, is acquired under certain conditions: rapid and sequential brain maturation and cognitive development, the need to exchange information and to control others' actions, and an exposure to appropriate speech input. This research aims at analyzing preschoolers' overgeneralizations of the object labeling process in different…

  14. The Effect of Processing Instruction and Dictogloss Tasks on Acquisition of the English Passive Voice

    ERIC Educational Resources Information Center

    Qin, Jingjing

    2008-01-01

    This study was intended to compare processing instruction (VanPatten, 1993, 1996, 2000), an input-based focus on form technique, to dictogloss tasks, an output-oriented focus-on-form type of instruction to assess their effects in helping beginning-EFL (English as a Foreign Language) learners acquire the simple English passive voice. Two intact…

  15. Human resource processes and the role of the human resources function during mergers and acquisitions in the electricity industry

    NASA Astrophysics Data System (ADS)

    Dass, Ted K.

    Mergers and acquisitions (M&A) have been a popular strategy for organizations to consolidate and grow for more than a century. However, research in this field indicates that M&A are more likely to fail than succeed, with failure rates estimated to be as high as 75%. People-related issues have been identified as important causes of the high failure rate, but these issues are largely neglected until after the deal is closed. One explanation for this neglect is the low involvement of human resource (HR) professionals and the HR function during the M&A process. The strategic HR management literature suggests that a larger role for HR professionals in the M&A process would enable organizations to identify potential problems early and devise appropriate solutions. However, empirical research from an HR perspective has been scarce in this area. This dissertation examines the role of the HR function and the HR processes followed in organizations during M&A. Employing a case-study research design, this study examines M&A undertaken by two large organizations in the electricity industry through the lens of a "process" perspective. Based on converging evidence, the case studies address three sets of related issues: (1) how do organizations undertake and manage M&A; (2) what is the extent of HR involvement in M&A and what role does it play in the M&A process; and (3) what factors explain HR involvement in the M&A process and, more generally, in the formulation of corporate goals and strategies. Results reveal the complexity of issues faced by organizations in undertaking M&A, the variety of roles played by HR professionals, and the importance of several key contextual factors, internal and external to the organization, that influence HR involvement in the M&A process. Further, several implications for practice and future research are explored.

  16. Intonational Phrase Structure Processing at Different Stages of Syntax Acquisition: ERP Studies in 2-, 3-, and 6-Year-Old Children

    ERIC Educational Resources Information Center

    Mannel, Claudia; Friederici, Angela D.

    2011-01-01

    This study explored the electrophysiology underlying intonational phrase processing at different stages of syntax acquisition. Developmental studies suggest that children's syntactic skills advance significantly between 2 and 3 years of age. Here, children of three age groups were tested on phrase-level prosodic processing before and after this…

  17. Photogrammetric 3D Acquisition and Analysis of Medicamentous Induced Pilomotor Reflex ("Goose Bumps")

    NASA Astrophysics Data System (ADS)

    Schneider, D.; Hecht, A.

    2016-06-01

    In a current study at the University Hospital Dresden, Department of Neurology, the autonomic function of nerve fibres of the human skin is investigated. For this purpose, a specific medicament is applied to a small area of the skin of a test person, which results in a local reaction (goose bumps). From the extent of the area where the stimulation of the nerve fibres is visible, it can be concluded how the nerve function of the skin works. The aim of the investigation described in the paper is to generate 3D data of these goose bumps. The paper therefore analyses and compares different photogrammetric surface measurement techniques with regard to their suitability for the 3D acquisition of silicone imprints of the human skin. Furthermore, an appropriate procedure for processing and analysing the recorded point cloud data is developed and presented. It was experimentally proven that by using (low-cost) photogrammetric techniques, medicamentously induced goose bumps can be acquired in three dimensions and analysed almost fully automatically with respect to the medical research questions. The relative accuracy was determined to be 1% (RMSE) of the area and volume, respectively, of an individual goose bump.
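To illustrate the kind of per-bump analysis involved, area and volume can be estimated from a gridded height map by counting occupied cells. This is a hypothetical sketch: it assumes a regular grid already interpolated from the measured point cloud, which is not how the paper's pipeline is specified.

```python
import numpy as np

def bump_metrics(height, dx):
    """Approximate base area (mm^2) and volume (mm^3) of a single
    raised bump, given a 2D height map in mm sampled on a regular
    grid of pitch dx (mm). Cells above zero count as the bump."""
    mask = height > 0
    area = mask.sum() * dx * dx            # occupied cells x cell area
    volume = height[mask].sum() * dx * dx  # column heights x cell area
    return area, volume
```

With such per-bump metrics in hand, a 1% RMSE figure can be checked by comparing against reference measurements of the same imprints.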

  18. New Spill Structure Analysis Tools for the VME Based Data Acquisition System ABLASS at GSI

    NASA Astrophysics Data System (ADS)

    Hoffmann, T.; Forck, P.; Liakin, D. A.

    2006-11-01

    In recent years, a comprehensive VME-based data acquisition system for counter applications was developed. This package, called ABLASS (A Beam Loss measurement And Scaling System), is used at the GSI heavy-ion synchrotron (SIS18), at the high-energy beam transfer lines (HEBT) and at the connected experiments. To achieve maximum experimental rate capability and to protect sensitive targets from significant intensity peaks, the particle distribution within the slowly extracted bunched beam has to be qualified. To analyze this spill microstructure by means of scintillator pulses, new sophisticated tools were developed, such as the Q-analysis, which generates evaluated data in the µs region. To measure the time distribution of the particles relative to the bunching RF phase and the probability curve of consecutive particle hits in the ns regime, a VME multi-hit TDC with 25 ps time resolution was integrated into ABLASS. The principles and results of these new methods, substantiated by actual ion-beam data, are presented.

  19. Molecular analysis of nuc-1+, a gene controlling phosphorus acquisition in Neurospora crassa.

    PubMed

    Kang, S; Metzenberg, R L

    1990-11-01

    In response to phosphorus starvation, Neurospora crassa makes several enzymes that are undetectable or barely detectable in phosphate-sufficient cultures. The nuc-1+ gene, whose product regulates the synthesis of these enzymes, was cloned and sequenced. The nuc-1+ gene encodes a protein of 824 amino acids with a predicted molecular weight of 87,429. The amino acid sequence shows homology with two yeast proteins whose functions are analogous to that of the NUC-1 protein. Two nuc-1+ transcripts of 3.2 and 3.0 kilobases were detected; they were present in similar amounts during growth at low or high phosphate concentrations. The nuc-2+ gene encodes a product normally required for NUC-1 function, and yet a nuc-2 mutation can be complemented by overexpression of the nuc-1+ gene. This implies physical interactions between NUC-1 protein and the negative regulatory factor(s) PREG and/or PGOV. Analysis of nuc-2 and nuc-1; nuc-2 strains transformed by the nuc-1+ gene suggests that phosphate directly affects the level or activity of the negative regulatory factor(s) controlling phosphorus acquisition.

  20. Streamlined acquisition handbook

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA has always placed great emphasis on the acquisition process, recognizing it as among its most important activities. This handbook is intended to facilitate the application of streamlined acquisition procedures. The development of these procedures reflects the efforts of an action group composed of NASA Headquarters and center acquisition professionals. The intent is to accomplish real change in the acquisition process as a result of this effort. An important part of streamlining the acquisition process is a commitment by the people involved to accomplishing acquisition activities quickly and with high quality. Too often we continue to accomplish work in 'the same old way' without considering available alternatives which would require no changes to regulations, approvals from Headquarters, or waivers of required practice. Similarly, we must be sensitive to schedule opportunities throughout the acquisition cycle, not just once the purchase request arrives at the procurement office. Techniques that have been identified as ways of reducing acquisition lead time while maintaining high quality are presented.

  1. Success or Failure of Automated Data Processing Systems in Physicians' Offices after System Acquisition

    PubMed Central

    Dahm, Lisa L.

    1983-01-01

    Although many sources exist for gleaning information relative to acquiring a data processing system, less material is available on the subject of what the purchaser may expect and must do following the sale. The ingredients for successfully automating a medical practice include: a good plan for the conversion and on-going use of the automated system; proper training initially and plans for future training should the need arise; proper physical facilities; and a positive and cooperative attitude.

  2. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

    A real-time data collection system was developed for the Urbana coherent-scatter radar. The new system, designed for use with a microcomputer, has several advantages over the old system implemented on a minicomputer. The software used to collect the data is described, as well as the processing software used to analyze it. In addition, a magnetic tape format for coherent-scatter data exchange is given.

  3. A tailored 200 parameter VME based data acquisition system for IBA at the Lund Ion Beam Analysis Facility - Hardware and software

    NASA Astrophysics Data System (ADS)

    Elfman, Mikael; Ros, Linus; Kristiansson, Per; Nilsson, E. J. Charlotta; Pallon, Jan

    2016-03-01

    With the recent advances in modern Ion Beam Analysis (IBA), going from one- or few-parameter detector systems to multi-parameter systems, it has been necessary to expand and replace the more than twenty-year-old CAMAC-based system. A new VME multi-parameter (presently up to 200 channels) data acquisition and control system has been developed and implemented at the Lund Ion Beam Analysis Facility (LIBAF). The system is based on the VX-511 Single Board Computer (SBC), acting as master with arbiter functionality, and consists of standard VME modules such as analog-to-digital converters (ADCs), charge-to-digital converters (QDCs), time-to-digital converters (TDCs), scalers, I/O cards, and high-voltage and waveform units. The modules were specially selected to support all of the present detector systems in the laboratory, with the option of future expansion. Typically, the detector systems consist of silicon strip detectors, silicon drift detectors and scintillator detectors, for the detection of charged particles, X-rays and γ-rays. The flow of raw data buffers from the VME bus to their final storage on a 16-terabyte network-attached storage disc (NAS disc) is described. The acquisition process, remotely controlled over one of the SBC's Ethernet channels, is also discussed. The user interface is written in the Kmax software package and is used to control the acquisition process as well as for advanced online and offline data analysis through a user-friendly graphical user interface (GUI). In this work the system implementation, layout and performance are presented. The user interface and possibilities for advanced offline analysis are also discussed and illustrated.

  4. The effects of an action's "age-of-acquisition" on action-sentence processing.

    PubMed

    Gilead, Michael; Liberman, Nira; Maril, Anat

    2016-11-01

    How does our brain allow us to comprehend abstract/symbolic descriptions of human action? Whereas past research suggested that processing action language relies on sensorimotor brain regions, recent work suggests that sensorimotor activation depends on participants' task goals, such that focusing on abstract (vs. concrete) aspects of an action activates "default mode network" (rather than sensorimotor) regions. Following a Piagetian framework, we hypothesized that for actions acquired at an age when abstract/symbolic cognition is fully developed, participants should retrieve abstract-symbolic mental representations even when they focus on the concrete aspects of an action. In two studies, participants processed the concrete (i.e., "how") and abstract (i.e., "why") aspects of late-acquired and early-acquired actions. Consistent with previous research, focusing on the abstract (vs. concrete) aspects of an action resulted in greater activation in the "default mode network". Importantly, activation in these regions was higher when processing later-acquired (vs. earlier-acquired) actions, even when participants' goal was to focus on the concrete aspects of the action. We discuss the implications of the current findings for research on the involvement of concrete representations in abstract cognition. PMID:27431759

  5. A review of breast tomosynthesis. Part II. Image reconstruction, processing and analysis, and advanced applications.

    PubMed

    Sechopoulos, Ioannis

    2013-01-01

    Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms, computer aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. In the companion paper, the first part of this review, the research performed relevant to the image acquisition process is examined. This second part will review the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis. PMID:23298127

  7. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  8. Processing Temporal Constraints and Some Implications for the Investigation of Second Language Sentence Processing and Acquisition. Commentary on Baggio

    ERIC Educational Resources Information Center

    Roberts, Leah

    2008-01-01

    Baggio presents the results of an event-related potential (ERP) study in which he examines the processing consequences of reading tense violations such as *"Afgelopen zondag lakt Vincent de kozijnen van zijn landhuis" (*"Last Sunday Vincent paints the window-frames of his country house"). The violation is arguably caused by a mismatch between the…

  9. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon.

    PubMed

    Lieberman, Amy M; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I

    2015-07-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1), and in deaf adults who were late-learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sublexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received. PMID:25528091

  11. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    NASA Astrophysics Data System (ADS)

    Li, Y. T.; Wittenberg, L. J.

    1992-09-01

    In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner, and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the use of multiple miners is recommended rather than increasing their size. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  13. Age of second language acquisition affects nonverbal conflict processing in children: an fMRI study

    PubMed Central

    Mohades, Seyede Ghazal; Struys, Esli; Van Schuerbeek, Peter; Baeken, Chris; Van De Craen, Piet; Luypaert, Robert

    2014-01-01

    Background In their daily communication, bilinguals switch between two languages, a process that involves the selection of a target language and minimization of interference from a nontarget language. Previous studies have uncovered the neural structure in bilinguals and the activation patterns associated with performing verbal conflict tasks. One question that remains, however, is whether this extra verbal switching affects brain function during nonverbal conflict tasks. Methods In this study, we have used fMRI to investigate the impact of bilingualism in children performing two nonverbal tasks involving stimulus–stimulus and stimulus–response conflicts. Three groups of 8–11-year-old children – bilinguals from birth (2L1), second language learners (L2L), and a control group of monolinguals (1L1) – were scanned while performing a color Simon and a numerical Stroop task. Reaction times and accuracy were logged. Results Compared to monolingual controls, bilingual children showed a higher behavioral congruency effect in these tasks, matched by the recruitment of brain regions that bilinguals generally use for general cognitive control, language processing, or resolving language conflict situations (caudate nucleus, posterior cingulate gyrus, STG, precuneus). Further, the activation of these areas was found to be higher in 2L1 compared to L2L. Conclusion The coupling of longer reaction times with the recruitment of extra language-related brain areas supports the hypothesis that when dealing with language conflicts the specialization of bilinguals hampers the way they process nonverbal conflicts, at least at early stages in life. PMID:25328840

  14. An overview of AmeriFlux data products and methods for data acquisition, processing, and publication

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Poindexter, C.; Agarwal, D.; Papale, D.; van Ingen, C.; Torn, M. S.

    2014-12-01

    The AmeriFlux network encompasses independently managed field sites measuring ecosystem carbon, water, and energy fluxes across the Americas. In close coordination with ICOS in Europe, a new set of flux data and metadata products is being produced and released at the FLUXNET level, including all AmeriFlux sites. This will enable continued releases of a globally standardized set of flux data products. In this release, new formats, structures, and ancillary information are being proposed and adopted. This presentation discusses these aspects, detailing current and future solutions. One of the major revisions was to the BADM (Biological, Ancillary, and Disturbance Metadata) protocols. The updates include structure and variable changes to address new developments in data collection related to flux towers and to facilitate two-way data sharing. In particular, a new organization of templates is now in place, including changes in templates for biomass, disturbances, instrumentation, soils, and others. New variables and an extensive addition to the vocabularies used to describe BADM templates allow for more flexible and comprehensive coverage of field sites and of the data collection methods and results. Another extensive revision is to the data formats, levels, and versions for flux and micrometeorological data. A new selection and revision of data variables and an integrated new definition of data processing levels allow for a more intuitive and flexible notation for the variety of data products. For instance, all variables now include positional information that is tied to BADM instrumentation descriptions. This allows for a better characterization of the spatial representativeness of data points, e.g., individual sensors or the tower footprint. Additionally, a new definition of data levels better characterizes the types of processing and transformations applied to the data across different dimensions (e.g., spatial representativeness of a data point, data quality checks
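The positional information attached to variables can be illustrated with a small parser. This sketch assumes the FLUXNET-style `_H_V_R` numeric suffix convention (horizontal position, vertical position, replicate); treat the exact convention and variable names as assumptions rather than the released specification.

```python
def parse_variable(name):
    """Split a flux variable name into its base name and assumed
    _H_V_R positional qualifiers; return None qualifiers if absent."""
    parts = name.split("_")
    if len(parts) >= 4 and all(p.isdigit() for p in parts[-3:]):
        h, v, r = (int(p) for p in parts[-3:])
        return {"base": "_".join(parts[:-3]), "h": h, "v": v, "r": r}
    # No positional suffix (e.g. a timestamp column)
    return {"base": name, "h": None, "v": None, "r": None}
```

For example, under this assumed convention `TA_1_2_1` would denote air temperature at horizontal position 1, vertical position 2, replicate 1, which is the kind of sensor-level spatial tagging the abstract describes.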

  15. Applying 'Sequential Windowed Acquisition of All Theoretical Fragment Ion Mass Spectra' (SWATH) for systematic toxicological analysis with liquid chromatography-high-resolution tandem mass spectrometry.

    PubMed

    Arnhard, Kathrin; Gottschall, Anna; Pitterl, Florian; Oberacher, Herbert

    2015-01-01

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) has become an indispensable analytical technique in clinical and forensic toxicology for the detection and identification of potentially toxic or harmful compounds. In particular, non-target LC-MS/MS assays enable the extensive and universal screening required in systematic toxicological analysis. An integral part of the identification process is the generation of information-rich product ion spectra which can be searched against libraries of reference mass spectra. Usually, 'data-dependent acquisition' (DDA) strategies are applied for automated data acquisition. In this study, the 'data-independent acquisition' (DIA) method 'Sequential Windowed Acquisition of All Theoretical Fragment Ion Mass Spectra' (SWATH) was combined with LC-MS/MS on a quadrupole-quadrupole-time-of-flight (QqTOF) instrument for acquiring informative high-resolution tandem mass spectra. SWATH performs data-independent fragmentation of all precursor ions entering the mass spectrometer in 21 m/z isolation windows. The whole m/z range of interest is covered by continuous stepping of the isolation window. This allows numerous repeat analyses of each window during the elution of a single chromatographic peak and results in a complete fragment ion map of the sample. Compounds and samples typically encountered in forensic casework were used to assess the performance characteristics of LC-MS/MS with SWATH. Our experiments clearly revealed that SWATH is a sensitive and specific identification technique. SWATH is capable of identifying more compounds at lower concentration levels than DDA does. The dynamic range of SWATH was estimated to be three orders of magnitude. Furthermore, matching of the >600,000 acquired SWATH spectra led to only 408 incorrect calls (false positive rate = 0.06 %). Deconvolution of the generated ion maps was found to be essential for unravelling the full identification power of LC-MS/MS with SWATH. With the available software, however, only semi
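The continuous stepping of the isolation window can be sketched as generating the window boundaries that tile a precursor m/z range. Only the 21 m/z width comes from the abstract; the m/z range and the 1 m/z overlap between adjacent windows below are illustrative assumptions.

```python
def swath_windows(mz_start, mz_stop, width=21.0, overlap=1.0):
    """Return (low, high) isolation windows tiling [mz_start, mz_stop],
    with adjacent windows sharing a small overlap so no precursor
    falls on an uncovered boundary."""
    windows, low = [], mz_start
    while low < mz_stop:
        high = min(low + width, mz_stop)
        windows.append((low, high))
        if high >= mz_stop:
            break
        low = high - overlap  # step the window, keeping the overlap
    return windows
```

Cycling through this window list once per duty cycle, many times across a chromatographic peak, is what yields the complete fragment ion map the abstract describes.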

  16. Acquisition and Analysis of NASA Ames Sunphotometer Measurements during SAGE III Validation Campaigns and other Tropospheric and Stratospheric Research Missions

    NASA Technical Reports Server (NTRS)

    Livingston, John M.

    2004-01-01

    NASA Cooperative Agreement NCC2-1251 provided funding from April 2001 through December 2003 for Mr. John Livingston of SRI International to collaborate with NASA Ames Research Center scientists and engineers in the acquisition and analysis of airborne sunphotometer measurements during various atmospheric field studies. Mr. Livingston participated in instrument calibrations at Mauna Loa Observatory, pre-mission hardware and software preparations, acquisition and analysis of sunphotometer measurements during the missions, and post-mission analysis of data and reporting of scientific findings. The atmospheric field missions included the spring 2001 Intensive of the Asian Pacific Regional Aerosol Characterization Experiment (ACE-Asia), the Asian Dust Above Monterey-2003 (ADAM-2003) experiment, and the winter 2003 Second SAGE III Ozone Loss and Validation Experiment (SOLVE II).

  17. Lithography process window analysis with calibrated model

    NASA Astrophysics Data System (ADS)

    Zhou, Wenzhan; Yu, Jin; Lo, James; Liu, Johnson

    2004-05-01

    As critical dimensions shrink below 0.13 μm, statistical process control (SPC) based on critical dimension (CD) measurements in the lithography process becomes more difficult. The increasing requirements of a shrinking process window call for more accurate determination of the process window center. However, in practical fabrication we found that systematic error introduced by metrology and/or the resist process can significantly affect the process window analysis result. In particular, when simple polynomial functions are used to fit the lithographic data from a focus exposure matrix (FEM), the model fits these systematic errors rather than filtering them out. This inevitably affects the process window analysis and the determination of the best process condition. In this paper, we propose using a calibrated first-principles model for process window analysis. With this method, the systematic metrology error can be filtered out efficiently, giving a more reasonable window analysis result.
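The baseline polynomial approach the paper argues against can be sketched as a quadratic (Bossung-curve) fit of CD versus focus, with the fitted vertex taken as best focus. The data below are synthetic and only illustrate the fitting step; a calibrated first-principles model would replace the polynomial.

```python
import numpy as np

def best_focus(focus, cd):
    """Fit CD(focus) with a quadratic and return the fitted vertex,
    i.e. the focus at which the parabola is extremal."""
    c2, c1, c0 = np.polyfit(focus, cd, 2)
    return -c1 / (2.0 * c2)

# Synthetic FEM slice at one dose: CD in nm vs. focus in um,
# with true best focus at +0.05 um (illustrative numbers).
focus = np.linspace(-0.3, 0.3, 13)
cd = 130 + 80 * (focus - 0.05) ** 2
```

On clean data the vertex is recovered exactly; the paper's point is that real FEM data carry systematic metrology/resist errors that such a polynomial absorbs into its coefficients instead of filtering out.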

  18. Microevolution Analysis of Bacillus coahuilensis Unveils Differences in Phosphorus Acquisition Strategies and Their Regulation.

    PubMed

    Gómez-Lunar, Zulema; Hernández-González, Ismael; Rodríguez-Torres, María-Dolores; Souza, Valeria; Olmedo-Álvarez, Gabriela

    2016-01-01

    Bacterial genomes undergo numerous gene losses and gains that generate genome variability among strains of the same species (microevolution). Our aim was to compare the genomes and relevant phenotypes of three Bacillus coahuilensis strains from two oligotrophic hydrological systems in the Cuatro Ciénegas Basin (México), to unveil the environmental challenges that this species copes with and the microevolutionary differences among these genotypes. Since the strains were isolated from a low-P environment, we placed emphasis on the search for different phosphorus acquisition strategies. The three B. coahuilensis strains exhibited similar numbers of coding DNA sequences, of which 82% (2,893) constituted the core genome and 18% corresponded to accessory genes. Most of the genes in this last group were associated with mobile genetic elements (MGEs) or were annotated as hypothetical proteins. Ten percent of the pangenome consisted of strain-specific genes. Alignment of the three B. coahuilensis genomes indicated a high level of synteny and revealed the presence of several genomic islands. Unexpectedly, one of these islands contained genes that encode the 2-keto-3-deoxymannooctulosonic acid (Kdo) biosynthesis enzymes, a feature associated with the cell walls of Gram-negative bacteria. Some microevolutionary changes were clearly associated with MGEs. Our analysis revealed inconsistencies between phenotype and genotype, which we suggest result from the impossibility of mapping regulatory features onto genome analysis. Experimental results revealed variability in the types and numbers of auxotrophies between the strains that could not consistently be explained by in silico metabolic models. Several intraspecific differences in preferences for carbohydrate and phosphorus utilization were observed. Regarding phosphorus recycling, scavenging, and storage, variations were found among the three genomes. The three strains exhibited differences regarding alkaline phosphatase that

  1. Broadband network on-line data acquisition system with web based interface for control and basic analysis

    NASA Astrophysics Data System (ADS)

    Polkowski, Marcin; Grad, Marek

    2016-04-01

    The passive seismic experiment "13BB Star" has been operating since mid-2013 in northern Poland and consists of 13 broadband seismic stations. One element of this experiment is a dedicated on-line data acquisition system comprising both client-side (station) and server-side modules, with a web-based interface that allows monitoring of network status and provides tools for preliminary data analysis. The station side is controlled by an ARM Linux board programmed to maintain a 3G/EDGE internet connection, receive data from the digitizer, and send data to the central server along with auxiliary parameters such as temperature, voltage, and electric-current measurements. The station side is managed by a set of easy-to-install PHP scripts. Data are transmitted securely over the SSH protocol to the central server, a dedicated Linux machine that receives and processes all data from all stations, including the auxiliary parameters. The server-side software is written in PHP and Python. It also allows remote station configuration and provides a user-friendly web-based interface. All collected data can be displayed for each day and station. The interface also allows manual creation of event-oriented plots with different filtering options and provides numerous status and statistics views. Our solution is very flexible and easy to modify. In this presentation we would like to share our solution and experience. The National Science Centre, Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
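The auxiliary parameters the stations report (temperatures, voltages, currents) suggest a simple periodic status record sent alongside the waveform data. The field names and JSON encoding below are assumptions for illustration, not the actual 13BB Star protocol.

```python
import json
import time

def status_record(station, temps, voltage, current_ma):
    """Serialize one station-side status report as JSON.
    Sorted keys keep the record layout deterministic."""
    return json.dumps({
        "station": station,
        "time": int(time.time()),
        "temperatures_c": temps,
        "voltage_v": voltage,
        "current_ma": current_ma,
    }, sort_keys=True)
```

A record like this would be appended to each data transfer so the server can track station health together with the seismic data stream.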

  2. Analysis of crisis intervention processes.

    PubMed

    Tschacher, Wolfgang; Jacobshagen, Nina

    2002-01-01

    The remediation processes in psychosocial crisis intervention were modeled focusing on cognitive orientation. Frequent observations and subsequent process modeling constitute a novel approach to process research and reveal process-outcome associations. A sample of 40 inpatients who were assigned to treatment in a crisis intervention unit was monitored in order to study the process of crisis intervention. The process data consisted of patients' self-ratings of the variables mood, tension, and cognitive orientation, which were assessed three times a day throughout hospitalization (M = 22.6 days). Linear time series models (vector autoregression) of the process data were computed to describe the prototypical dynamic patterns of the sample. Additionally, the outcome of crisis intervention was evaluated by pre-post questionnaires. Linear trends were found pointing to an improvement of mood, a reduction of tension, and an increase of outward cognitive orientation. Time series modeling showed that, on average, outward cognitive orientation preceded improved mood. The time series models partially predicted the treatment effect, notably the outcome domain "reduction of social anxiety," yet did not predict the domain of symptom reduction. In conclusion, crisis intervention should focus on having patients increasingly engage in outward cognitive orientation in order to stabilize mood, reduce anxiety, and activate their resources.
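The vector-autoregression modeling can be illustrated with a minimal VAR(1) least-squares fit on synthetic three-variable data, standing in for mood, tension, and cognitive orientation. This is a sketch of the technique, not the study's estimation procedure; the coefficient matrix below is invented.

```python
import numpy as np

def fit_var1(x):
    """x: (T, k) multivariate series. Estimate A in x[t] ~= A @ x[t-1]
    by ordinary least squares over all consecutive pairs."""
    past, future = x[:-1], x[1:]
    A_T, *_ = np.linalg.lstsq(past, future, rcond=None)
    return A_T.T

# Simulate a stationary VAR(1) process with known dynamics.
rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.2, 0.0],
                   [0.0, 0.4, 0.1],
                   [0.1, 0.0, 0.6]])
x = np.zeros((500, 3))
for t in range(1, 500):
    x[t] = A_true @ x[t - 1] + 0.1 * rng.standard_normal(3)
A_hat = fit_var1(x)
```

In this framework, an off-diagonal coefficient such as a nonzero entry coupling yesterday's cognitive orientation to today's mood is exactly the kind of lead-lag relation the study reports (outward orientation preceding improved mood).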

  3. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document form a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended, but the analysis can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
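The core of such grain statistics is a segmentation step: threshold the grayscale image and count connected grains. The toy version below sketches only that counting logic (4-connectivity flood fill on a list-of-lists image); real workflows use the toolboxes named above and add size, shape, and orientation measurements.

```python
def count_grains(img, threshold):
    """Count 4-connected regions of pixels brighter than threshold.
    img is a rectangular list of lists of intensities."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    grains = 0
    for i in range(h):
        for j in range(w):
            if img[i][j] > threshold and not seen[i][j]:
                grains += 1
                stack = [(i, j)]
                while stack:  # flood-fill one grain
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and \
                       img[y][x] > threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return grains
```

From the labeled grains, statistics such as mean grain area follow by dividing the above-threshold pixel count by the number of grains and scaling by the pixel size.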

  4. An architecture for real time data acquisition and online signal processing for high throughput tandem mass spectrometry

    SciTech Connect

    Shah, Anuj R.; Jaitly, Navdeep; Zuljevic, Nino; Monroe, Matthew E.; Liyu, Andrei V.; Polpitiya, Ashoka D.; Adkins, Joshua N.; Belov, Mikhail E.; Anderson, Gordon A.; Smith, Richard D.; Gorton, Ian

    2010-12-09

    Independent, greedy collection of data events using simple heuristics results in massive over-sampling of the prominent data features in large-scale studies over what should be achievable through “intelligent,” online acquisition of such data. As a result, data generated are more aptly described as a collection of a large number of small experiments rather than a true large-scale experiment. Nevertheless, achieving “intelligent,” online control requires tight interplay between state-of-the-art, data-intensive computing infrastructure developments and analytical algorithms. In this paper, we propose a Software Architecture for Mass spectrometry-based Proteomics coupled with Liquid chromatography Experiments (SAMPLE) to develop an “intelligent” online control and analysis system to significantly enhance the information content from each sensor (in this case, a mass spectrometer). Using online analysis of data events as they are collected and decision theory to optimize the collection of events during an experiment, we aim to maximize the information content generated during an experiment by the use of pre-existing knowledge to optimize the dynamic collection of events.

  5. THE BLANCO COSMOLOGY SURVEY: DATA ACQUISITION, PROCESSING, CALIBRATION, QUALITY DIAGNOSTICS, AND DATA RELEASE

    SciTech Connect

    Desai, S.; Mohr, J. J.; Semler, D. R.; Liu, J.; Bazin, G.; Zenteno, A.; Armstrong, R.; Bertin, E.; Allam, S. S.; Buckley-Geer, E. J.; Lin, H.; Tucker, D.; Barkhouse, W. A.; Cooper, M. C.; Hansen, S. M.; High, F. W.; Lin, Y.-T.; Ngeow, C.-C.; Rest, A.; Song, J.

    2012-09-20

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, −55°) and (23 hr, −55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out point-spread function-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 mas. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from the Two Micron All Sky Survey, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematic floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier spread_model produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1 + z) = 0.054 with an outlier fraction η < 5% to z ≈ 1. We highlight some selected science results to date and provide a full description of the released data products.

  6. The Blanco Cosmology Survey: Data Acquisition, Processing, Calibration, Quality Diagnostics and Data Release

    SciTech Connect

    Desai, S.; Armstrong, R.; Mohr, J.J.; Semler, D.R.; Liu, J.; Bertin, E.; Allam, S.S.; Barkhouse, W.A.; Bazin, G.; Buckley-Geer, E.J.; Cooper, M.C.; /UC, Irvine /Lick Observ. /UC, Santa Cruz

    2012-04-01

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, −55°) and (23 hr, −55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out PSF-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 milliarcsec. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from 2MASS, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematic floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1 + z) = 0.054 with an outlier fraction η < 5% to z ≈ 1. We highlight some selected science results to date and provide a full description of the released data products.

  7. Identification, isolation, and analysis of a gene cluster involved in iron acquisition by Pseudomonas mendocina ymp

    PubMed Central

    Awaya, Jonathan D.

    2013-01-01

    Microbial acquisition of iron from natural sources in aerobic environments is a little-studied process that may lead to mineral instability and trace metal mobilization. Pseudomonas mendocina ymp was isolated from the Yucca Mountain Site for long-term nuclear waste storage. Its ability to solubilize a variety of Fe-containing minerals under aerobic conditions has been previously investigated but its molecular and genetic potential remained uncharacterized. Here, we have shown that the organism produces a hydroxamate-based and not a catecholate-based siderophore that is synthesized via non-ribosomal peptide synthetases. Gene clustering patterns observed in other Pseudomonads suggested that hybridizing multiple probes to the same library could allow for the identification of one or more clusters of syntenic siderophore-associated genes. Using this approach, two independent clusters were identified. An unfinished draft genome sequence of P. mendocina ymp indicated that these mapped to two independent contigs. The sequenced clusters were investigated informatically and shown to contain respectively a potentially complete set of genes responsible for siderophore biosynthesis, uptake, and regulation, and an incomplete set of genes with low individual homology to siderophore-associated genes. A mutation in the cluster's pvdA homolog (pmhA) resulted in a siderophore-null phenotype, which could be reversed by complementation. The organism likely produces one siderophore with possibly different isoforms and a peptide backbone structure containing seven residues (predicted sequence: Acyl-Asp-Dab-Ser-fOHOrn-Ser-fOHOrn). A similar approach could be applied for discovery of Fe- and siderophore-associated genes in unsequenced or poorly annotated organisms. PMID:18058194

  8. Data Acquisition and Prompt Analysis System for High Altitude Balloon Experiments

    NASA Technical Reports Server (NTRS)

    Sarkady, A. A.; Chupp, E. L.; Dickey, J. W.

    1968-01-01

    An inexpensive and simple data acquisition system has been developed for balloon-borne experiments and has been tested with a gamma ray detector in a balloon flight launched from Palestine, Texas. The detector used for the test consisted of an NaI(Tl) scintillation crystal encased in a 1/8 in. plastic scintillator charged-particle shield. The combination was viewed by a single photomultiplier, and charged-particle gating was accomplished by a conventional phoswich discriminator. The pulse height analysis of the NaI events not associated with prompt charged-particle interactions is accomplished by converting to a time spectrum using an airborne height-to-time converter. A range of pulse widths from 5 microseconds to 250 microseconds corresponds to energy losses in NaI from 100 to 1000 keV. The time spectrum information, along with charged-particle events and barometric pressure, is fed to a mixer which modulates a 252.4 Mc FM transmitter. The original scintillator spectrum is recovered on the ground utilizing conversion circuitry at the receiver video output and a 128-channel commercial pulse height analyzer. The charged-particle events of standard time width are stored with the spectrum at a fixed channel position and are therefore represented by a sharp line riding on the lower part of the NaI energy loss spectrum. An energy loss greater than 1000 keV is represented by the maximum pulse width of the converter and stored in the last analyzer channel. Barometric pressure data are transmitted by low-frequency modulation of the same FM carrier. In flight operation, the receiver video output can be recorded on a wide-band tape recorder and simultaneously analyzed by the 128-channel analyzer, or the telemetered data can be analyzed later. The flight system features high pulse resolution, essentially instantaneous time response, a high data rate, and flexibility, at modest cost. A detailed description of the system and its operating performance is given.

  9. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the basis. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and a lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  10. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the basis. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and a lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  11. Untargeted, spectral library-free analysis of data-independent acquisition proteomics data generated using Orbitrap mass spectrometers.

    PubMed

    Tsou, Chih-Chiang; Tsai, Chia-Feng; Teo, Guo Ci; Chen, Yu-Ju; Nesvizhskii, Alexey I

    2016-08-01

    We describe an improved version of the data-independent acquisition (DIA) computational analysis tool DIA-Umpire, and show that it enables highly sensitive, untargeted, and direct (spectral library-free) analysis of DIA data obtained using the Orbitrap family of mass spectrometers. DIA-Umpire v2 implements an improved feature detection algorithm with two additional filters based on the isotope pattern and fractional peptide mass analysis. The targeted re-extraction step of DIA-Umpire is updated with an improved scoring function and a more robust, semiparametric mixture modeling of the resulting scores for computing posterior probabilities of correct peptide identification in a targeted setting. Using two publicly available Q Exactive DIA datasets generated using HEK-293 cells and human liver microtissues, we demonstrate that DIA-Umpire can identify a similar number of peptide ions as conventional data-dependent acquisition, but with better identification reproducibility between replicates and samples. We further demonstrate the utility of DIA-Umpire using a series of Orbitrap Fusion DIA experiments with HeLa cell lysates profiled using conventional data-dependent acquisition and using DIA with different isolation window widths.
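
    The posterior-probability step can be illustrated with a fully parametric simplification: a two-component Gaussian mixture fitted by expectation-maximization, one component modeling incorrect and the other correct identifications. DIA-Umpire itself uses a semiparametric mixture; the scores below are synthetic and the component parameters are invented:

```python
import math, random

def em_two_gauss(scores, iters=200):
    """Fit a two-component Gaussian mixture by EM and return, for each score,
    the posterior probability of belonging to the high-scoring component."""
    mu0, mu1 = min(scores), max(scores)        # low = incorrect, high = correct
    s0 = s1 = (mu1 - mu0) / 4 or 1.0
    w1 = 0.5
    def pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    post = [0.5] * len(scores)
    for _ in range(iters):
        # E-step: posterior responsibility of the high component for each score
        post = []
        for x in scores:
            p1 = w1 * pdf(x, mu1, s1)
            p0 = (1 - w1) * pdf(x, mu0, s0)
            post.append(p1 / (p1 + p0))
        # M-step: re-estimate weights, means, and standard deviations
        n1 = sum(post)
        n0 = len(scores) - n1
        mu1 = sum(p * x for p, x in zip(post, scores)) / n1
        mu0 = sum((1 - p) * x for p, x in zip(post, scores)) / n0
        s1 = max(1e-3, math.sqrt(sum(p * (x - mu1) ** 2
                                     for p, x in zip(post, scores)) / n1))
        s0 = max(1e-3, math.sqrt(sum((1 - p) * (x - mu0) ** 2
                                     for p, x in zip(post, scores)) / n0))
        w1 = n1 / len(scores)
    return post

random.seed(0)
# Synthetic re-extraction scores: 300 incorrect (low) and 100 correct (high).
scores = ([random.gauss(1.0, 0.5) for _ in range(300)]
          + [random.gauss(4.0, 0.7) for _ in range(100)])
post = em_two_gauss(scores)
# Mean posterior should be near 1 for the correct group, near 0 for the incorrect.
```

    A semiparametric version relaxes the Gaussian assumption on one or both components, which is more robust when score distributions are skewed.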

  12. Multislice perfusion of the kidneys using parallel imaging: image acquisition and analysis strategies.

    PubMed

    Gardener, Alexander G; Francis, Susan T

    2010-06-01

    Flow-sensitive alternating inversion recovery arterial spin labeling with parallel imaging acquisition is used to acquire single-shot, multislice perfusion maps of the kidney. A considerable problem for arterial spin labeling methods, which are based on sequential subtraction, is the movement of the kidneys due to respiratory motion between acquisitions. The effects of breathing strategy (free, respiratory-triggered and breath hold) are studied and the use of background suppression is investigated. The application of movement correction by image registration is assessed and perfusion rates are measured. Postacquisition image realignment is shown to improve visual quality and subsequent perfusion quantification. Using such correction, data can be collected from free breathing alone, without the need for a good respiratory trace and in the shortest overall acquisition time, advantageous for patient comfort. The addition of background suppression to arterial spin labeling data is shown to reduce the perfusion signal-to-noise ratio and underestimate perfusion.

  13. Performance analysis for the expanding search PN acquisition algorithm. [pseudonoise in spread spectrum transmission

    NASA Technical Reports Server (NTRS)

    Braun, W. R.

    1982-01-01

    An approach is described for approximating the cumulative probability distribution of the acquisition time of the serial pseudonoise (PN) search algorithm. The results are applicable to both variable and fixed dwell time systems. The theory is developed for the case where some a priori information is available on the PN code epoch (reacquisition problem or acquisition of very long codes). Also considered is the special case of a search over the whole code. The accuracy of the approximation is demonstrated by comparisons with published exact results for the fixed dwell time algorithm.
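
    The quantity being approximated can be sketched by enumeration for an idealized fixed dwell time search: q code-phase cells swept cyclically, a single-dwell detection probability, a uniform prior on the epoch, and false alarms ignored. This is a simplification of the paper's analysis, not its approximation method:

```python
def acq_time_cdf(q, pd, max_dwells):
    """CDF of serial PN search acquisition time, in dwells: q code-phase
    cells swept cyclically, per-dwell detection probability pd, uniform
    prior on the true cell, misses allowed, false alarms ignored."""
    cdf = [0.0] * (max_dwells + 1)
    for j in range(1, q + 1):          # true cell is tested at dwell j of each sweep
        k = 0                          # number of full sweeps missed first
        while k * q + j <= max_dwells:
            n = k * q + j              # dwell index at which detection occurs
            p = (1.0 / q) * pd * (1 - pd) ** k
            for m in range(n, max_dwells + 1):
                cdf[m] += p            # detection at dwell n implies acquired by m >= n
            k += 1
    return cdf

cdf = acq_time_cdf(q=100, pd=0.9, max_dwells=300)
print(round(cdf[100], 3), round(cdf[200], 3))  # 0.9 0.99 -- one sweep, two sweeps
```

    A priori epoch information (the reacquisition case discussed above) corresponds to replacing the uniform 1/q prior with a concentrated one, which shifts the CDF sharply leftward.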

  14. The Independent Technical Analysis Process

    SciTech Connect

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  15. NMDA Receptor-Dependent Processes in the Medial Prefrontal Cortex Are Important for Acquisition and the Early Stage of Consolidation during Trace, but Not Delay Eyeblink Conditioning

    ERIC Educational Resources Information Center

    Takehara-Nishiuchi, Kaori; Kawahara, Shigenori; Kirino, Yutaka

    2005-01-01

    Permanent lesions in the medial prefrontal cortex (mPFC) affect acquisition of conditioned responses (CRs) during trace eyeblink conditioning and retention of remotely acquired CRs. To clarify further roles of the mPFC in this type of learning, we investigated the participation of the mPFC in mnemonic processes both during and after daily…

  16. Since When or How Often? Dissociating the Roles of Age of Acquisition (AoA) and Lexical Frequency in Early Visual Word Processing

    ERIC Educational Resources Information Center

    Adorni, Roberta; Manfredi, Mirella; Proverbio, Alice Mado

    2013-01-01

    The aim of the study was to investigate the effect of both word age of acquisition (AoA) and frequency of occurrence on the timing and topographical distribution of ERP components. The processing of early- versus late-acquired words was compared with that of high-frequency versus low-frequency words. Participants were asked to perform an…

  17. Variation in the application of natural processes: language-dependent constraints in the phonological acquisition of bilingual children.

    PubMed

    Faingold, E D

    1996-09-01

    This paper studies phonological processes and constraints on early phonological and lexical development, as well as the strategies employed by a young Spanish-, Portuguese-, and Hebrew-speaking child, Nurit (the author's niece), in the construction of her early lexicon. Nurit's linguistic development is compared to that of another Spanish-, Portuguese-, and Hebrew-speaking child, Noam (the author's son). Noam's and Nurit's linguistic development is contrasted with that of Berman's (1977) English- and Hebrew-speaking daughter (Shelli). The simultaneous acquisition of similar (closely related) languages such as Spanish and Portuguese versus that of nonrelated languages such as English and Hebrew yields different results: Children acquiring similar languages seem to prefer maintenance as a strategy for the construction of their early lexicon, while children exposed to nonrelated languages appear to prefer reduction to a large extent (Faingold, 1990). The Spanish- and Portuguese-speaking children's high accuracy stems from a wider choice of target words, where the diachronic development of two closely related languages provides a simplified model lexicon to the child. PMID:8865623

  18. Integrated Processing of High Resolution Topographic Data for Soil Erosion Assessment Considering Data Acquisition Schemes and Surface Properties

    NASA Astrophysics Data System (ADS)

    Eltner, A.; Schneider, D.; Maas, H.-G.

    2016-06-01

    Soil erosion is a decisive earth surface process strongly influencing the fertility of arable land. Several options exist to detect soil erosion at the scale of large field plots (here 600 m²), which comprise different advantages and disadvantages depending on the applied method. In this study, the benefits of unmanned aerial vehicle (UAV) photogrammetry and terrestrial laser scanning (TLS) are exploited to quantify soil surface changes. Before data combination, TLS data are co-registered to the DEMs generated with UAV photogrammetry. TLS data are used to detect global as well as local errors in the DEMs calculated from UAV images. Additionally, TLS data are considered for vegetation filtering. In a complementary step, DEMs from UAV photogrammetry are utilised to detect systematic TLS errors and to further filter TLS point clouds in regard to unfavourable scan geometry (i.e. incidence angle and footprint) on gentle hillslopes. In addition, surface roughness is integrated as an important parameter to evaluate TLS point reliability because of the increasing footprints, and thus area of signal reflection, with increasing distance to the scanning device. The developed fusion tool allows for the estimation of reliable data points from each data source, considering the data acquisition geometry and surface properties, to finally merge both data sets into a single soil surface model. Data fusion is performed for three different field campaigns at a Mediterranean field plot. Successive DEM evaluation reveals a continuous decrease of soil surface roughness, the reappearance of former wheel tracks, and local soil particle relocation patterns.
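
    A common way to turn two such surface models into change estimates is a DEM of Difference with a propagated-error level of detection, below which change is treated as noise. A minimal sketch with invented elevations and error values (the fusion tool described above additionally weights points by scan geometry and roughness):

```python
def dem_of_difference(dem_t0, dem_t1, sigma0, sigma1, k=1.96):
    """Cell-wise DEM differencing: changes smaller than the propagated
    level of detection k * sqrt(s0^2 + s1^2) are masked out (None)."""
    lod = k * (sigma0 ** 2 + sigma1 ** 2) ** 0.5
    diff = []
    for row0, row1 in zip(dem_t0, dem_t1):
        diff.append([(b - a) if abs(b - a) >= lod else None
                     for a, b in zip(row0, row1)])
    return diff, lod

# Toy 3x3 DEMs (elevations in m); per-survey vertical errors of ~1 cm.
before = [[10.00, 10.02, 10.05],
          [10.01, 10.03, 10.04],
          [10.02, 10.02, 10.03]]
after  = [[10.00, 10.01, 10.05],
          [10.01,  9.95, 10.04],
          [10.02, 10.02,  9.98]]
diff, lod = dem_of_difference(before, after, sigma0=0.01, sigma1=0.01)
eroded = sum(c for row in diff for c in row if c is not None and c < 0)
print(round(eroded, 2))  # -0.13: net lowering over detectable cells, in m
```

    Multiplying the summed detectable change by the cell area converts it to an eroded volume for the plot.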

  19. Vanadium tube processing and analysis

    SciTech Connect

    Kautz, D.D.; Tanaka, G.J.

    1993-08-11

    Vanadium tubing obtained from Century Tubes, a custom tubing manufacturer, was studied to determine as-received quality and fabricability. Applications for this tubing involve crimping and sealing operations at Pantex Plant requiring very high levels of leak-tightness (leak rates less than 10⁻⁸ atm-cc He/sec). The as-received material had poor OD and ID surface finish and cleanliness that needed to be improved before use in component fabrication. Savannah River Technical Center (SRTC) personnel developed a cleaning procedure to make this tubing acceptable for crimping and sealing operations. After suitably cleaning the tubing, we tested several tube sealing techniques and all showed some degree of success. Pantex Plant personnel are now implementing a tube sealing process very similar to one of the techniques studied, a mechanical crimp followed by seal welding.

  20. Seismic acquisition parameters analysis for deep weak reflectors in the South Yellow Sea

    NASA Astrophysics Data System (ADS)

    Liu, Kai; Liu, Huaishan; Wu, Zhiqiang; Yue, Long

    2016-10-01

    The Mesozoic-Paleozoic marine residual basin in the South Yellow Sea (SYS) is a significant deep potential hydrocarbon reservoir. However, imaging of the deep prospecting target is quite challenging due to the specific seismic-geological conditions. In the Central and Wunansha Uplifts, the penetration of the seismic wavefield is limited by the shallow high-velocity layers (HVLs) and the weak reflections in the deep carbonate rocks. With the conventional marine seismic acquisition technique, the deep weak reflection is difficult to image and identify. In this paper, we confirm through numerical simulation that the combination of a multi-level air-gun array and an extended cable in seismic acquisition is crucial for improving imaging quality. Based on the velocity model derived from the geological interpretation, we performed two-dimensional finite difference forward modeling. The numerical simulation results show that the use of the multi-level air-gun array can enhance low-frequency energy and that the wide-angle reflection received at far offsets of the extended cable has a higher signal-to-noise ratio (SNR) and higher energy. Therefore, we have demonstrated that the unconventional wide-angle seismic acquisition technique mentioned above could overcome the difficulty in imaging the deep weak reflectors of the SYS, and it may be useful for the design of practical seismic acquisition schemes in this region.
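
    The forward-modeling step can be sketched in one dimension: a second-order finite-difference update of the acoustic wave equation through a two-layer model. The study itself uses 2-D modeling with a multi-level air-gun source; all velocities, grid spacings, and the source wavelet below are illustrative:

```python
import math

def fd_wave_1d(velocity, nt, dt, dx, src):
    """Second-order finite-difference solution of the 1-D acoustic wave
    equation u_tt = v^2 u_xx with a band-limited source injected at src."""
    nx = len(velocity)
    prev, curr = [0.0] * nx, [0.0] * nx
    for it in range(nt):
        nxt = [0.0] * nx
        for i in range(1, nx - 1):
            lap = curr[i - 1] - 2 * curr[i] + curr[i + 1]
            nxt[i] = 2 * curr[i] - prev[i] + (velocity[i] * dt / dx) ** 2 * lap
        t = (it - 20) * dt                    # derivative-of-Gaussian wavelet
        nxt[src] += -t * math.exp(-(t / (10 * dt)) ** 2)
        prev, curr = curr, nxt
    return curr

# Two-layer model: slow overburden above a fast "carbonate" layer.
vel = [1500.0] * 100 + [4500.0] * 100
# Stability requires max(v) * dt / dx < 1 (here 4500 * 0.0005 / 5 = 0.45).
wave = fd_wave_1d(vel, nt=300, dt=0.0005, dx=5.0, src=50)
```

    The 2-D extension adds a second spatial derivative term to the Laplacian and records synthetic traces at receiver offsets, which is how far-offset wide-angle energy can be compared against near-offset arrivals.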

  1. Some Thoughts on the Contrastive Analysis of Features in Second Language Acquisition

    ERIC Educational Resources Information Center

    Lardiere, Donna

    2009-01-01

    In this article I discuss the selection and assembly of formal features in second language acquisition. Assembling the particular lexical items of a second language (L2) requires that the learner reconfigure features from the way these are represented in the first language (L1) into new formal configurations on possibly quite different types of…

  2. Morphological Awareness in Literacy Acquisition of Chinese Second Graders: A Path Analysis

    ERIC Educational Resources Information Center

    Zhang, Haomin

    2016-01-01

    The present study tested a path diagram regarding the contribution of morphological awareness (MA) to early literacy acquisition among Chinese-speaking second graders (N = 123). Three facets of MA were addressed, namely derivational awareness, compound awareness and compound structure awareness. The model aimed to test a theory of causal order…

  3. Border Crossings? Exploring the Intersection of Second Language Acquisition, Conversation Analysis, and Foreign Language Pedagogy

    ERIC Educational Resources Information Center

    Mori, Junko

    2007-01-01

    This article explores recent changes in the landscape of second language acquisition (SLA) and foreign language pedagogical (FLP) research. Firth and Wagner's (1997) proposal for the reconceptualization of SLA has been supported by SLA and FLP researchers who share the sentiment concerning the need for increased attention to social and contextual…

  4. Analysis of the Effect a Student-Centred Mobile Learning Instructional Method Has on Language Acquisition

    ERIC Educational Resources Information Center

    Oberg, Andrew; Daniels, Paul

    2013-01-01

    In this study a self-paced instructional method based on the use of Apple's iPod Touch personal mobile devices to deliver content was compared with a group-oriented instructional method of content delivery in terms of learner acquisition of course material. One hundred and twenty-two first-year Japanese university students in four classes were…

  5. Improving Data Analysis in Second Language Acquisition by Utilizing Modern Developments in Applied Statistics

    ERIC Educational Resources Information Center

    Larson-Hall, Jenifer; Herrington, Richard

    2010-01-01

    In this article we introduce language acquisition researchers to two broad areas of applied statistics that can improve the way data are analyzed. First we argue that visual summaries of information are as vital as numerical ones, and suggest ways to improve them. Specifically, we recommend choosing boxplots over barplots and adding locally…

  6. EMOTIONS AND IMAGES IN LANGUAGE--A LEARNING ANALYSIS OF THEIR ACQUISITION AND FUNCTION.

    ERIC Educational Resources Information Center

    STAATS, ARTHUR W.

    This article presented theoretical and experimental analyses concerning important aspects of language. It was suggested that a learning theory which integrates instrumental and classical conditioning, cutting across theoretical lines, could serve as the basis for a comprehensive theory of language acquisition and function. The paper illustrated the…

  7. The information transfer and knowledge acquisition geographies of family caregivers: an analysis of Canada's Compassionate Care Benefit.

    PubMed

    Crooks, Valorie A; Williams, Allison; Stajduhar, Kelli I; Allan, Diane E; Cohen, S Robin

    2007-09-01

    The authors explore an underdeveloped area of health geography by examining information transfer and knowledge acquisition for a health-related social program. Specifically, they discuss the findings of a small-scale utilization-focused evaluation of Canada's Compassionate Care Benefit (CCB). The CCB allows workers who are eligible for employment insurance to leave work to care for family members at end-of-life. Using the findings of 25 interviews with family caregivers, the authors explore their geographies of information transfer and knowledge acquisition. First, however, they introduce their respondent group and provide an overview of their socio-spatial lives as family caregivers. They then examine 3 specific thematic findings: awareness of the CCB, access to information related to the CCB, and the application process. The authors discuss the implications of the findings for the information needs and burdens of family caregivers and for Canadian nursing practice. They also consider directions for future CCB research. PMID:17970459

  8. Analysis of factors determining enterprise value of company merger and acquisition: A case study of coal in Kalimantan, Indonesia

    NASA Astrophysics Data System (ADS)

    Candra, Ade; Pasasa, Linus A.; Simatupang, Parhimpunan

    2015-09-01

    The main purpose of this paper is to examine the relationship between technical, financial, and legal factors and enterprise value in mergers and acquisitions of coal companies in Kalimantan, Indonesia over the last 10 years. Data were obtained from secondary sources within the companies studied and from data published on the internet. The resulting dataset comprises 46 secondary records with the parameters resources, reserves, stripping ratio, calorific value, distance from pit to port, distance from port to vessel, production per annum, cost from pit to port, cost from port to vessel, royalties, coal price, and permit status. The data were analyzed using structural equation modeling (SEM) to determine the factors that most significantly influence the enterprise value of coal companies in Kalimantan. The results show that technical factors most affect enterprise value in coal-company mergers and acquisitions, with financial factors second.

  9. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions

    PubMed Central

    Shenoy, Shailesh M.

    2016-01-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software’s support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity. PMID:27516727

  10. Enhanced Data-Acquisition System

    NASA Technical Reports Server (NTRS)

    Mustain, Roy W.

    1990-01-01

    Time-consuming, costly digitization of analog signals on magnetic tape eliminated. Proposed data-acquisition system provides nearly immediate access to data in incoming signals by digitizing and recording them both on magnetic tape and on optical disk. Tape and/or disk later played back to reconstruct signals in analog or digital form for analysis. Of interest in industrial and scientific applications in which it is necessary to digitize, store, and/or process large quantities of experimental data.

  11. Data acquisition and analysis in the DOE/NASA Wind Energy Program

    NASA Technical Reports Server (NTRS)

    Neustadter, H. E.

    1980-01-01

    Four categories of data systems, each responding to a distinct information need are presented. The categories are: control, technology, engineering and performance. The focus is on the technology data system which consists of the following elements: sensors which measure critical parameters such as wind speed and direction, output power, blade loads and strains, and tower vibrations; remote multiplexing units (RMU) mounted on each wind turbine which frequency modulate, multiplex and transmit sensor outputs; the instrumentation available to record, process and display these signals; and centralized computer analysis of data. The RMU characteristics and multiplexing techniques are presented. Data processing is illustrated by following a typical signal through instruments such as the analog tape recorder, analog to digital converter, data compressor, digital tape recorder, video (CRT) display, and strip chart recorder.

  12. Data Processing and Analysis Systems for JT-60U

    SciTech Connect

    Matsuda, T.; Totsuka, T.; Tsugita, T.; Oshima, T.; Sakata, S.; Sato, M.; Iwasaki, K.

    2002-09-15

    The JT-60U data processing system is a large computer complex gradually modernized by utilizing progressive computer and network technology. A main computer using state-of-the-art CMOS technology can handle ~550 MB of data per discharge. A gigabit ethernet switch with FDDI ports has been introduced to cope with the increased data volume. Workstation systems with VMEbus serial highway drivers for CAMAC have been developed and used to replace many minicomputer systems. VMEbus-based fast data acquisition systems have also been developed to enlarge and replace a minicomputer system for mass data. The JT-60U data analysis system is composed of a JT-60U database server and a JT-60U analysis server, which are distributed UNIX servers. The experimental database is stored in the 1 TB RAID disk of the JT-60U database server and is composed of ZENKEI and diagnostic databases. Various data analysis tools are available on the JT-60U analysis server. For remote collaboration, technical features of the data analysis system have been applied to the computer system to access JT-60U data via the Internet. Remote participation in JT-60U experiments has been successfully conducted since 1996.

  13. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally, acquisitions initiatives of the NASA STI Program were presented.

  14. A cross-level units-of-analysis approach to individual differences in skill acquisition.

    PubMed

    Eyring, J D; Johnson, D S; Francis, D J

    1993-10-01

    A recent multiple-stage model posits that the individual-difference factors influencing performance vary depending on skill acquisition stage (P. L. Ackerman, 1989, 1990). In the current study, the authors examine the effect of ability in early skill acquisition and extend earlier research by examining the roles of self-efficacy and task familiarity. Furthermore, learning-curve modeling with multilevel models is used to alleviate prior analytical problems. Subjects (N = 115) performed an air traffic control simulation task. Nonlinear learning-curve parameters were estimated for each subject using a negative exponential model (see D. R. Rogosa & J. B. Willett, 1985). Cognitive ability, self-efficacy, and task familiarity were then used to predict learning-curve parameters: learning-rate constant and asymptotic performance. Results revealed that ability, self-efficacy, and familiarity predicted the learning-rate constant, whereas self-efficacy predicted asymptotic performance.
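The negative exponential learning curve referenced in this record can be sketched numerically. The parameter values below are hypothetical, chosen only to illustrate how the learning-rate constant and the asymptote shape predicted performance over practice trials:

```python
import math

def neg_exp_learning_curve(trial, start, asymptote, rate):
    """Predicted performance after `trial` practice trials under a
    negative exponential model: rapid early gains that level off
    toward an individual asymptote."""
    return asymptote - (asymptote - start) * math.exp(-rate * trial)

# Hypothetical learner: initial score 20, asymptote 100, rate constant 0.3
scores = [round(neg_exp_learning_curve(t, 20, 100, 0.3), 1) for t in range(0, 16, 5)]
print(scores)  # → [20.0, 82.1, 96.0, 99.1]
```

In this parameterization the rate constant governs how quickly performance approaches the asymptote, matching the two curve parameters (learning-rate constant and asymptotic performance) that the study predicts from ability, self-efficacy, and familiarity.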

  15. CAPTAN: A hardware architecture for integrated data acquisition, control, and analysis for detector development

    SciTech Connect

    Turqueti, Marcos; Rivera, Ryan A.; Prosser, Alan; Andresen, Jeffry; Chramowicz, John; /Fermilab

    2008-11-01

    The Electronic Systems Engineering Department of the Computing Division at the Fermi National Accelerator Laboratory has developed a data acquisition system flexible and powerful enough to meet the needs of a variety of high energy physics applications. The system described in this paper is called CAPTAN (Compact And Programmable daTa Acquisition Node) and its architecture and capabilities are presented in detail here. The three most important characteristics of this system are flexibility, versatility and scalability. These three main features are supported by key architectural elements: a vertical bus that permits the user to stack multiple boards, a gigabit Ethernet link that permits high-speed communications to the system, and the core group of boards that provide specific capabilities for the system. In this paper, we describe the system architecture, give an overview of its capabilities and point out possible applications.

  16. Empirical Analysis of Effects of Bank Mergers and Acquisitions on Small Business Lending in Nigeria

    NASA Astrophysics Data System (ADS)

    Ita, Asuquo Akabom

    2012-11-01

    Mergers and acquisitions are the major instruments of the recent banking reforms in Nigeria. The effects and implications of the reforms on the lending practices of merged banks to small businesses were considered in this study. These effects were divided into static and dynamic effects (restructuring, direct and external). Data were collected by a cross-sectional research design and were subsequently analyzed by the ordinary least squares (OLS) method. The analyses show that bank size, financial characteristics and deposits of non-merged banks are positively related to small business lending, while for the merged banks the reverse is the case. From the above results, it is evident that mergers and acquisitions have not only a static effect on small business lending but also a dynamic effect. Therefore, given the central position of small businesses in the current government policy on industrialization in Nigeria, policy makers should consider both the static and dynamic effects of mergers and acquisitions on small business lending in their policy thrust.

  17. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    NASA Astrophysics Data System (ADS)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  18. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  19. First Language Acquisition and Teaching

    ERIC Educational Resources Information Center

    Cruz-Ferreira, Madalena

    2011-01-01

    "First language acquisition" commonly means the acquisition of a single language in childhood, regardless of the number of languages in a child's natural environment. Language acquisition is variously viewed as predetermined, wondrous, a source of concern, and as developing through formal processes. "First language teaching" concerns schooling in…

  20. Vygotsky's Analysis of Children's Meaning Making Processes

    ERIC Educational Resources Information Center

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  1. Encapsulation Processing and Manufacturing Yield Analysis

    NASA Technical Reports Server (NTRS)

    Willis, P. B.

    1984-01-01

    The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.

  2. Analysis of physical processes via imaging vectors

    NASA Astrophysics Data System (ADS)

    Volovodenko, V.; Efremova, N.; Efremov, V.

    2016-06-01

    Practically all modeled processes are random to some degree. The most thoroughly developed theoretical foundation covers Markov processes, which can be represented in different forms. A Markov process is a random process that undergoes transitions from one state to another on a state space, where the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. In a Markov process, therefore, the model's prediction of the future does not change when additional information about preceding times becomes available. Modeling physical fields typically involves processes that change in time, i.e. non-stationary processes. In this case, applying the Laplace transformation introduces unjustified complications into the description, whereas a transition to other representations yields an explicit simplification. The method of imaging vectors provides constructive mathematical models and the necessary transitions in the modeling process and the analysis itself. The flexibility of a model built on a polynomial basis permits rapid transformation of the mathematical model and accelerates further analysis. It should be noted that the mathematical description permits an operator representation; conversely, operator representation of the structures, algorithms and data-processing procedures significantly improves the flexibility of the modeling process.
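The Markov property stated in this abstract can be illustrated with a toy two-state chain; the states and transition probabilities below are invented for illustration and do not come from the paper:

```python
import random

# Minimal two-state Markov chain: the next state depends only on the
# current state, never on the earlier history of the walk.
TRANSITIONS = {
    "A": [("A", 0.9), ("B", 0.1)],
    "B": [("A", 0.5), ("B", 0.5)],
}

def step(state, rng):
    """Draw the next state using only the current state's transition row."""
    states, probs = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=probs)[0]

rng = random.Random(0)  # fixed seed for a reproducible walk
path = ["A"]
for _ in range(5):
    path.append(step(path[-1], rng))
print(path)
```

Because `step` consults only `path[-1]`, extending the recorded history changes nothing about the distribution of the next state, which is exactly the memorylessness the abstract describes.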

  3. Determinants of premiums in aerospace mergers and acquisitions: A preliminary analysis

    NASA Astrophysics Data System (ADS)

    Bryant, John K.

    There is a large body of literature on different aspects of premiums as they relate to mergers and acquisitions. However, there is very little literature that specifically discusses the determinants of premiums in aerospace. Few industries have experienced the prolonged consolidation that the aerospace industry has seen. Today, the industry is dominated by a few large firms, but there is still merger activity continuing especially with second-tier firms attempting to secure their future through growth. This paper examines several determinants as applied to 18 aerospace mergers of publicly held companies and divisions from 1991 through April of 2002.

  4. Complete data acquisition and analysis system for low-energy electron-molecule collision studies

    NASA Astrophysics Data System (ADS)

    Nag, Pamir; Nandi, Dhananjay

    2015-09-01

    A complete data acquisition system has been developed that can work with any personal computer irrespective of the operating system installed on it. The software can be used in low- and intermediate-energy electron collision studies with ground-state molecules in the gas phase using a combination of RS-232-, GPIB-, and USB-interfaced devices. Various tabletop instruments and nuclear instrumentation module (NIM)-based electronics have been interfaced and communicate with the software, which is based on LabVIEW. The system was tested with dissociative electron attachment (DEA) and polar dissociation studies of the oxygen molecule and successfully used in DEA studies of carbon monoxide and carbon dioxide.

  5. CPAS Preflight Drop Test Analysis Process

    NASA Technical Reports Server (NTRS)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  6. The Effect of Age of Second Language Acquisition on the Representation and Processing of Second Language Words

    ERIC Educational Resources Information Center

    Silverberg, Stu; Samuel, Arthur G.

    2004-01-01

    In this study, the effects of second language (i.e., L2) proficiency and age of second language acquisition are assessed. Three types of bilinguals are compared: Early L2 learners, Late highly proficient L2 learners, and Late less proficient L2 learners. A lexical decision priming paradigm is used in which the critical trials consist of first…

  7. The Symbolic World of the Bilingual Child: Digressions on Language Acquisition, Culture and the Process of Thinking

    ERIC Educational Resources Information Center

    Nowak-Fabrykowski, Krystyna; Shkandrij, Miroslav

    2004-01-01

    In this paper we explore the relationship between language acquisition, and the construction of a symbolic world. According to Bowers (1989) language is a collection of patterns regulating social life. This conception is close to that of Symbolic Interactionists (Charon, 1989) who see society as made up of interacting individuals who are symbol…

  8. The Influence of Type and Token Frequency on the Acquisition of Affixation Patterns: Implications for Language Processing

    ERIC Educational Resources Information Center

    Endress, Ansgar D.; Hauser, Marc D.

    2011-01-01

    Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…

  9. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  10. SU-C-18C-06: Radiation Dose Reduction in Body Interventional Radiology: Clinical Results Utilizing a New Imaging Acquisition and Processing Platform

    SciTech Connect

    Kohlbrenner, R; Kolli, KP; Taylor, A; Kohi, M; Fidelman, N; LaBerge, J; Kerlan, R; Gould, R

    2014-06-01

    Purpose: To quantify the patient radiation dose reduction achieved during transarterial chemoembolization (TACE) procedures performed in a body interventional radiology suite equipped with the Philips Allura Clarity imaging acquisition and processing platform, compared to TACE procedures performed in the same suite equipped with the Philips Allura Xper platform. Methods: Total fluoroscopy time, cumulative dose area product, and cumulative air kerma were recorded for the first 25 TACE procedures performed to treat hepatocellular carcinoma (HCC) in a Philips body interventional radiology suite equipped with Philips Allura Clarity. The same data were collected for the prior 85 TACE procedures performed to treat HCC in the same suite equipped with Philips Allura Xper. Mean values from these cohorts were compared using two-tailed t tests. Results: Following installation of the Philips Allura Clarity platform, a 42.8% reduction in mean cumulative dose area product (3033.2 versus 1733.6 mGy·cm^2, p < 0.0001) and a 31.2% reduction in mean cumulative air kerma (1445.4 versus 994.2 mGy, p < 0.001) were achieved compared to similar procedures performed in the same suite equipped with the Philips Allura Xper platform. Mean total fluoroscopy time was not significantly different between the two cohorts (1679.3 versus 1791.3 seconds, p = 0.41). Conclusion: This study demonstrates a significant patient radiation dose reduction during TACE procedures performed to treat HCC after a body interventional radiology suite was converted to the Philips Allura Clarity platform from the Philips Allura Xper platform. Future work will focus on evaluation of patient dose reduction in a larger cohort of patients across a broader range of procedures and in specific populations, including obese patients and pediatric patients, and comparison of image quality between the two platforms. Funding for this study was provided by Philips Healthcare, with 5% salary support provided to authors K. Pallav
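The percent reductions reported in this record follow directly from the cohort means it quotes; a quick check of the arithmetic:

```python
def percent_reduction(before, after):
    """Percent decrease from `before` to `after`."""
    return (before - after) / before * 100

# Cohort means from the abstract: dose area product (mGy·cm^2) and air kerma (mGy)
dap = percent_reduction(3033.2, 1733.6)
ak = percent_reduction(1445.4, 994.2)
print(round(dap, 1), round(ak, 1))  # → 42.8 31.2
```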

  11. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  12. Steam Generator Group Project. Progress report on data acquisition/statistical analysis

    SciTech Connect

    Doctor, P.G.; Buchanan, J.A.; McIntyre, J.M.; Hof, P.J.; Ercanbrack, S.S.

    1984-01-01

    A major task of the Steam Generator Group Project (SGGP) is to establish the reliability of the eddy current inservice inspections of PWR steam generator tubing, by comparing the eddy current data to the actual physical condition of the tubes via destructive analyses. This report describes the plans for the computer systems needed to acquire, store and analyze the diverse data to be collected during the project. The real-time acquisition of the baseline eddy current inspection data will be handled using a specially designed data acquisition computer system based on a Digital Equipment Corporation (DEC) PDP-11/44. The data will be archived in digital form for use after the project is completed. Data base management and statistical analyses will be done on a DEC VAX-11/780. Color graphics will be heavily used to summarize the data and the results of the analyses. The report describes the data that will be taken during the project and the statistical methods that will be used to analyze the data. 7 figures, 2 tables.

  13. Observation and analysis of lunar occultations of stars with an emphasis on improvements to data acquisition instrumentation and reduction techniques

    SciTech Connect

    Schneider, G.H.

    1985-01-01

    A program of observation and analysis of lunar occultations was conceived, developed, and carried out using the facilities of the University of Florida's Rosemary Hill Observatory (RHO). The successful implementation of the program required investigation into several related areas. First, after an upgrade to the RHO 76-cm reflecting telescope, a microprocessor-controlled fast photoelectric data acquisition system was designed and built for the occultation data acquisition task. Second, the currently available model-fitting techniques used in the analysis of occultation observations were evaluated. A number of numerical experiments on synthesized and observational data were carried out to improve the performance of the numerical techniques. Among the numerical methods investigated were solution schemes employing partial parametric adjustment, parametric grouping into computational subsets (randomly and on the basis of the correlation coefficients), and preprocessing of the observational data by a number of smoothing techniques for a variety of noise conditions. Third, a turn-key computational software system, incorporating data transfer, reduction, graphics, and display, was developed to carry out all the necessary and related computational tasks in an interactive environment. Twenty-four occultation observations were obtained during the period March 1983 to March 1984.

  14. Selecting the optimal anti-aliasing filter for multichannel biosignal acquisition intended for inter-signal phase shift analysis.

    PubMed

    Keresnyei, Róbert; Megyeri, Péter; Zidarics, Zoltán; Hejjel, László

    2015-01-01

    The availability of microcomputer-based portable devices facilitates high-volume multichannel biosignal acquisition and the analysis of instantaneous oscillations and inter-signal temporal correlations. These new, non-invasively obtained parameters can have considerable prognostic or diagnostic roles. The present study investigates the inherent signal delay of the obligatory anti-aliasing filters. One cycle of each of the 8 electrocardiogram (ECG) and 4 photoplethysmogram signals from healthy volunteers or artificially synthesised series were passed through 100-80-60-40-20 Hz 2-4-6-8th order Bessel and Butterworth filters digitally synthesized by bilinear transformation, which resulted in a negligible error in signal delay compared to the mathematical model of the impulse and step responses of the filters. The investigated filters exhibit signal delays ranging from 2 to 46 ms depending on the filter parameters and the signal slew rate, which is difficult to predict in biological systems and thus difficult to compensate for. The delay magnitude can be comparable to the examined phase shifts, deteriorating the accuracy of the measurement. In conclusion, identical or very similar anti-aliasing filters with lower orders and higher corner frequencies, oversampling, and digital low-pass filtering are recommended for biosignal acquisition intended for inter-signal phase shift analysis. PMID:25514627
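The study's recommendation of higher corner frequencies can be motivated with a much simpler case than its 2nd-8th order Bessel/Butterworth designs: the phase delay of a first-order low-pass filter, used here only as an illustrative stand-in with invented numbers:

```python
import math

def lp1_phase_delay(f_sig, f_corner):
    """Phase delay (seconds) of a first-order low-pass filter at
    frequency f_sig. A simplified stand-in for the higher-order
    Bessel/Butterworth filters examined in the study."""
    phase = math.atan(f_sig / f_corner)  # phase lag in radians
    return phase / (2 * math.pi * f_sig)

# A 10 Hz signal component through 40 Hz vs. 100 Hz corner filters
for fc in (40, 100):
    print(fc, round(lp1_phase_delay(10, fc) * 1000, 2), "ms")
```

Even in this first-order sketch, raising the corner frequency shrinks the delay (here roughly 3.9 ms vs. 1.6 ms), which is the direction of the paper's recommendation for phase-sensitive multichannel acquisition.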

  15. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  16. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128 channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
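The data-rate figure quoted above can be checked directly (assuming 2-byte samples, uncompressed storage, and binary gigabytes):

```python
# Storage rate for the acquisition system described in the abstract:
# 128 channels, 16-bit (2-byte) samples, 20 kHz sampling rate.
channels, sample_bytes, rate_hz = 128, 2, 20_000
bytes_per_hour = channels * sample_bytes * rate_hz * 3600
gib_per_hour = bytes_per_hour / 2**30  # binary gigabytes
print(round(gib_per_hour, 1))  # → 17.2
```

This lands at roughly 17 GB per hour, matching the abstract's estimate.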

  17. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges.

    PubMed

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128 channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507

  18. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods.

    PubMed

    Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein I; Bennett, Malcolm J; Pridmore, Tony P

    2012-06-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.

  19. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  20. A User's Guide to Target Acquisition with STIS (Revision B)

    NASA Astrophysics Data System (ADS)

    Downes, R.; Clampin, M.; Shaw, R.; Baum, S.; Kinney, E.; McGrath, M.

    1997-05-01

    The STIS Instrument Handbook (Chapter 8) presented an overview of target acquisition which represented our best knowledge on the workings and requirements of the STIS acquisition flight software at that time (June 1996). The majority of the information presented in the Handbook remains valid, although there have been some changes to how the acquisition data will be obtained and processed. We have also finalized the Phase II specifications for STIS target acquisition since the Handbook. In order to provide users with the updated information, and to put all the necessary information on STIS target acquisition in one place, we have generated this User's Guide. Note that we anticipate updates to this document based on SMOV results, and these will be communicated via the Space Telescope Analysis Newsletter (STAN) for inclusion during the Phase B (science optimization) processing of your program.

  1. Sustainability Analysis for Products and Processes

    EPA Science Inventory

    Sustainability Analysis for Products and Processes Subhas K. Sikdar National Risk Management Research Laboratory United States Environmental Protection Agency 26 W. M.L. King Dr. Cincinnati, OH 45237 Sikdar.subhas@epa.gov ABSTRACT Claims of both sustainable and unsu...

  2. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
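As background, the kind of SPC method under discussion can be illustrated with a generic Shewhart individuals chart (an illustrative sketch with invented numbers, not Pfadt and Wheeler's data or analysis):

```python
# Generic Shewhart individuals chart: estimate process sigma from the
# average moving range (d2 = 1.128 for subgroups of 2), then flag any
# observation outside the mean +/- 3 sigma control limits.
data = [10, 11, 9, 10, 12, 10, 9, 11, 10, 25, 10, 11]

mean = sum(data) / len(data)
mr = [abs(a - b) for a, b in zip(data[1:], data)]   # moving ranges
sigma_hat = (sum(mr) / len(mr)) / 1.128
ucl, lcl = mean + 3 * sigma_hat, mean - 3 * sigma_hat

out = [x for x in data if not lcl <= x <= ucl]
print(out)
```

Here the single aberrant value 25 is flagged as out of control; an applied behavior analyst would instead inspect the raw time series directly, which is the contrast at issue in the article.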

  3. Interactive Fringe processing algorithm for interferogram analysis

    NASA Astrophysics Data System (ADS)

    Parthiban, V.; Sirohi, Rajpal S.

    A highly flexible algorithm for interferogram processing, which enables the operator to interact with the computer at every stage, is presented. The algorithm, developed on a PDP 11/23 microcomputer, uses Fortran-callable subroutines based on Intellect 100 image processing hardware and a CUB R-G-B monitor, together with a single frame buffer of 512 x 512 x 8 pixels. The software employs a pseudo-colour mapping technique which helps the operator to select optimum threshold values. Manual editing of the processed fringe pattern is also possible, to enable removal of unwanted kinks and connection of any discontinuities. A fringe scanning subroutine numbers the fringes and stores the peak coordinates in a data file for fringe analysis. The algorithm is employed for the analysis of an interferogram obtained from an inverting interferometer, and the results are presented.
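The threshold-and-scan step described above can be sketched in modern terms (a NumPy illustration of the idea only; the original used Fortran routines on Intellect 100 hardware, and the threshold was chosen interactively via the pseudo-colour display):

```python
import numpy as np

# Sketch of threshold-based fringe detection on a noise-free synthetic
# interferogram: threshold the intensity, then take the centre of each
# contiguous bright run as a fringe peak coordinate.
x = np.linspace(0, 6 * np.pi, 512)
intensity = 0.5 + 0.5 * np.cos(x)        # synthetic cosine fringes

threshold = 0.5                          # fixed here; interactive in the paper
bright = intensity > threshold

# Locate starts and ends of contiguous bright runs.
padded = np.concatenate(([False], bright, [False]))
d = np.diff(padded.astype(int))
starts, ends = np.flatnonzero(d == 1), np.flatnonzero(d == -1)
peaks = (starts + ends - 1) // 2         # run centres = fringe positions

print(len(peaks), "fringes; peak indices:", peaks)
```

A real interferogram would also need the manual-editing pass the abstract mentions (removing kinks, reconnecting broken fringes) before the scan.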

  4. Exergy analysis of nutrient recovery processes.

    PubMed

    Hellström, D

    2003-01-01

    In an exergy analysis, the actual consumption of resources in physical and chemical processes is calculated. Energy and chemical elements are not consumed in the processes; they are only transformed into other forms of lower quality. The principles of exergy analysis are illustrated by comparing different wastewater treatment systems for nutrient recovery. One system represents an end-of-pipe structure, whereas other systems include source separation of grey water, black water, and urine. The exergy flows analysed in this paper are those related to management and treatment of organic matter and nutrients. The study shows that the total exergy consumption is lowest for the system with source separation of urine and faeces and greatest for the conventional wastewater treatment system complemented by processes for nutrient recovery.
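The "quality" notion behind exergy can be made explicit. For a steady flow stream relative to a reference environment at temperature T_0, the standard textbook expression (not taken from the paper itself) is:

```latex
% Physical (flow) exergy of a stream, with enthalpy H and entropy S
% evaluated at the stream state and at the reference state (H_0, S_0):
Ex = (H - H_0) - T_0 \, (S - S_0)
```

Unlike energy, exergy is destroyed by irreversibilities, which is what makes it a meaningful basis for comparing the treatment systems in the study.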

  5. Parallel processing in finite element structural analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1987-01-01

    A brief review is made of the fundamental concepts and basic issues of parallel processing. Discussion focuses on parallel numerical algorithms, performance evaluation of machines and algorithms, and parallelism in finite element computations. A computational strategy is proposed for maximizing the degree of parallelism at different levels of the finite element analysis process including: 1) formulation level (through the use of mixed finite element models); 2) analysis level (through additive decomposition of the different arrays in the governing equations into the contributions to a symmetrized response plus correction terms); 3) numerical algorithm level (through the use of operator splitting techniques and application of iterative processes); and 4) implementation level (through the effective combination of vectorization, multitasking and microtasking, whenever available).
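Element-level parallelism, the simplest of the strategies surveyed, can be sketched generically (an illustrative example, not the paper's mixed-model or operator-splitting formulations):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Sketch of element-level parallelism in finite element assembly:
# element stiffness matrices are computed independently in parallel,
# then summed serially into the global matrix (avoiding write races).
n_elem = 8                                  # 1D bar, unit-stiffness elements
k_local = np.array([[1.0, -1.0], [-1.0, 1.0]])

def element_stiffness(e):
    # A real code would integrate over element e; here every element
    # is identical, so we simply return a copy of the local matrix.
    return e, k_local.copy()

K = np.zeros((n_elem + 1, n_elem + 1))
with ThreadPoolExecutor() as pool:
    for e, ke in pool.map(element_stiffness, range(n_elem)):
        K[e:e + 2, e:e + 2] += ke

print("global stiffness matrix:", K.shape)
```

The compute phase scales with the number of workers; the serial scatter-add is the kind of bottleneck the paper's additive-decomposition and operator-splitting ideas aim to break up.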

  6. Qualitative Analysis for Maintenance Process Assessment

    NASA Technical Reports Server (NTRS)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  7. Impact of Personal Relevance on Acquisition and Generalization of Script Training for Aphasia: A Preliminary Analysis

    PubMed Central

    Kaye, Rosalind C.; Lee, Jaime B.; van Vuuren, Sarel

    2015-01-01

    Purpose The importance of personalization in script training in aphasia has been assumed but never tested. This study compared acquisition and generalization of personally relevant versus generic words or phrases appearing in the same scripts. Method Eight individuals (6 men; 2 women) with chronic aphasia received 3 weeks of intensive computer-based script training. For each participant, 2 scripts, a trained and an untrained generalization script, were embedded with 4 personally relevant word choices and 2–4 generic items that were similar across participants. Scripts were probed for accuracy at baseline and posttreatment. Significance testing was conducted on baseline and posttreatment scores, and for gains in personally relevant versus generic items. Effect sizes were computed. Results Both personally relevant and generic items improved significantly on trained scripts. Improvements on untrained scripts were smaller, with only personally relevant items reaching significance. There was no significant difference on gains made on personalized versus generic items for trained scripts (p = .059), but the effect size was large (d = 0.90). For generalization scripts, this effect was small (d = 0.25) and nonsignificant. Conclusions Personally relevant words and phrases were acquired, although not generalized, more successfully than generic words and phrases. The data support the importance of personalization in script training, but the degree of that importance requires further investigation. PMID:26340806
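The reported effect sizes follow the usual Cohen's d convention; a generic sketch with invented gain scores (not the study's data):

```python
import math

# Generic Cohen's d for gain scores (invented numbers, illustrative
# only): mean gain divided by the standard deviation of the gains.
gains = [12.0, 8.0, 15.0, 9.0, 11.0, 7.0, 14.0, 10.0]

n = len(gains)
mean = sum(gains) / n
sd = math.sqrt(sum((g - mean) ** 2 for g in gains) / (n - 1))  # sample SD
d = mean / sd
print(f"d = {d:.2f}")
```

By the usual rule of thumb, d around 0.2 is small and 0.8 or more is large, which is how the study's d = 0.90 (trained) versus d = 0.25 (generalization) contrast is read.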

  8. Uncertainty-Based Approach for Dynamic Aerodynamic Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Heim, Eugene H. D.; Bandon, Jay M.

    2004-01-01

    Development of improved modeling methods to provide increased fidelity of flight predictions for aircraft motions during flight in flow regimes with large nonlinearities requires improvements in test techniques for measuring and characterizing wind tunnel data. This paper presents a method for providing a measure of data integrity for static and forced oscillation test techniques. Data integrity is particularly important when attempting to accurately model and predict flight of today's high performance aircraft which are operating in expanded flight envelopes, often maneuvering at high angular rates at high angles of attack, even above maximum lift. Current aerodynamic models are inadequate in predicting flight characteristics in the expanded envelope, such as rapid aircraft departures and other unusual motions. Present wind tunnel test methods do not factor changes of flow physics into data acquisition schemes, so in many cases data are obtained over more iterations than required, or insufficient data may be obtained to determine a valid estimate with statistical significance. Additionally, forced oscillation test techniques, one of the primary tools used to develop dynamic models, do not currently provide estimates of the uncertainty of the results during an oscillation cycle. A method to optimize the required number of forced oscillation cycles based on decay of uncertainty gradients and balance tolerances is also presented.
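One simple realization of the closing idea, stopping acquisition once the uncertainty stops improving, is to sample until the standard error of the mean falls below a tolerance (an illustrative sketch with synthetic data, not the paper's method):

```python
import random
import statistics

# Uncertainty-driven acquisition sketch: keep taking synthetic
# measurements until the standard error of the mean (SEM) drops
# below a preset tolerance, with a minimum sample count.
random.seed(1)
tolerance = 0.05
samples = []

while True:
    samples.append(random.gauss(1.0, 0.3))   # synthetic measurement
    if len(samples) >= 10:                   # guard against early stops
        sem = statistics.stdev(samples) / len(samples) ** 0.5
        if sem < tolerance:
            break

print(f"stopped after {len(samples)} samples (SEM = {sem:.3f})")
```

In the paper's setting the stopping signal is the decay of uncertainty gradients against balance tolerances rather than a plain SEM, but the acquisition logic is of this shape.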

  9. Integrating microprobe laboratory automation with high speed data acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Moloney, G. R.; O'Brien, P. M.; Saint, A.; Witham, L.; Sakalleriou, A.; Bettiol, A.; Legge, G. J. F.

    1995-09-01

    We are developing a laboratory and beam line control system. The MpControl system utilises all the flexibility and power of a computer network. Target stages, power supplies, Faraday cups, beam monitors, event counters, etc. can all be controlled by any of a number of computers in the laboratory. Software on each of these computers allows control, monitoring and display of the state of the beam line, accelerator and target manipulation stage. The entire system may be simultaneously controlled from any computer terminal on the network. The system has the potential to allow a user to manipulate the target stage from the beam control room, or to adjust accelerator parameters from the target chamber at the end of the beam line. The system has been designed to be easily transportable across computer platforms, currently with support for UNIX, X-Windows and MS-DOS. We believe this is a critical factor in a world of rapidly advancing computer and instrumentation hardware systems. The system has been designed to integrate with the MpSys data acquisition system.

  10. Analysis of rocket engine injection combustion processes

    NASA Technical Reports Server (NTRS)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and application of the models to correlation of well documented hot fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid-propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  11. A pathway analysis of global aerosol processes

    NASA Astrophysics Data System (ADS)

    Schutgens, N. A. J.; Stier, P.

    2014-06-01

    We present a detailed budget of the changes in atmospheric aerosol mass and numbers due to various processes: emission, nucleation, coagulation, H2SO4 condensation and in-cloud production, ageing and deposition. The budget is created from monthly-averaged tracer tendencies calculated by the global aerosol model ECHAM5.5-HAM2 and allows us to investigate process contributions at various length- and time-scales. As a result, we show in unprecedented detail what processes drive the evolution of aerosol. In particular, we show that the processes that affect aerosol masses are quite different from those affecting aerosol numbers. Condensation of H2SO4 gas onto pre-existing particles is an important process, dominating the growth of small particles in the nucleation mode to the Aitken mode and the ageing of hydrophobic matter. Together with in-cloud production of H2SO4, it significantly contributes to (and often dominates) the mass burden (and hence composition) of the hydrophilic Aitken and accumulation mode particles. Particle growth itself is the leading source of number densities in the hydrophilic Aitken and accumulation modes, with their hydrophobic counterparts contributing (even locally) relatively little. As expected, the coarse mode is dominated by primary emissions and mostly decoupled from the smaller modes. Our analysis also suggests that coagulation serves mainly as a loss process for number densities and that, relative to other processes, it is a rather unimportant contributor to composition changes of aerosol. The analysis is extended with sensitivity studies where the impact of a lower model resolution or pre-industrial emissions is shown to be small. We discuss the use of the current budget for model simplification, prioritisation of model improvements, identification of potential structural model errors and model evaluation against observations.
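The budget construction described, summing per-process tendencies and comparing their magnitudes, can be sketched generically (invented numbers, not ECHAM5.5-HAM2 output):

```python
# Generic process budget sketch for one aerosol mode: monthly-mean
# mass tendencies per process (sources > 0, sinks < 0), with each
# contribution expressed as a fraction of total turnover.
tendencies = {                        # kg m-3 s-1, illustrative values
    "emission": 4.0e-12,
    "condensation": 2.5e-12,
    "in-cloud production": 1.5e-12,
    "coagulation": -0.5e-12,
    "deposition": -7.0e-12,
}

turnover = sum(abs(t) for t in tendencies.values())
for process, t in sorted(tendencies.items(), key=lambda kv: -abs(kv[1])):
    print(f"{process:20s} {100 * abs(t) / turnover:5.1f} % of turnover")

net = sum(tendencies.values())        # residual growth/decline of the mode
print(f"net tendency: {net:+.2e} kg m-3 s-1")
```

Ranking processes by their share of turnover, per mode and per region, is what lets the paper identify dominant pathways and candidates for model simplification.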

  12. A pathway analysis of global aerosol processes

    NASA Astrophysics Data System (ADS)

    Schutgens, N. A. J.; Stier, P.

    2014-11-01

    We present a detailed budget of the changes in atmospheric aerosol mass and numbers due to various processes: emission (including instant condensation of soluble biogenic emissions), nucleation, coagulation, H2SO4 condensation and in-cloud production, aging and deposition. The budget is created from monthly averaged tracer tendencies calculated by the global aerosol model ECHAM5.5-HAM2 and allows us to investigate process contributions at various length-scales and timescales. As a result, we show in unprecedented detail what processes drive the evolution of aerosol. In particular, we show that the processes that affect aerosol masses are quite different from those that affect aerosol numbers. Condensation of H2SO4 gas onto pre-existing particles is an important process, dominating the growth of small particles in the nucleation mode to the Aitken mode and the aging of hydrophobic matter. Together with in-cloud production of H2SO4, it significantly contributes to (and often dominates) the mass burden (and hence composition) of the hydrophilic Aitken and accumulation mode particles. Particle growth itself is the leading source of number densities in the hydrophilic Aitken and accumulation modes, with their hydrophobic counterparts contributing (even locally) relatively little. As expected, the coarse mode is dominated by primary emissions and mostly decoupled from the smaller modes. Our analysis also suggests that coagulation serves mainly as a loss process for number densities and that, relative to other processes, it is a rather unimportant contributor to composition changes of aerosol. The analysis is extended with sensitivity studies where the impact of a lower model resolution or pre-industrial emissions is shown to be small. We discuss the use of the current budget for model simplification, prioritization of model improvements, identification of potential structural model errors and model evaluation against observations.

  13. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because in either case data is digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequencies from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth survey remote sensing through the atmosphere, visible and infrared frequency bands are being used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are identical and similar, respectively. The resulting data products are similar.

  14. Brief Experimental Analysis of Sight Word Interventions: A Comparison of Acquisition and Maintenance of Detected Interventions

    ERIC Educational Resources Information Center

    Baranek, Amy; Fienup, Daniel M.; Pace, Gary

    2011-01-01

    The purpose of this study was to examine utility of a brief experimental analysis (BEA) in determining effective sight word interventions for a student with a history of difficulty with acquiring sight word recognition. Ten interventions were compared in a BEA. Following the BEA, an extended analysis was conducted that compared the two most…

  15. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    NASA Technical Reports Server (NTRS)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar, larger ground minicomputer system with favorable results.
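The local digital filtering mentioned can be illustrated with a generic single-pole low-pass filter (an assumption for illustration; the RIU's actual filter design is not specified in the abstract):

```python
# Generic single-pole low-pass (exponential moving average) filter of
# the kind a remote acquisition unit might apply to a sensor channel
# before sending samples to the host.
def lowpass(samples, alpha=0.2):
    """y[n] = y[n-1] + alpha * (x[n] - y[n-1]); smaller alpha = heavier smoothing."""
    y, out = samples[0], []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]   # alternating "noise"
print([round(v, 3) for v in lowpass(noisy)])
```

Running the filter on the DSP keeps high-frequency noise off the 1553B bus, so the host receives pre-conditioned data for the control law computation.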

  16. An Experimental Analysis of Memory Processing

    ERIC Educational Resources Information Center

    Wright, Anthony A.

    2007-01-01

    Rhesus monkeys were trained and tested in visual and auditory list-memory tasks with sequences of four travel pictures or four natural/environmental sounds followed by single test items. Acquisitions of the visual list-memory task are presented. Visual recency (last item) memory diminished with retention delay, and primacy (first item) memory…

  17. Physical Data Acquisition and Analysis of Possible Flying Extraterrestrial Probes by using Opto-Electronic Devices

    NASA Astrophysics Data System (ADS)

    Teodorani, M.

    2000-02-01

    A technical research project regarding the search for evidence of the extraterrestrial origin of UFO phenomena is proposed. After showing the main results from the analysis of an earlier Norwegian instrumental project, specific monitoring techniques and strategies based on magnetometers, radio spectrum analyzers and radar-assisted sensors for the detection and analysis of UFO optical and infrared light are presented together with calculations of exposure times for optical observations. Physical parameters which are expected to be determinable from subsequent data analysis are described in detail. Finally, crucial tests in order to prove or confute a non-natural origin of the UFO phenomenon are proposed and discussed.

  18. Particle size determination using TEM: a discussion of image acquisition and analysis for the novice microscopist.

    PubMed

    Pyrz, William D; Buttrey, Douglas J

    2008-10-21

    As nanoparticle synthesis capabilities advance, there is an increasing need for reliable nanoparticle size distribution analysis. Transmission electron microscopy (TEM) can be used to directly image nanoparticles at scales approaching a single atom. However, the advantage gained by being able to "see" these nanoparticles comes with several tradeoffs that must be addressed and balanced. For effective nanoparticle characterization, the proper selection of imaging type (bright vs dark field), magnification, and analysis method (manual vs automated) is critical. These decisions control the measurement resolution, the contrast between the particle and background, the number of particles in each image, the subsequent analysis efficiency, and the proper determination of the particle-background boundary and affect the significance of electron beam damage to the sample. In this work, the relationship between the critical decisions required for TEM analysis of small nanoparticles and the statistical effects of these factors on the resulting size distribution is presented.
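The automated-analysis path described, thresholding, labelling, and sizing, can be sketched as follows (a generic illustration on a synthetic image, not the authors' workflow; the pixel calibration is an assumed value):

```python
import numpy as np
from scipy import ndimage

# Sketch of automated particle sizing: threshold the image, label
# connected particles, convert pixel areas to equivalent-circle
# diameters using an assumed nm-per-pixel calibration.
img = np.zeros((40, 40))
img[5:10, 5:10] = 1.0            # two synthetic "particles" on a
img[20:28, 20:28] = 1.0          # dark background

binary = img > 0.5                           # particle/background boundary
labels, n = ndimage.label(binary)            # connected-component labelling
areas = ndimage.sum(binary, labels, index=range(1, n + 1))  # pixels each

nm_per_pixel = 0.5                           # assumed calibration
diameters = 2 * np.sqrt(areas / np.pi) * nm_per_pixel
print(n, "particles; diameters (nm):", np.round(diameters, 2))
```

The threshold choice here stands in for the particle-boundary determination the article identifies as critical; in practice it dominates the resulting size distribution.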

  19. A high speed data acquisition system for the analysis of velocity, density, and total temperature fluctuations at transonic speeds

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.; Jones, Gregory S.; Stainback, P. Calvin

    1988-01-01

    The use of a high-speed Dynamic Data Acquisition System (DDAS) to measure simultaneously velocity, density, and total temperature fluctuations is described. The DDAS is used to automate the acquisition of hot-wire calibration data. The data acquisition, data handling, and data reporting techniques used by DDAS are described. Sample data are used to compare results obtained with the DDAS with those obtained from the FM tape and post-test digitization method.

  20. Iodine-filter-based mobile Doppler lidar to make continuous and full-azimuth-scanned wind measurements: data acquisition and analysis system, data retrieval methods, and error analysis.

    PubMed

    Wang, Zhangjun; Liu, Zhishen; Liu, Liping; Wu, Songhua; Liu, Bingyi; Li, Zhigang; Chu, Xinzhao

    2010-12-20

    An incoherent Doppler wind lidar based on iodine edge filters has been developed at the Ocean University of China for remote measurements of atmospheric wind fields. The lidar is compact enough to fit in a minivan for mobile deployment. With its sophisticated and user-friendly data acquisition and analysis system (DAAS), this lidar has made a variety of line-of-sight (LOS) wind measurements in different operational modes. Through carefully developed data retrieval procedures, various wind products are provided by the lidar, including wind profile, LOS wind velocities in plan position indicator (PPI) and range height indicator (RHI) modes, and sea surface wind. Data are processed and displayed in real time, and continuous wind measurements have been demonstrated for as many as 16 days. Full-azimuth-scanned wind measurements in PPI mode and full-elevation-scanned wind measurements in RHI mode have been achieved with this lidar. The detection range of LOS wind velocity PPI and RHI reaches 8-10 km at night and 6-8 km during daytime with range resolution of 10 m and temporal resolution of 3 min. In this paper, we introduce the DAAS architecture and describe the data retrieval methods for various operation modes. We present the measurement procedures and results of LOS wind velocities in PPI and RHI scans along with wind profiles obtained by Doppler beam swing. The sea surface wind measured for the sailing competition during the 2008 Beijing Olympics is also presented. The precision and accuracy of wind measurements are estimated through analysis of the random errors associated with photon noise and the systematic errors introduced by the assumptions made in data retrieval. The three assumptions of horizontal homogeneity of atmosphere, close-to-zero vertical wind, and uniform sensitivity are made in order to experimentally determine the zero wind ratio and the measurement sensitivity, which are important factors in LOS wind retrieval. Deviations may occur under certain
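The Doppler beam swing retrieval mentioned above can be sketched generically: each LOS velocity is the projection of the wind vector onto the beam direction, and least squares inverts the system (an illustrative sketch, not the lidar's DAAS code):

```python
import numpy as np

# Doppler beam swing (DBS) sketch: four beams at fixed elevation and
# 90-degree azimuth steps. Each LOS velocity is the projection of the
# wind vector (u, v, w) onto the beam's unit direction.
az = np.radians([0.0, 90.0, 180.0, 270.0])   # azimuths (from north)
el = np.radians(70.0)                        # common elevation angle

# Beam unit vectors in (east, north, up) components.
A = np.column_stack([np.sin(az) * np.cos(el),
                     np.cos(az) * np.cos(el),
                     np.full(az.shape, np.sin(el))])

true_wind = np.array([5.0, -3.0, 0.1])       # synthetic (u, v, w), m/s
v_los = A @ true_wind                        # simulated LOS measurements

u, v, w = np.linalg.lstsq(A, v_los, rcond=None)[0]
print("retrieved wind:", np.round([u, v, w], 3))
```

The inversion assumes horizontal homogeneity across the scanned volume, which is exactly the first of the three retrieval assumptions whose error contribution the paper analyzes.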

  1. Acquisition strategies

    SciTech Connect

    Zimmer, M.J.; Lynch, P.W.

    1993-11-01

    Acquiring projects takes careful planning, research and consideration. Picking the right opportunities and avoiding the pitfalls will lead to a more valuable portfolio. This article describes the steps to take in evaluating an acquisition and what items need to be considered in an evaluation.

  2. Automated analysis for lifecycle assembly processes

    SciTech Connect

    Calton, T.L.; Brown, R.G.; Peters, R.R.

    1998-05-01

    Many manufacturing companies today expend more effort on upgrade and disposal projects than on clean-slate design, and this trend is expected to become more prevalent in coming years. However, commercial CAD tools are better suited to initial product design than to the product's full life cycle. Computer-aided analysis, optimization, and visualization of life cycle assembly processes based on the product CAD data can help ensure accuracy and reduce effort expended in planning these processes for existing products, as well as provide design-for-lifecycle analysis for new designs. To be effective, computer-aided assembly planning systems must allow users to express the plan selection criteria that apply to their companies and products as well as to the life cycles of their products. Designing a product for easy assembly and disassembly throughout its entire life cycle, for purposes including service, field repair, upgrade, and disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and its intended life cycle. Different goals and constraints (compared to initial assembly) require one to revisit the significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of issues in assembly planning or applied studies of life cycle assembly processes, which give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes.

  3. Human movement analysis with image processing in real time

    NASA Astrophysics Data System (ADS)

    Fauvet, Eric; Paindavoine, Michel; Cannard, F.

    1991-04-01

    In the field of the human sciences, many applications need to know the kinematic characteristics of human movements. Psychology associates these characteristics with control mechanisms; sport science and biomechanics associate them with the performance of the athlete or the patient, so that trainers or doctors can correct the subject's gesture to obtain a better performance once the motion properties are known. Roberton's studies show the evolution of children's motion. Several investigation methods are able to measure human movement, but most current studies are based on image processing. Often the systems work at the TV standard (50 frames per second), which permits the study of only very slow gestures. A human operator analysing the digitized film sequence manually makes the operation very expensive, especially long, and imprecise. On these grounds, many human movement analysis systems have been implemented. They consist of markers, which are fixed to the anatomically interesting points on the subject in motion, and image compression, which is the art of coding picture data. Generally the compression is limited to calculating the centroid coordinates of each marker. These systems differ from one another in image acquisition and marker detection.

  4. A pathway analysis of global aerosol processes

    NASA Astrophysics Data System (ADS)

    Schutgens, Nick; Stier, Philip

    2014-05-01

    As expected, the coarse mode is dominated by primary emissions and mostly decoupled from the smaller modes. Our analysis also suggests that coagulation serves mainly as a loss process for number densities and that it is a relatively unimportant contributor to composition changes of aerosol. Our results provide an objective way of complexity analysis in a global aerosol model and will be used in future work where we will reduce this complexity in ECHAM-HAM.

  5. Data Acquisition and Mass Storage

    NASA Astrophysics Data System (ADS)

    Vande Vyvre, P.

    2004-08-01

    The experiments performed at supercolliders will constitute a new challenge in several disciplines of High Energy Physics and Information Technology. This will definitely be the case for data acquisition and mass storage. The microelectronics, communication, and computing industries are maintaining an exponential increase of the performance of their products. The market of commodity products remains the largest and the most competitive market of technology products. This constitutes a strong incentive to use these commodity products extensively as components to build the data acquisition and computing infrastructures of the future generation of experiments. The present generation of experiments in Europe and in the US already constitutes an important step in this direction. The experience acquired in the design and the construction of the present experiments has to be complemented by a large R&D effort executed with good awareness of industry developments. The future experiments will also be expected to follow major trends of our present world: deliver physics results faster and become more and more visible and accessible. The present evolution of the technologies and the burgeoning of GRID projects indicate that these trends will be made possible. This paper includes a brief overview of the technologies currently used for the different tasks of the experimental data chain: data acquisition, selection, storage, processing, and analysis. The major trends of the computing and networking technologies are then indicated with particular attention paid to their influence on the future experiments. Finally, the vision of future data acquisition and processing systems and their promise for future supercolliders is presented.

  6. Eco-Efficiency Analysis of biotechnological processes.

    PubMed

    Saling, Peter

    2005-07-01

    Eco-Efficiency has been variously defined and analytically implemented by several workers. In most cases, Eco-Efficiency is taken to mean the ecological optimization of overall systems without disregarding economic factors: it should increase the positive ecological performance of a commercial company in relation to economic value creation, or reduce negative effects. Several companies use Eco-Efficiency Analysis for decision-making processes, and industrial examples of best practices in developing and implementing Eco-Efficiency have been reviewed. They clearly demonstrate the environmental and business benefits of Eco-Efficiency. An instrument for the early recognition and systematic detection of economic and environmental opportunities and risks for production processes in the chemical industry came into use in 1997; new features developed since then have led to many examples. This powerful Eco-Efficiency Analysis allows a feasibility evaluation of existing and future business activities and is applied by BASF. In many cases, decision-makers are able to choose among alternative processes for making a product.

  7. Bone feature analysis using image processing techniques.

    PubMed

    Liu, Z Q; Austin, T; Thomas, C D; Clement, J G

    1996-01-01

    In order to establish the correlation between bone structure and age, and to obtain information about age-related bone changes, it is necessary to study microstructural features of human bone. Traditionally, in bone biology and forensic science, the analysis of bone cross-sections has been carried out manually. Such a process is known to be slow, inefficient and prone to human error. Consequently, the results obtained so far have been unreliable. In this paper we present a new approach to quantitative analysis of cross-sections of human bones using digital image processing techniques. We demonstrate that such a system is able to extract various bone features consistently and is capable of providing more reliable data and statistics for bones. Consequently, we will be able to correlate features of bone microstructure with age and possibly also with age-related bone diseases such as osteoporosis. The development of knowledge-based computer vision systems for automated bone image analysis can now be considered feasible.

  8. Preliminary Hazards Analysis Plasma Hearth Process

    SciTech Connect

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  9. Quantitative analysis of geomorphic processes using satellite image data at different scales

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is less than 200 m in x-to-y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  10. Search strategy effects on PN acquisition performance. [Pseudonoise

    NASA Technical Reports Server (NTRS)

    Weinberg, A.

    1981-01-01

    The present paper focuses on 'random' and 'expanding window' PN acquisition search strategies and analytically develops the PN acquisition time statistics as functions of salient system parameters such as prediction SNR, detection and false alarm probabilities, and a priori information on epoch location. The significance of this analysis is its general applicability to arbitrary postdetection processing schemes. Computed performance results account for the above salient parameters, wherein sequential detection is employed in conjunction with random and selected expanding window search strategies.

  11. Automated knowledge acquisition for second generation knowledge base systems: A conceptual analysis and taxonomy

    SciTech Connect

    Williams, K.E.; Kotnour, T.

    1991-12-31

    In this paper, we present a conceptual analysis of knowledge-base development methodologies. The purpose of this research is to help overcome the high cost and lack of efficiency in developing knowledge base representations for artificial intelligence applications. To accomplish this purpose, we analyzed the available methodologies and developed a knowledge-base development methodology taxonomy. We review manual, machine-aided, and machine-learning methodologies. A set of developed characteristics allows description and comparison among the methodologies. We present the results of this conceptual analysis of methodologies and recommendations for development of more efficient and effective tools.

  13. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    SciTech Connect

    Y. SUN

    2004-09-29

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P&CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P&CE (BSC 2004 [DIRS 169860

  14. Methodological Reflections on Gesture Analysis in Second Language Acquisition and Bilingualism Research

    ERIC Educational Resources Information Center

    Gullberg, Marianne

    2010-01-01

    Gestures, i.e. the symbolic movements that speakers perform while they speak, form a closely interconnected system with speech, where gestures serve both addressee-directed ("communicative") and speaker-directed ("internal") functions. This article aims (1) to show that a combined analysis of gesture and speech offers new ways to address…

  15. Mathematical Analysis and Optimization of Infiltration Processes

    NASA Technical Reports Server (NTRS)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  16. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156
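    As background for the discussion above: the core SPC tool at issue is the Shewhart control chart, which flags observations falling outside limits set at the process mean plus or minus three standard deviations. A minimal sketch, assuming an individuals chart with limits fitted to a baseline period (the function names and 3-sigma rule shown here are a textbook illustration, not anything from the paper itself):

    ```python
    # Minimal Shewhart individuals chart: flag points outside mean +/- 3*sigma.
    # Illustrative sketch only; the paper argues about when such charts are
    # appropriate, not about this particular implementation.
    from statistics import mean, pstdev

    def control_limits(baseline):
        """Lower limit, center line, and upper limit from baseline data."""
        center = mean(baseline)
        sigma = pstdev(baseline)
        return center - 3 * sigma, center, center + 3 * sigma

    def out_of_control(samples, baseline):
        """Indices of samples falling outside limits fitted to the baseline."""
        lcl, _, ucl = control_limits(baseline)
        return [i for i, x in enumerate(samples) if x < lcl or x > ucl]
    ```

    For example, with a stable baseline around 10, a later value of 14 or 20 would be flagged as a special-cause signal while 9 would not.
    
    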

  17. The Effect of Data Acquisition-Probeware and Digital Video Analysis on Accurate Graphical Representation of Kinetics in a High School Physics Class

    ERIC Educational Resources Information Center

    Struck, William; Yerrick, Randy

    2010-01-01

    The effects of two well-established microcomputer-based teaching methods were examined for their effect on teaching high school students kinetics. The use of data acquisition probeware and digital video analysis were studied for their impact on students' conceptions and ability to interpret graphical relationships to real world…

  18. A Cost Analysis of a State-Level, Multi-Option Procedure for the Acquisition of Competency-Based Instructional Materials.

    ERIC Educational Resources Information Center

    Massey, Romeo M.

    To assist the Division of Vocational Education (DVE) of the Florida State Department of Education, the author conducted a cost analysis of state-level models for acquiring competency-based instructional materials in vocational education. Two DVE-developed models of materials acquisition were examined: a centralized, single-option model in which…

  19. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1436.602-5 Short selection processes... shall be obtained prior to the utilization of either of the short selection processes used for...

  20. Data Acquisition System for Instructional Spectroscopes

    NASA Astrophysics Data System (ADS)

    Almeida, C. B. S. B.; Hetem, A.

    2014-10-01

    This article presents data acquisition software, developed in a scientific initiation program (IC), for use with a spectrometer built by students. The program was written in C++, a language in wide use today. The spectra come from a simplified, rustic spectroscope: essentially a light-tight box with a slit in one side, a diffraction medium, and a camera for data acquisition. After image acquisition, the data are processed, followed by the usual reduction and analysis steps for this type of instrument. We have implemented a method for calibrating the spectroscope, through which the incident photons can be related to the characteristic wavelength of each monochromatic line. The final result is a one-dimensional spectrum that can subsequently be analyzed.
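    The reduction step described above can be sketched in two pieces: collapse the camera image into a one-dimensional spectrum, then map pixel positions to wavelengths via a calibration fitted from known reference lines. A hypothetical sketch (the function names, the column-sum extraction, and the two-point linear calibration are illustrative assumptions, not the authors' C++ code):

    ```python
    # Collapse a 2-D spectroscope image into a 1-D spectrum, then calibrate
    # pixel position against wavelength using two known reference lines.

    def collapse_to_spectrum(image):
        """Sum intensities down each column of the image (rows of pixel values)."""
        return [sum(col) for col in zip(*image)]

    def linear_calibration(pix1, wl1, pix2, wl2):
        """Return a pixel->wavelength function from two reference lines."""
        slope = (wl2 - wl1) / (pix2 - pix1)
        return lambda pix: wl1 + slope * (pix - pix1)
    ```

    With two identified lines (say, at pixels 0 and 100 with known wavelengths), every other pixel of the collapsed spectrum gets a wavelength by interpolation; a real instrument would use more lines and check linearity.
    
    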

  1. Description of a Portable Wireless Device for High-Frequency Body Temperature Acquisition and Analysis

    PubMed Central

    Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau

    2009-01-01

    We describe a device for dual-channel body temperature monitoring. The device can operate as a real-time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility. PMID:22408473

  2. The design and instrumentation of the Purdue annular cascade facility with initial data acquisition and analysis

    NASA Technical Reports Server (NTRS)

    Stauter, R. C.; Fleeter, S.

    1982-01-01

    Three dimensional aerodynamic data, required to validate and/or indicate necessary refinements to inviscid and viscous analyses of the flow through turbomachine blade rows, are discussed. Instrumentation and capabilities for pressure measurement, probe insertion and traversing, and flow visualization are reviewed. Advanced measurement techniques including Laser Doppler Anemometers, are considered. Data processing is reviewed. Predictions were correlated with the experimental data. A flow visualization technique using helium filled soap bubbles was demonstrated.

  3. Comparative analysis of two systems for unobtrusive heart signal acquisition and characterization.

    PubMed

    Postolache, Octavian A; Girão, Pedro S; Postolache, Gabriela

    2013-01-01

    In this paper we describe and compare methods of heart rate estimation from cardiac signals acquired with EMFIT, FMCW Doppler radar, and Finapres-based technology in the same context, and briefly investigate their similarities and differences. Processing of the acquired cardiac signal for accurate peak detection using the Wavelet Transform is also described. The results suggest good reliability of the two implemented unobtrusive systems for heart rate estimation. PMID:24111361
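    The wavelet-based peak detection mentioned above can be sketched as a single-scale continuous wavelet transform with a Ricker (Mexican-hat) kernel followed by local-maximum picking. This is a simplified illustration under assumed parameters (one fixed scale, a fixed threshold), not the authors' implementation:

    ```python
    import math

    def ricker(t, a):
        """Ricker (Mexican-hat) wavelet, unnormalized, at lag t for scale a."""
        return (1 - (t / a) ** 2) * math.exp(-t * t / (2 * a * a))

    def cwt_peaks(signal, scale=3.0, threshold=0.5):
        """Correlate the signal with a Ricker wavelet at one scale and return
        indices of local maxima of the response above the threshold."""
        half = int(4 * scale)  # truncate the kernel at +/- 4 scales
        kernel = [ricker(t, scale) for t in range(-half, half + 1)]
        n = len(signal)
        resp = []
        for i in range(n):
            s = 0.0
            for k, w in enumerate(kernel):
                j = i + k - half
                if 0 <= j < n:
                    s += signal[j] * w
            resp.append(s)
        return [i for i in range(1, n - 1)
                if resp[i] >= threshold and resp[i] > resp[i - 1] and resp[i] > resp[i + 1]]
    ```

    The wavelet's band-pass shape suppresses both baseline drift and high-frequency noise, which is why CWT-style detectors are popular for heartbeat peaks; production code would search several scales and link maxima across them.
    
    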

  5. Seismic Data Analysis to the Converted Wave Acquisition: A Case Study in Offshore Malaysia

    NASA Astrophysics Data System (ADS)

    Latiff, A. H. Abdul; Osman, S. A. A.; Jamaludin, S. N. F.

    2016-07-01

    Many fields in offshore Malaysia suffer from the presence of shallow gas clouds, one of the major issues in the basin. Seismic images underneath a gas cloud often show poor resolution, which makes geophysical and geological interpretation difficult. This effect can be noticed in amplitude dimming, loss of high-frequency energy, and phase distortion. In this work, the subsurface is analyzed through geophysical interpretation of converted P-S data. The P-S converted dataset was obtained through an ocean bottom cable (OBC) survey conducted in a shallow-gas-affected field located in the Malaysian Basin. The geophysical interpretation process begins by picking the clear fault systems and horizons, followed by a thorough post-stack seismic data processing procedure. Finally, attribute analyses were applied to the seismic section in order to image the unseen fault systems. The interpreted seismic sections show significant improvement in the seismic images, particularly through the median filter process. Moreover, the combination of structural smoothing and variance procedures contributed to correct fault location interpretation.

  6. CT acquisition technique and quantitative analysis of the lung parenchyma: variability and corrections

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Leader, J. K.; Coxson, Harvey O.; Scuirba, Frank C.; Fuhrman, Carl R.; Balkan, Arzu; Weissfeld, Joel L.; Maitz, Glenn S.; Gur, David

    2006-03-01

    The fraction of lung voxels below a pixel value "cut-off" has been correlated with pathologic estimates of emphysema. We performed a "standard" quantitative CT (QCT) lung analysis using a -950 HU cut-off to determine the volume fraction of emphysema (below the cut-off) and a "corrected" QCT analysis after removing small groups (5 and 10 pixels) of connected pixels ("blobs") below the cut-off. CT examinations for two datasets of 15 subjects each, spanning a range of visible emphysema and pulmonary obstruction, were acquired at low dose and at conventional dose and reconstructed using a high-spatial-frequency kernel at 2.5 mm section thickness for the same subjects. The "blob" size (i.e., number of connected pixels) removed was inversely related to the computed fraction of emphysema. The slopes of emphysema fraction versus blob size were 0.013, 0.009, and 0.005 for subjects with no emphysema and no pulmonary obstruction, moderate emphysema and pulmonary obstruction, and severe emphysema and severe pulmonary obstruction, respectively. The slopes of emphysema fraction versus blob size were 0.008 and 0.006 for low-dose and conventional CT examinations, respectively. The small blobs of pixels removed are most likely CT image artifacts and do not represent actual emphysema. The magnitude of the blob correction was appropriately associated with COPD severity. The blob correction appears to be applicable to QCT analysis in low-dose and conventional CT exams.
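    The blob correction described above amounts to a connected-component filter: threshold at the cut-off, discard connected groups of sub-threshold pixels at or below a chosen size, and report the surviving fraction. A hypothetical pure-Python sketch on a 2-D slice (the function name, 4-connectivity, and toy values are assumptions; the study worked on full CT sections):

    ```python
    # Emphysema fraction with "blob" correction: threshold a slice of HU
    # values at the cut-off, flood-fill connected groups of sub-threshold
    # pixels, and keep only blobs larger than min_blob pixels.

    def emphysema_fraction(slice_hu, cutoff=-950, min_blob=0):
        rows, cols = len(slice_hu), len(slice_hu[0])
        below = [[slice_hu[r][c] <= cutoff for c in range(cols)] for r in range(rows)]
        seen = [[False] * cols for _ in range(rows)]
        kept = 0
        for r in range(rows):
            for c in range(cols):
                if below[r][c] and not seen[r][c]:
                    # flood-fill one connected blob (4-connectivity)
                    stack, size = [(r, c)], 0
                    seen[r][c] = True
                    while stack:
                        y, x = stack.pop()
                        size += 1
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and below[ny][nx] and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    if size > min_blob:  # blobs at or below min_blob are artifacts
                        kept += size
        return kept / (rows * cols)
    ```

    On a toy 10x10 slice with one 9-pixel emphysematous region and two isolated sub-threshold pixels, the uncorrected fraction is 0.11 while removing blobs of 2 pixels or fewer yields 0.09, mirroring the inverse relation between blob size removed and emphysema fraction reported above.
    
    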

  7. Auxetic polyurethane foam: Manufacturing and processing analysis

    NASA Astrophysics Data System (ADS)

    Jahan, Md Deloyer

    experimental design approach to identify significant processing parameters followed by optimization of those processing parameters in fabrication of auxetic PU foam. A split-plot factorial design has been selected for screening purpose. Response Surface Methodology (RSM) has been utilized to optimize the processing parameters in fabrication of auxetic PU foam. Two different designs named Box-Behnken and I-optimal designs have been employed for this analysis. The results obtained by those designs exhibit that I-optimal design provides more accurate and realistic results than Box-Behnken design when experiments are performed in split-plot manner. Finally, a near stationary ridge system is obtained by optimization analysis. As a result a set of operating conditions are obtained that produces similar minimum Poisson's ratio in auxetic PU foam.

  8. Split-screen display system and standardized methods for ultrasound image acquisition and multi-frame data processing

    NASA Technical Reports Server (NTRS)

    Selzer, Robert H. (Inventor); Hodis, Howard N. (Inventor)

    2011-01-01

    A standardized acquisition methodology assists operators to accurately replicate high resolution B-mode ultrasound images obtained over several spaced-apart examinations utilizing a split-screen display in which the arterial ultrasound image from an earlier examination is displayed on one side of the screen while a real-time "live" ultrasound image from a current examination is displayed next to the earlier image on the opposite side of the screen. By viewing both images, whether simultaneously or alternately, while manually adjusting the ultrasound transducer, an operator is able to bring into view the real-time image that best matches a selected image from the earlier ultrasound examination. Utilizing this methodology, dynamic material properties of arterial structures, such as IMT and diameter, are measured in a standard region over successive image frames. Each frame of the sequence has its echo edge boundaries automatically determined by using the immediately prior frame's true echo edge coordinates as initial boundary conditions. Computerized echo edge recognition and tracking over multiple successive image frames enhances measurement of arterial diameter and IMT and allows for improved vascular dimension measurements, including vascular stiffness and IMT determinations.
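    The frame-to-frame idea above (seed each frame's edge search with the previous frame's edge coordinates) can be sketched as a column-wise search for the strongest intensity step within a small window around the prior edge. This is an illustrative stand-in under assumed parameters, not the patented edge-recognition method:

    ```python
    # Track an echo edge across frames: for each image column, look within
    # +/- window rows of the previous frame's edge and pick the row with the
    # largest vertical intensity step.

    def track_edge(frame, prev_edge, window=2):
        n_rows = len(frame)
        new_edge = []
        for col, prev_row in enumerate(prev_edge):
            lo = max(1, prev_row - window)
            hi = min(n_rows - 1, prev_row + window + 1)
            # strongest gradient |I[r] - I[r-1]| wins within the search window
            best = max(range(lo, hi),
                       key=lambda r: abs(frame[r][col] - frame[r - 1][col]))
            new_edge.append(best)
        return new_edge
    ```

    Because each frame's result seeds the next frame's search, the tracker follows slow wall motion over a cardiac cycle without re-detecting the boundary from scratch, which is the point of using the prior frame's coordinates as initial boundary conditions.
    
    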

  9. Acquisition of the linearization process in text composition in third to ninth graders: effects of textual superstructure and macrostructural organization.

    PubMed

    Favart, Monik; Coirier, Pierre

    2006-07-01

    Two complementary experiments analyzed the acquisition of text content linearization in writing, in French-speaking participants from third to ninth grades. In both experiments, a scrambled text paradigm was used: eleven ideas presented in random order had to be rearranged coherently so as to compose a text. Linearization was analyzed on the basis of the conceptual ordering of ideas and writing fluency. The first experiment focused on the effect of superstructural facilitation (in decreasing order: 1--instructional, 2--narrative, 3--argumentative), while the second experiment studied the effect of prewriting conditions: 1--scrambled presentation, 2--macrostructural facilitation, 3--ideas given in optimal order (control condition). As expected, scores in conceptual ordering and writing fluency improved through the grade levels. Students were most successful with respect to conceptual ordering in the instructional superstructure, followed by the narrative and finally the argumentative superstructures. The prewriting assignment also had the expected effect (control better than macrostructural presentation which, in turn, was better than the random order) but only with the argumentative superstructure. Contrary to conceptual ordering, writing fluency was not affected by the type of superstructure, although we did record an effect of the prewriting condition. The results are discussed in light of Bereiter and Scardamalia's (1987) knowledge transforming strategy, taking into account cognitive development and the French language curriculum.

  10. Stability analysis of a polymer coating process

    NASA Astrophysics Data System (ADS)

    Kallel, A.; Hachem, E.; Demay, Y.; Agassant, J. F.

    2015-05-01

    A new coating process involving a short stretching distance (1 mm) and a high draw ratio (around 200) is considered. The resulting thin molten polymer film (around 10 micrometers) is set down on a solid primary film and then covered by another solid secondary film. In experimental studies, periodical fluctuation in the thickness of the coated layer may be observed. The processing conditions markedly influence the onset and the development of these defects and modeling will help our understanding of their origins. The membrane approach which has been commonly used for cast film modeling is no longer valid and two dimensional time dependent models (within the thickness) are developed in the whole domain (upstream die and stretching path). A boundary-value problem with a free surface for the Stokes equations is considered and stability of the free surface is assessed using two different numerical strategies: a tracking strategy combined with linear stability analysis involving computation of leading eigenvalues, and a Level Set capturing strategy coupled with transient stability analysis.

  11. Merger and acquisition medicine.

    PubMed

    Powell, G S

    1997-01-01

    This discussion of the ramifications of corporate mergers and acquisitions for employees recognizes that employee adaptation to the change can be a long and complex process. The author describes a role the occupational physician can take in helping to minimize the potential adverse health impact of major organizational change.

  12. Second Language Acquisition.

    ERIC Educational Resources Information Center

    McLaughlin, Barry; Harrington, Michael

    1989-01-01

    A distinction is drawn between representational and processing models of second-language acquisition. The first approach is derived primarily from linguistics, the second from psychology. Both fields, it is argued, need to collaborate more fully, overcoming disciplinary narrowness in order to achieve more fruitful research. (GLR)

  13. The effects of videotape modeling on staff acquisition of functional analysis methodology.

    PubMed

    Moore, James W; Fisher, Wayne W

    2007-01-01

    Lectures and two types of video modeling were compared to determine their relative effectiveness in training 3 staff members to conduct functional analysis sessions. Video modeling that contained a larger number of therapist exemplars resulted in mastery-level performance eight of the nine times it was introduced, whereas neither lectures nor partial video modeling produced significant improvements in performance. Results demonstrated that video modeling provided an effective training strategy but only when a wide range of exemplars of potential therapist behaviors were depicted in the videotape. PMID:17471805

  15. Method for Visually Integrating Multiple Data Acquisition Technologies for Real Time and Retrospective Analysis

    NASA Technical Reports Server (NTRS)

    Bogart, Edward H. (Inventor); Pope, Alan T. (Inventor)

    2000-01-01

    A system for display on a single video display terminal of multiple physiological measurements is provided. A subject is monitored by a plurality of instruments which feed data to a computer programmed to receive data, calculate data products such as index of engagement and heart rate, and display the data in a graphical format simultaneously on a single video display terminal. In addition, live video representing the view of the subject and the experimental setup may also be integrated into the single data display. The display may be recorded on a standard video tape recorder for retrospective analysis.

  16. Acquisition of a High Resolution Field Emission Scanning Electron Microscope for the Analysis of Returned Samples

    NASA Technical Reports Server (NTRS)

    Nittler, Larry R.

    2003-01-01

    This grant furnished funds to purchase a state-of-the-art scanning electron microscope (SEM) to support our analytical facilities for extraterrestrial samples. After evaluating several instruments, we purchased a JEOL 6500F thermal field emission SEM with the following analytical accessories: EDAX energy-dispersive x-ray analysis system with fully automated control of instrument and sample stage; EDAX LEXS wavelength-dispersive x-ray spectrometer for high sensitivity light-element analysis; EDAX/TSL electron backscatter diffraction (EBSD) system with software for phase identification and crystal orientation mapping; Robinson backscatter electron detector; and an in situ micro-manipulator (Kleindiek). The total price was $550,000 (with $150,000 of the purchase supported by Carnegie institution matching funds). The microscope was delivered in October 2002, and most of the analytical accessories were installed by January 2003. With the exception of the wavelength spectrometer (which has been undergoing design changes) everything is working well and the SEM is in routine use in our laboratory.

  17. An innovative experiment on superconductivity, based on video analysis and non-expensive data acquisition

    NASA Astrophysics Data System (ADS)

    Bonanno, A.; Bozzo, G.; Camarca, M.; Sapia, P.

    2015-07-01

    In this paper we present a new experiment on superconductivity, designed for university undergraduate students, based on high-speed video analysis of a magnet falling through a ceramic superconducting cylinder (Tc = 110 K). The use of an Atwood's machine allows us to vary the magnet's speed and acceleration during its interaction with the superconductor. In this way, we highlight the existence of two interaction regimes: for low crossing energy, the magnet is levitated by the superconductor after a transient oscillatory damping; for higher crossing energy, the magnet passes through the superconducting cylinder. The use of a commercial-grade high-speed imaging system, together with video analysis performed using the Tracker software, allows us to attain good precision in space and time measurements. Four sensing coils, mounted inside and outside the superconducting cylinder, allow us to study the magnetic flux variations associated with the magnet's passage through the superconductor, shedding light on a didactically relevant topic: the behaviour of magnetic field lines in the presence of a superconductor. The critical discussion of experimental data allows undergraduate university students to grasp useful insights into the basic phenomenology of superconductivity as well as relevant conceptual topics such as the difference between the Meissner effect and Faraday-like 'perfect' induction.

  18. Language Acquisition, Pidgins and Creoles.

    ERIC Educational Resources Information Center

    Wode, Henning

    1981-01-01

    Suggests that structural universals between differently based pidgins result from universal linguo-cognitive processing strategies employed in learning languages. Some of the strategies occur in all types of acquisition, and others are more applicable to L2-type acquisition. Past research is discussed, and some exemplary data are given.…

  19. Data acquisition and analysis of range-finding systems for space construction

    NASA Technical Reports Server (NTRS)

    Shen, C. N.

    1981-01-01

    For future space missions, completely autonomous robotic machines will be required to free astronauts from routine chores such as equipment maintenance and servicing of faulty systems, and to extend human capabilities in hazardous environments full of cosmic and other harmful radiation. In places with high radiation and uncontrollable ambient illumination, TV-camera-based vision systems cannot work effectively. However, a vision system using range information measured directly with a time-of-flight laser rangefinder can operate successfully in these environments. Such a system is independent of illumination conditions, and the interfering effects of intense radiation of all kinds are eliminated by the tuned input of the laser instrument. By processing the range data according to decision, stochastic-estimation, and heuristic schemes, the laser-based vision system can recognize known objects and thus provide sufficient information to the robot's control system, which can develop strategies for various objectives.

  20. Data Acquisition, Analysis and Transmission Platform for a Pay-As-You-Drive System

    PubMed Central

    Boquete, Luciano; Rodríguez-Ascariz, José Manuel; Barea, Rafael; Cantos, Joaquín; Miguel-Jiménez, Juan Manuel; Ortega, Sergio

    2010-01-01

    This paper presents a platform used to acquire, analyse and transmit data from a vehicle to a Control Centre as part of a Pay-As-You-Drive system. The aim is to monitor vehicle usage (how much, when, where and how) and, based on this information, assess the associated risk and set an appropriate insurance premium. To determine vehicle usage, the system analyses the driver’s respect for speed limits, driving style (aggressive or non-aggressive), mobile telephone use and the number of vehicle passengers. An electronic system on board the vehicle acquires these data, processes them and transmits them by mobile telephone (GPRS/UMTS) to a Control Centre, at which the insurance company assesses the risk associated with vehicles monitored by the system. The system provides insurance companies and their customers with an enhanced service and could potentially increase responsible driving habits and reduce the number of road accidents. PMID:22219668

  1. A Wearable Cardiac Monitor for Long-Term Data Acquisition and Analysis

    PubMed Central

    Winokur, Eric S.; Delano, Maggie K.; Sodini, Charles G.

    2015-01-01

    A low-power wearable ECG monitoring system has been developed entirely from discrete electronic components and a custom PCB. This device removes all loose wires from the system and minimizes the footprint on the user. The monitor consists of five electrodes, which allow a cardiologist to choose from a variety of possible projections. Clinical tests comparing our wearable monitor with a commercial clinical ECG recorder were conducted on ten healthy adults under different ambulatory conditions, with nine of the datasets used for analysis. Data from both monitors were synchronized and annotated with PhysioNet's waveform viewer WAVE (physionet.org) [1]. All gold-standard annotations were compared to the results of the WQRS detection algorithm [2] provided by PhysioNet. QRS sensitivity and QRS positive predictability were extracted from both monitors to validate the wearable monitor. PMID:22968205
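The two validation metrics named above are the standard beat-detection statistics: sensitivity = TP/(TP+FN) and positive predictivity = TP/(TP+FP), obtained by matching detected beats to annotated beats within a small time tolerance. A hedged sketch of that comparison (the function name and the tolerance value are illustrative, not the authors' implementation):

```python
def beat_match_stats(reference, detected, tol=0.15):
    """QRS sensitivity and positive predictivity from beat times (seconds).

    Each detected beat may match at most one annotated beat within +/- tol.
    """
    matched = set()
    tp = 0
    for r in reference:
        for i, d in enumerate(detected):
            if i not in matched and abs(d - r) <= tol:
                matched.add(i)
                tp += 1
                break
    fn = len(reference) - tp      # annotated beats never detected
    fp = len(detected) - tp       # detections with no matching annotation
    return tp / (tp + fn), tp / (tp + fp)

# Four annotated beats, one missed and one spurious detection:
sens, ppv = beat_match_stats([1.0, 2.0, 3.0, 4.0], [1.02, 2.9, 4.01, 5.5])
```

PhysioNet's own `bxb` tool performs this beat-by-beat comparison against reference annotations in a standardized way.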

  2. Spatio-temporal registration in multiplane MRI acquisitions for 3D colon motility analysis

    NASA Astrophysics Data System (ADS)

    Kutter, Oliver; Kirchhoff, Sonja; Berkovich, Marina; Reiser, Maximilian; Navab, Nassir

    2008-03-01

    In this paper we present a novel method for analyzing and visualizing dynamic peristaltic motion of the colon in 3D from two series of differently oriented 2D MRI images. To this end, we have defined an MRI examination protocol, and introduced methods for spatio-temporal alignment of the two MRI image series into a common reference. This represents the main contribution of this paper, which enables the 3D analysis of peristaltic motion. The objective is to provide a detailed insight into this complex motion, aiding in the diagnosis and characterization of colon motion disorders. We have applied the proposed spatio-temporal method on Cine MRI data sets of healthy volunteers. The results have been inspected and validated by an expert radiologist. Segmentation and cylindrical approximation of the colon results in a 4D visualization of the peristaltic motion.

  3. How Students Learn: The Validation of a Model of Knowledge Acquisition Using Stimulated Recall of the Learning Process.

    ERIC Educational Resources Information Center

    Nuthall, Graham

    A study of students' thinking processes during their engagement in classroom tasks in science and social studies units in upper elementary school classrooms was conducted as part of a series of studies on learning. As a result of previous studies, a theory of the learning process has been developed. A central component of the theory is the…

  4. 77 FR 9617 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ... Supplement (DFARS) to update DoD's voucher processing procedures and better accommodate the use of Wide Area WorkFlow to process vouchers. DATES: Comments on the proposed rule published January 19, 2012, at 77 FR... clarifying the proposed rule published on January 19, 2012 (77 FR 2682), which proposes to...

  5. 77 FR 52258 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... a proposed rule in the Federal Register at 77 FR 2682 on January 19, 2012. The comment period closed... the Wide Area WorkFlow (WAWF) used to process vouchers. DATES: August 29, 2012. FOR FURTHER... rule merely updates DoD's voucher processing procedures and better accommodates the Wide Area...

  6. Optimizing the acquisition and analysis of confocal images for quantitative single-mobile-particle detection.

    PubMed

    Friaa, Ouided; Furukawa, Melissa; Shamas-Din, Aisha; Leber, Brian; Andrews, David W; Fradin, Cécile

    2013-08-01

    Quantification of the fluorescence properties of diffusing particles in solution is an invaluable source of information for characterizing the interactions, stoichiometry, or conformation of molecules directly in their native environment. In the case of heterogeneous populations, single-particle detection should be the method of choice, and it can, in principle, be achieved by using confocal imaging. However, the detection of single mobile particles in confocal images presents specific challenges. In particular, it requires an adapted set of imaging parameters for capturing the confocal images and an adapted event-detection scheme for analyzing them. Herein, we report a theoretical framework that allows prediction of the properties of a homogeneous particle population. This model assumes that the particles have linear trajectories relative to the confocal volume, which holds true for particles with moderate mobility. We compare the predictions of our model to the results obtained by analyzing confocal images of solutions of fluorescently labeled liposomes. Based on this comparison, we propose improvements to the simple line-by-line thresholding event-detection scheme commonly used for single-mobile-particle detection. We show that an optimal combination of imaging and analysis parameters allows the reliable detection of fluorescent liposomes at concentrations between 1 and 100 pM. This result confirms the importance of confocal single-particle detection as a complementary technique to ensemble fluorescence-correlation techniques for studies of mobile particles.
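The line-by-line thresholding scheme referred to above can be illustrated simply: within each scanned line, contiguous runs of pixels above an intensity threshold are flagged as candidate particle events, and runs narrower than a minimum width are rejected as shot noise. A simplified sketch (names and parameter values are illustrative, not the authors' code):

```python
import numpy as np

def detect_events(line, threshold, min_width=2):
    """Flag contiguous above-threshold pixel runs in one scanned line as events."""
    above = np.asarray(line) > threshold
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                          # a run begins
        elif not flag and start is not None:
            if i - start >= min_width:         # reject runs narrower than min_width
                events.append((start, i))
            start = None
    if start is not None and len(above) - start >= min_width:
        events.append((start, len(above)))     # run extends to end of line
    return events

# One bright 3-pixel particle crossing and one single-pixel noise spike:
events = detect_events([0, 0, 5, 6, 7, 0, 0, 9, 0, 0], threshold=4)
```

The paper's point is that threshold, minimum width, scan speed, and pixel dwell time must be tuned jointly, since a moving particle's apparent width in the line depends on its speed relative to the scan.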

  7. Patent Analysis for Supporting Merger and Acquisition (M&A) Prediction: A Data Mining Approach

    NASA Astrophysics Data System (ADS)

    Wei, Chih-Ping; Jiang, Yu-Syun; Yang, Chin-Sheng

    M&A plays an increasingly important role in the contemporary business environment. Companies usually conduct M&A to pursue complementarity from other companies for preserving and/or extending their competitive advantages. For the given bidder company, a critical first step to the success of M&A activities is the appropriate selection of target companies. However, existing studies on M&A prediction incur several limitations, such as the exclusion of technological variables in M&A prediction models and the omission of the profile of the respective bidder company and its compatibility with candidate target companies. In response to these limitations, we propose an M&A prediction technique which not only encompasses technological variables derived from patent analysis as prediction indicators but also takes into account the profiles of both bidder and candidate target companies when building an M&A prediction model. We collect a set of real-world M&A cases to evaluate the proposed technique. The evaluation results are encouraging and will serve as a basis for future studies.

  8. Acquisition and disclosure of genetic information under alternative policy regimes: an economic analysis.

    PubMed

    Wilson, Deborah

    2006-07-01

    A current policy issue is whether, and if so under what circumstances, insurance companies should be given access to genetic test results. The insurance industry argues for mandatory disclosure in order to avoid problems of adverse selection; an alternative would be a moratorium or legislation preventing such disclosure; a third option is a voluntary disclosure law. This paper investigates the impact of these policies on individuals' incentives both to acquire genetic information and to disclose it to providers of health and/or life insurance. The theoretical framework used to inform this analysis is provided by the 'games of persuasion' literature, in which one agent tries to influence another agent's decision by selectively withholding her private information regarding quality. The application of the theoretical framework to this policy context yields the following results. Individuals have the incentive to acquire genetic information and to disclose the test results if disclosure is voluntary. If, however, they are obliged to disclose the results of any genetic tests they have taken, their incentive may be not to acquire such information. I discuss the policy implications of these findings both from the point of view of the insurance industry and from a public health perspective.

  9. Super-resolved image acquisition with full-field localization-based microscopy: theoretical analysis and evaluation

    NASA Astrophysics Data System (ADS)

    Son, Taehwang; Lee, Wonju; Kim, Donghyun

    2016-02-01

    We analyze and evaluate super-resolved image acquisition with full-field localization microscopy, in which an individual signal sampled by localization may or may not be switched. For the analysis, the Nyquist-Shannon sampling theorem based on the ideal delta function was extended to sampling with a unit pulse comb and a surface-enhanced localized near-field that was numerically calculated with the finite-difference time-domain method. Sampling with a unit pulse was investigated in the Fourier domain, where the magnitude of the baseband becomes larger than that of the adjacent subband, i.e. the aliasing effect is reduced owing to the pulse width. The standard Lena image was employed as the imaging target, and a diffraction-limited optical system was assumed. A peak signal-to-noise ratio (PSNR) was introduced to evaluate the efficiency of image reconstruction quantitatively. When the target was sampled without switching by a unit pulse as the sampling width and period were varied, PSNR increased eventually to 18.1 dB, the PSNR of a conventional diffraction-limited image. PSNR was found to increase with a longer pulse width due to the reduced aliasing effect. When switching of individual sampling pulses was applied, the blurry artifact outside the excited field was removed for each pulse, and PSNR rose to 25.6 dB with a shortened pulse period, i.e. an effective resolution of 72 nm was obtained, which can be decreased further.
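The PSNR figure of merit used above is defined as 10·log10(peak²/MSE) between the reference and reconstructed images. A minimal implementation (assuming 8-bit images, i.e. peak = 255; this is the standard definition, not the paper's specific code):

```python
import numpy as np

def psnr(reference, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images of equal shape."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(reconstructed, float)) ** 2)
    if mse == 0.0:
        return float("inf")               # identical images
    return 10.0 * np.log10(peak**2 / mse)

# A uniform error of 16 gray levels gives MSE = 256 and PSNR ~ 24 dB:
value = psnr(np.zeros((4, 4)), np.full((4, 4), 16.0))
```

Higher PSNR means the reconstruction is closer to the reference, which is how the 18.1 dB (unswitched) and 25.6 dB (switched) results are compared.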

  10. Analysis of cure in composites processing

    NASA Technical Reports Server (NTRS)

    Aylward, L.; Roylance, D.; Douglas, C.

    1984-01-01

    Finite element analysis is a general numerical tool for solving the field equations of engineering practice, and this paper demonstrates its use in modeling the nonisothermal cure of pultruded composite material. A very simple grid is used in this case to model a narrow strip of material, and this grid is then solved using a time-stepping transient algorithm to simulate the passage of the strip along the pultruder die. As time proceeds, heat is conducted into the strip from the heated boundaries at the die walls, and cure proceeds at a rate dependent on the local temperature. The computer model can be used to minimize the time needed for sufficient cure, and helps avoid such processing errors as undercure or thermal degradation.
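The time-stepping transient scheme described above can be sketched as an explicit finite-difference update: heat conducts inward from the fixed-temperature die walls while the local degree of cure advances at a temperature-dependent rate. The material constants and the simple cure-rate law below are illustrative placeholders, not the paper's model:

```python
import numpy as np

# Illustrative material parameters (placeholders, not the paper's values)
ALPHA = 1e-4    # thermal diffusivity, m^2/s
K_CURE = 0.5    # cure-rate scale, 1/s at 100 K above the onset temperature

def step(T, cure, dx, dt, T_wall):
    """One explicit time step: wall-driven conduction plus local cure advance."""
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + ALPHA * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T_new[0] = T_new[-1] = T_wall                            # heated die walls
    rate = K_CURE * np.maximum(T_new - 300.0, 0.0) / 100.0   # crude T-dependence
    cure = np.minimum(cure + rate * (1.0 - cure) * dt, 1.0)  # cure saturates at 1
    return T_new, cure
```

The explicit scheme is stable only for ALPHA·dt/dx² ≤ 1/2; a production model would use an implicit solver and an Arrhenius cure kinetics law in place of the linear ramp used here.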

  11. Digital data processing system dynamic loading analysis

    NASA Technical Reports Server (NTRS)

    Lagas, J. J.; Peterka, J. J.; Tucker, A. E.

    1976-01-01

    Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated-flight and post-separation flight phases of the space shuttle's approach and landing test (ALT) configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM), a computerized simulation of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.

  12. A point process analysis of sensory encoding.

    PubMed

    Stanley, Garrett B; Webber, Roxanna M

    2003-01-01

    Nowhere is the sparse nature of neuronal coding more evident than in the sensory cortex, where neuronal response becomes increasingly tuned to specific features of the sensory environment. For such situations, where rate modulation schemes do not accurately describe the neuronal response to sensory stimuli, statistical descriptions based on point process events are particularly appropriate. Here, intensity measures derived from experimental data in the rat somatosensory cortex enable the direct analysis of statistical structure within spike trains, as well as inter-relationships between tactile stimuli and neuronal response. Intensity measures capture structure in spontaneous as well as driven activity, reflecting the interplay between excitatory and suppressive influences on neuronal firing. Second-order intensity estimates reveal strong dependencies upon patterns of tactile stimulation, which define the neuronal response characteristics to temporally structured stimuli.
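A first-order intensity estimate of the kind described above can be obtained by trial-averaging binned spike counts across repeated presentations of a stimulus (a PSTH-style estimator). A simple sketch (the bin width and function name are illustrative, not the authors' exact estimator):

```python
import numpy as np

def intensity_estimate(spike_trains, duration, bin_width=0.01):
    """Trial-averaged firing intensity (spikes/s) from lists of spike times."""
    n_bins = int(round(duration / bin_width))
    edges = np.linspace(0.0, duration, n_bins + 1)
    counts = np.zeros(n_bins)
    for train in spike_trains:                     # one list of times per trial
        counts += np.histogram(train, bins=edges)[0]
    return counts / (len(spike_trains) * bin_width)

# Two trials, each with a single spike 5 ms after stimulus onset:
rate = intensity_estimate([[0.005], [0.005]], duration=0.1)
```

Second-order intensity estimates extend this idea to pairs of spike times, capturing the dependencies on stimulation patterns that the abstract refers to.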

  13. Multienvironment quantitative trait Loci analysis for photosynthate acquisition, accumulation, and remobilization traits in common bean under drought stress.

    PubMed

    Asfaw, Asrat; Blair, Matthew W; Struik, Paul C

    2012-05-01

    Many of the world's common bean (Phaseolus vulgaris L.) growing regions are prone to either intermittent or terminal drought stress, making drought the primary cause of yield loss under farmers' field conditions. Improved photosynthate acquisition, accumulation, and then remobilization have been observed as important mechanisms for adaptation to drought stress. The objective of this study was to tag quantitative trait loci (QTL) for photosynthate acquisition, accumulation, and remobilization to grain by using a recombinant inbred line population developed from the Mesoamerican intragenepool cross of drought-susceptible DOR364 and drought-tolerant BAT477 grown under eight environments differing in drought stress across two continents: Africa and South America. The recombinant inbred line population expressed quantitative variation and transgressive segregation for 11 traits associated with drought tolerance. QTL were detected by both a mixed multienvironment model and by composite interval mapping for each environment using a linkage map constructed with 165 genetic markers that covered 11 linkage groups of the common bean genome. In the multienvironment, mixed model, nine QTL were detected for 10 drought stress tolerance mechanism traits found on six of the 11 linkage groups. Significant QTL × environment interaction was observed for six of the nine QTL. QTL × environment interaction was of the cross-over type for three of the six significant QTL with contrasting effect of the parental alleles across different environments. In the composite interval mapping, we found 69 QTL in total. The majority of these were found for Palmira (47) or Awassa (18), with fewer in Malawi (4). Phenotypic variation explained by QTL in single environments ranged up to 37%, and the most consistent QTL were for Soil Plant Analysis Development (SPAD) leaf chlorophyll reading and pod partitioning traits. 
QTL alignment between the two detection methods showed that yield QTL on b08 and stem

  14. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process for... architect-engineer firms in accordance with 36.602-3, except that the selection report shall serve as...

  15. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-5 Short selection process for....602-5 may be used to select firms for architect-engineer contracts that are not expected to exceed...

  16. Hospitals changing their buying habits. Overhauled technology-acquisition processes help equip facilities to make prudent purchases.

    PubMed

    Wagner, M

    1990-11-26

    As hospitals face increasing pressure to rein in costs, equipment spending faces stiff competition for limited funds. When facilities replace aging or outdated equipment, they're often replacing the entire technology assessment process as well. One hospital facing a $4 million bill to equip a new building is revamping its purchasing process based on department "wish lists." And an Ohio system has formed a special division to speed assessment and implementation of new technologies and procedures.

  17. Decision analysis applications and the CERCLA process

    SciTech Connect

    Purucker, S.T.; Lyon, B.F. |

    1994-06-01

    Quantitative decision methods can be developed during environmental restoration projects that incorporate stakeholder input and can complement current efforts that are undertaken for data collection and alternatives evaluation during the CERCLA process. These decision-making tools can supplement current EPA guidance as well as focus on problems that arise as attempts are made to make informed decisions regarding remedial alternative selection. In examining the use of such applications, the authors discuss the use of decision analysis tools and their impact on collecting data and making environmental decisions from a risk-based perspective. They will look at the construction of objective functions for quantifying different risk-based decision rules that incorporate stakeholder concerns. This represents a quantitative method for implementing the Data Quality Objective (DQO) process. These objective functions can be expressed using a variety of indices to analyze problems that currently arise in the environmental field. Examples include cost, magnitude of risk, efficiency, and probability of success or failure. Based on such defined objective functions, a project can evaluate the impact of different risk and decision selection strategies on data worth and alternative selection.

  18. Thermodynamic Analysis of Nanoporous Membrane Separation Processes

    NASA Astrophysics Data System (ADS)

    Rogers, David; Rempe, Susan

    2011-03-01

    We give an analysis of desalination energy requirements in order to quantify the potential for future improvements in desalination membrane technology. Our thermodynamic analysis makes it possible to draw conclusions from the vast array of equilibrium molecular dynamics simulations present in the literature as well as create a standardized comparison for measuring and reporting experimental reverse osmosis material efficiency. Commonly employed methods for estimating minimum desalination energy costs have been revised to include operations at positive input stream recovery ratios using a thermodynamic cycle analogous to the Carnot cycle. Several gaps in the statistical mechanical theory of irreversible processes have also been identified which may in the future lead to improved communication between materials engineering models and statistical mechanical simulation. Simulation results for silica surfaces and nanochannels are also presented. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
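The revised minimum-energy estimate at a positive recovery ratio follows from integrating the osmotic pressure of the concentrating feed; for an ideal dilute solution this gives w = (π_feed/r)·ln(1/(1−r)) per unit volume of permeate. A quick numerical check against the widely quoted seawater figure (the osmotic-pressure value below is an assumed round number, not from the paper):

```python
import math

def min_specific_energy(pi_feed, recovery):
    """Least work per m^3 of permeate (J/m^3) for an ideal dilute feed.

    Integrating pi(r') = pi_feed / (1 - r') over the recovered volume gives
    w = (pi_feed / r) * ln(1 / (1 - r)); as r -> 0 this reduces to pi_feed.
    """
    return pi_feed / recovery * math.log(1.0 / (1.0 - recovery))

# Seawater-like feed (~2.74 MPa osmotic pressure) at 50% recovery:
w = min_specific_energy(2.74e6, 0.5)   # J per m^3 of permeate
kwh_per_m3 = w / 3.6e6                 # roughly 1.05 kWh/m^3
```

This reproduces the commonly cited thermodynamic floor of about 1 kWh/m³ for seawater desalination at 50% recovery, against which real reverse-osmosis plants are benchmarked.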

  19. XSTREAM: A Highly Efficient High Speed Real-time Satellite Data Acquisition and Processing System using Heterogeneous Computing

    NASA Astrophysics Data System (ADS)

    Pramod Kumar, K.; Mahendra, P.; Ramakrishna Reddy, V.; Tirupathi, T.; Akilan, A.; Usha Devi, R.; Anuradha, R.; Ravi, N.; Solanki, S. S.; Achary, K. K.; Satish, A. L.; Anshu, C.

    2014-11-01

    In the last decade, the remote sensing community has observed significant growth in the number of satellites and sensors and in their resolutions, thereby increasing the volume of data to be processed each day. Satellite data processing is a complex and time-consuming activity. It consists of various tasks, such as decoding, decryption, decompression, radiometric normalization, stagger corrections, and ephemeris data processing for geometric corrections, and finally writing of the product in the form of an image file. Each task in the processing chain is sequential in nature and has different computing needs. Conventionally, the processes are cascaded in a well-organized workflow to produce the data products, which are executed on general-purpose high-end servers/workstations in an offline mode. Hence, these systems are considered ineffective for real-time applications that require quick response and just-in-time decision making, such as disaster management, homeland security, and so on. This paper discusses a novel approach to process the data online (as the data are being acquired) using a heterogeneous computing platform, namely XSTREAM, which has COTS hardware of CPUs, GPUs, and an FPGA. This paper focuses on the process architecture, re-engineering aspects, and mapping of tasks to the right computing device within the XSTREAM system, which makes it an ideal cost-effective platform for acquiring and processing satellite payload data in real time and displaying the products in original resolution for quick response. The system has been tested for the IRS CARTOSAT and RESOURCESAT series of satellites, which have a maximum data downlink speed of 210 Mbps.

  20. A Research on Second Language Acquisition and College English Teaching

    ERIC Educational Resources Information Center

    Li, Changyu

    2009-01-01

    It was in the 1970s that American linguist S.D. Krashen created the theory of "language acquisition". The theories on second language acquisition were proposed based on the study on the second language acquisition process and its rules. Here, the second language acquisition process refers to the process in which a learner with the…