Science.gov

Sample records for acquisition processing analysis

  1. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
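The decoupled three-stage design described above can be sketched with bounded queues isolating the stages, so a slow stage back-pressures its own queue without stalling the others. This is a minimal Python sketch under assumed names, not the actual DDS-Suite implementation:

```python
import queue
import threading

def run_pipeline(n_samples):
    """Toy acquire -> process -> analyze pipeline with decoupled stages.

    Bounded queues isolate the stages; a bottleneck in one stage fills
    its input queue rather than propagating through the whole system.
    """
    raw_q = queue.Queue(maxsize=64)    # acquisition -> processing
    proc_q = queue.Queue(maxsize=64)   # processing -> analysis
    results = []

    def acquire():
        for i in range(n_samples):
            raw_q.put(float(i))        # stand-in for a digitized sample
        raw_q.put(None)                # sentinel: acquisition finished

    def process():
        while (x := raw_q.get()) is not None:
            proc_q.put(x * 2.0)        # stand-in for filtering/scaling
        proc_q.put(None)

    def analyze():
        while (x := proc_q.get()) is not None:
            results.append(x)

    threads = [threading.Thread(target=f) for f in (acquire, process, analyze)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Because each stage runs in its own thread with a bounded queue between stages, the structure mirrors the paper's claim that a bottleneck in one phase does not affect the others.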

  2. SNAP: Simulating New Acquisition Processes

    NASA Technical Reports Server (NTRS)

    Alfeld, Louis E.

    1997-01-01

    Simulation models of acquisition processes range in scope from isolated applications to the 'Big Picture' captured by SNAP technology. SNAP integrates a family of models to portray the full scope of acquisition planning and management activities, including budgeting, scheduling, testing and risk analysis. SNAP replicates the dynamic management processes that underlie design, production and life-cycle support. SNAP provides the unique 'Big Picture' capability needed to simulate the entire acquisition process and explore the 'what-if' tradeoffs and consequences of alternative policies and decisions. Comparison of cost, schedule and performance tradeoffs helps managers choose the lowest-risk, highest-payoff option at each step in the acquisition process.

  3. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    SciTech Connect

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B., Jr.; Penaflor, B.G.

    1999-06-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson ''raw'' data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: Initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: Obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: Provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters.

  4. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method is developed for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission. A scenario is analyzed in which multiple core samples would be acquired using a rotary percussive coring tool deployed from an arm on a MER-class rover. The analysis is conducted in a structured way by decomposing the SAH process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. Given the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and it makes the analysis tractable by breaking the process down into small analyzable steps.
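The Markovian propagation described above can be sketched numerically: expected VEM counts per component are updated each step from per-component release probabilities and a transport matrix. The release vector, transport matrix, and component names below are hypothetical illustrations, not values from the mission analysis:

```python
import numpy as np

def propagate_vems(v0, release, transport, n_steps):
    """Propagate expected VEM counts across components over sampling steps.

    v0:        expected VEM count on each component at the start
    release:   per-component probability that a VEM is released at a step
    transport: transport[i, j] = probability a released VEM moves from
               component i to component j (rows sum to <= 1; any deficit
               is loss to the environment)
    """
    v = np.asarray(v0, dtype=float)
    for _ in range(n_steps):
        released = v * release           # expected VEMs leaving each part
        v = (v - released) + released @ transport
    return v

# Hypothetical 3-component chain: coring bit -> sample tube -> sample
v0 = [100.0, 0.0, 0.0]
release = np.array([0.10, 0.05, 0.0])
T = np.array([[0.0, 0.5, 0.2],
              [0.0, 0.0, 0.3],
              [0.0, 0.0, 0.0]])
v_final = propagate_vems(v0, release, T, n_steps=10)
```

The expected count entering the sample chain (last component) accumulates over steps, while the total never grows, matching the conservative bookkeeping a contamination analysis requires.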

  5. Image acquisitions, processing and analysis in the process of obtaining characteristics of horse navicular bone

    NASA Astrophysics Data System (ADS)

    Zaborowicz, M.; Włodarek, J.; Przybylak, A.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Boniecki, P.; Koszela, K.; Przybył, J.; Skwarcz, J.

    2015-07-01

    The aim of this study was to investigate the possibility of using computer image analysis methods for the assessment and classification of morphological variability and the state of health of the horse navicular bone. The assumption was that classification could be based on information contained in two-dimensional digital images of the navicular bone together with information on horse health. The first step in the research was to define the classes of analyzed bones, and then to use computer image analysis methods to obtain characteristics from these images. These characteristics were correlated with data concerning the animal, such as: side of hooves, grade of navicular syndrome (scale 0-3), type, sex, age, weight, information about lameness, and information about the heel. This paper is an introduction to the study of neural image analysis in the diagnosis of navicular bone syndrome. The prepared method can serve as an introduction to the study of a non-invasive way to assess the condition of the horse navicular bone.

  6. Data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Tsuda, Toshitaka

    1989-10-01

    Fundamental methods of signal processing used in normal mesosphere stratosphere troposphere (MST) radar observations are described. Complex time series of received signals obtained in each range gate are converted into Doppler spectra, from which the mean Doppler shift, spectral width and signal-to-noise ratio (SNR) are estimated. These spectral parameters are further utilized to study characteristics of scatterers and atmospheric motions.
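The moment estimation described above (mean Doppler shift as the first spectral moment, spectral width as the second) can be sketched generically. This is an illustrative implementation, not the radar's actual processing chain; the noise estimation is simplified to an assumed known noise power:

```python
import numpy as np

def doppler_moments(iq, fs, noise_power=0.0):
    """Estimate mean Doppler shift, spectral width and SNR from complex I/Q data.

    iq: complex time series from one range gate; fs: sampling frequency (Hz).
    noise_power is assumed known here; in practice it is estimated from
    signal-free spectral bins.
    """
    n = len(iq)
    spec = np.abs(np.fft.fftshift(np.fft.fft(iq))) ** 2 / n  # power spectrum
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / fs))
    signal = np.clip(spec - noise_power, 0.0, None)          # remove noise floor
    p = signal / signal.sum()                                # normalized weights
    mean_shift = np.sum(freqs * p)                           # first moment
    width = np.sqrt(np.sum((freqs - mean_shift) ** 2 * p))   # second moment
    snr = signal.sum() / (noise_power * n) if noise_power > 0 else np.inf
    return mean_shift, width, snr
```

A pure complex exponential at 5 Hz should yield a mean shift of 5 Hz and near-zero width, which is a convenient sanity check for any such estimator.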

  7. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

    Introduction: Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely the operator's (acquisition) impact on the results obtained from image analysis and processing, is shown here with a few examples. Material and method: The analysed images were obtained from a variety of medical devices, such as thermal imaging and tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200,000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. Results: For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient's back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects - error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18

  8. Uav Photogrammetry with Oblique Images: First Analysis on Data Acquisition and Processing

    NASA Astrophysics Data System (ADS)

    Aicardi, I.; Chiabrando, F.; Grasso, N.; Lingua, A. M.; Noardo, F.; Spanò, A.

    2016-06-01

    In recent years, many studies have revealed the advantages of using airborne oblique images for obtaining improved 3D city models (e.g. including façades and building footprints). The data were usually acquired by expensive airborne cameras installed on traditional aerial platforms. The purpose of this paper is to evaluate the possibility of acquiring and using oblique images for the 3D reconstruction of a historical building, obtained by a UAV (Unmanned Aerial Vehicle) and traditional COTS (Commercial Off-the-Shelf) digital cameras (more compact and lighter than the devices generally used), for the realization of a high-level-of-detail architectural survey. The critical issues of acquisition from a common UAV (flight planning strategies, ground control points, check point distribution and measurement, etc.) are described. Another important aspect considered was the evaluation of the possibility of using such systems as low-cost methods for obtaining complete information from an aerial point of view in case of emergencies or, as in the present paper, in the cultural heritage application field. The data processing was realized using an SfM-based approach for point cloud generation: different dense image-matching algorithms implemented in commercial and open-source software were tested. The achieved results are analysed and the discrepancies from reference LiDAR data are computed for a final evaluation. The system was tested on the S. Maria Chapel, a part of the Novalesa Abbey (Italy).

  9. On the Contrastive Analysis of Features in Second Language Acquisition: Uninterpretable Gender on Past Participles in English-French Processing

    ERIC Educational Resources Information Center

    Dekydtspotter, Laurent; Renaud, Claire

    2009-01-01

    Lardiere's discussion raises important questions about the use of features in second language (L2) acquisition. This response examines predictions for processing of a feature-valuing model vs. a frequency-sensitive, associative model in explaining the acquisition of French past participle agreement. Results from a reading-time experiment support…

  10. Optoelectronic/image processing module for enhanced fringe pattern acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Dymny, Grzegorz; Kujawinska, Malgorzata

    1996-08-01

    The paper introduces an optoelectronic/image processing module, OIMP, which enables more convenient implementation of full-field optical methods of testing in industry. OIMP consists of two miniature CCD cameras and an optical wavefront modification system which recombines the beams produced by an opto-mechanical measurement system and images fringe patterns on the CCD matrices. The module makes possible simultaneous registration of three monochromatic images as the R, G, B components of a colour video signal by means of a single frame grabber or by VCR on video tape. This enables convenient and inexpensive storage of large quantities of data which may be analyzed by the spatial-carrier phase-shifting method of automatic fringe pattern analysis. The usefulness of OIMP is shown by two examples: simultaneous analysis of u and v in-plane displacements in a grating interferometry system, and complex shape determination by fringe projection systems.
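The core storage idea, carrying three monochromatic fringe patterns in the R, G, B channels of a single colour frame, can be illustrated with a simple round-trip. The function names and array layout are invented for illustration; this is not the OIMP software:

```python
import numpy as np

def pack_fringe_patterns(img_r, img_g, img_b):
    """Pack three monochromatic fringe patterns into one RGB frame,
    mimicking the idea of storing three images in one colour signal."""
    return np.stack([img_r, img_g, img_b], axis=-1)

def unpack_fringe_patterns(rgb):
    """Recover the three monochromatic patterns from the RGB frame."""
    return rgb[..., 0], rgb[..., 1], rgb[..., 2]
```

Packing three frames into one colour signal triples the effective storage density of a frame grabber or VCR channel, which is the economy the module exploits.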

  11. Acquisition and Analysis of Dynamic Responses of a Historic Pedestrian Bridge using Video Image Processing

    NASA Astrophysics Data System (ADS)

    O'Byrne, Michael; Ghosh, Bidisha; Schoefs, Franck; O'Donnell, Deirdre; Wright, Robert; Pakrashi, Vikram

    2015-07-01

    Video-based tracking is capable of analysing bridge vibrations that are characterised by large amplitudes and low frequencies. This paper presents the use of video images and associated image processing techniques to obtain the dynamic response of a pedestrian suspension bridge in Cork, Ireland. This historic structure is one of the four suspension bridges in Ireland and is notable for its dynamic nature. A video camera is mounted on the river-bank and the dynamic responses of the bridge have been measured from the video images. The dynamic response is assessed without the need for a reflector on the bridge and in the presence of various forms of luminous complexity in the video image scenes. Vertical deformations of the bridge were measured in this regard. The video image tracking for the measurement of dynamic responses of the bridge was based on correlating patches in time-lagged scenes in video images and utilising a zero-mean normalised cross-correlation (ZNCC) metric. The bridge was excited by designed pedestrian movement and by individual cyclists traversing the bridge. The time series data of dynamic displacement responses of the bridge were analysed to obtain the frequency-domain response. Frequencies obtained from video analysis were checked against accelerometer data from the bridge obtained while carrying out the same set of experiments used for video image based recognition.
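The ZNCC metric named above has a standard definition for two equal-size patches; a generic implementation (not the authors' code) looks like this:

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean normalised cross-correlation between two equal-size patches.

    Returns a value in [-1, 1]; 1 means a perfect match. Subtracting the
    means and normalising makes the score invariant to brightness and
    contrast changes, which is why the metric tolerates the luminous
    complexity mentioned in the paper.
    """
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```

In patch tracking, this score is evaluated over candidate offsets in the next frame and the offset with the highest ZNCC is taken as the displacement.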

  13. Acquisition and processing pitfall with clipped traces in surface-wave analysis

    NASA Astrophysics Data System (ADS)

    Gao, Lingli; Pan, Yudi

    2016-02-01

    Multichannel analysis of surface waves (MASW) is widely used in estimating near-surface shear (S)-wave velocity. In the MASW method, generating a reliable dispersion image in the frequency-velocity (f-v) domain is an important processing step. A locus along peaks of dispersion energy at different frequencies allows the dispersion curves to be constructed for inversion. When the offsets are short, the output seismic data may exceed the dynamic range of the geophones/seismograph, with the result that peaks and/or troughs of traces are squared off in recorded shot gathers. Dispersion images generated from raw shot gathers with clipped traces are contaminated by artifacts, which might be misidentified as Rayleigh-wave phase velocities or body-wave velocities and potentially lead to incorrect results. We built synthetic models containing clipped traces and analyzed the amplitude spectra of unclipped and clipped waves. The results indicate that artifacts in the dispersion image depend on the level of clipping. A real-world example also shows how clipped traces affect the dispersion image. All the results suggest that clipped traces should be removed from shot gathers before generating dispersion images, in order to pick accurate phase velocities and set reasonable initial inversion models.
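The effect the authors describe can be reproduced in miniature: hard-clipping a sinusoid squares off its peaks and injects odd harmonics into the amplitude spectrum, which is the kind of spurious energy that contaminates a dispersion image. The clip level and frequencies below are arbitrary illustrative choices:

```python
import numpy as np

def odd_harmonic_ratio(trace, fs, f0):
    """Ratio of spectral amplitude at the 3rd harmonic to the fundamental."""
    spec = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    fund = spec[np.argmin(np.abs(freqs - f0))]
    third = spec[np.argmin(np.abs(freqs - 3.0 * f0))]
    return third / fund

# A 20 Hz wave sampled at 1 kHz; clipping simulates a trace that
# exceeded the dynamic range of the geophone/seismograph.
fs, f0 = 1000.0, 20.0
t = np.arange(1000) / fs
wave = np.sin(2.0 * np.pi * f0 * t)
clipped = np.clip(wave, -0.5, 0.5)
```

For the clean sine the third-harmonic content is numerically negligible, while the clipped trace carries a substantial third harmonic, spurious energy that a dispersion-image picker could mistake for a real mode.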

  14. eL-Chem Viewer: A Freeware Package for the Analysis of Electroanalytical Data and Their Post-Acquisition Processing

    PubMed Central

    Hrbac, Jan; Halouzka, Vladimir; Trnkova, Libuse; Vacek, Jan

    2014-01-01

    In electrochemical sensing, a number of voltammetric or amperometric curves are obtained which are subsequently processed, typically by evaluating peak currents and peak potentials or wave heights and half-wave potentials, frequently after background correction. Transformations of voltammetric data can help to extract specific information, e.g., the number of transferred electrons, and can reveal aspects of the studied electrochemical system, e.g., the contribution of adsorption phenomena. In this communication, we introduce a LabView-based software package, ‘eL-Chem Viewer’, which is for the analysis of voltammetric and amperometric data, and enables their post-acquisition processing using semiderivative, semiintegral, derivative, integral and elimination procedures. The software supports the single-click transfer of peak/wave current and potential data to spreadsheet software, a feature that greatly improves productivity when constructing calibration curves, trumpet plots and performing similar tasks. eL-Chem Viewer is freeware and can be downloaded from www.lchem.cz/elchemviewer.htm. PMID:25090415
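The basic operation the package automates, evaluating peak current and peak potential after background correction, can be sketched generically. This is an illustrative Python sketch with a simple linear baseline, unrelated to the LabView implementation itself:

```python
import numpy as np

def peak_from_voltammogram(potential, current, baseline_idx):
    """Evaluate peak potential and peak current after linear background
    correction.

    baseline_idx: indices of points treated as background (e.g. the foot
    of the wave on both sides of the peak); a straight line is fitted
    through them and subtracted before the peak is located.
    """
    coeffs = np.polyfit(potential[baseline_idx], current[baseline_idx], 1)
    corrected = current - np.polyval(coeffs, potential)
    k = int(np.argmax(corrected))
    return potential[k], corrected[k]
```

With the corrected peak currents in hand, a calibration curve is just peak current versus concentration, the single-click spreadsheet transfer the paper highlights.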

  15. Personal computer process data acquisition

    SciTech Connect

    Dworjanyn, L.O.

    1989-01-01

    A simple BASIC program was written to permit personal-computer data collection of process temperatures, pressures, flows, and in-line analyzer outputs for a batch-type unit operation. The field voltage outputs were read on an IEEE programmable digital multimeter, using a programmable scanner to select different output lines. The data were stored in ASCII format to allow direct analysis by spreadsheet programs. 1 fig., 1 tab.

  16. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

    An event-driven, GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment, where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  17. Auditory Processing Disorder and Foreign Language Acquisition

    ERIC Educational Resources Information Center

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  18. Processability Theory and German Case Acquisition

    ERIC Educational Resources Information Center

    Baten, Kristof

    2011-01-01

    This article represents the first attempt to formulate a hypothetical sequence for German case acquisition by Dutch-speaking learners on the basis of Processability Theory (PT). It will be argued that case forms emerge corresponding to a development from lexical over phrasal to interphrasal morphemes. This development, however, is subject to a…

  19. The Effectiveness of Processing Instruction and Production-Based Instruction on L2 Grammar Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Shintani, Natsuko

    2015-01-01

    This article reports a meta-analysis of 42 experiments in 33 published studies involving processing instruction (PI) and production-based instruction (PB) used in the PI studies. The comparative effectiveness of PI and PB showed that although PI was more effective than PB for developing receptive knowledge, PB was just as effective as PI for…

  20. Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process

    NASA Technical Reports Server (NTRS)

    Guiltinan, J.

    1976-01-01

    Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.

  1. Major system acquisitions process (A-109)

    NASA Technical Reports Server (NTRS)

    Saric, C.

    1991-01-01

    The major system examined is a combination of elements (hardware, software, facilities, and services) that function together to produce the capabilities required to fulfill a mission need. The system acquisition process is a sequence of activities beginning with documentation of a mission need and ending with introduction of the major system into operational use or otherwise successful achievement of program objectives. It is concluded that the A-109 process makes sense and provides a systematic, integrated management approach, along with appropriate management-level involvement and innovative 'best ideas' from the private sector, in satisfying mission needs.

  2. Global Transcriptome Analysis Reveals Acclimation-Primed Processes Involved in the Acquisition of Desiccation Tolerance in Boea hygrometrica.

    PubMed

    Zhu, Yan; Wang, Bo; Phillips, Jonathan; Zhang, Zhen-Nan; Du, Hong; Xu, Tao; Huang, Lian-Cheng; Zhang, Xiao-Fei; Xu, Guang-Hui; Li, Wen-Long; Wang, Zhi; Wang, Ling; Liu, Yong-Xiu; Deng, Xin

    2015-07-01

    Boea hygrometrica resurrection plants require a period of acclimation by slow soil-drying in order to survive a subsequent period of rapid desiccation. The molecular basis of this observation was investigated by comparing gene expression profiles under different degrees of water deprivation. Transcripts were clustered according to the expression profiles in plants that were air-dried (rapid desiccation), soil-dried (gradual desiccation), rehydrated (acclimated) and air-dried after acclimation. Although phenotypically indistinguishable, it was shown by principal component analysis that the gene expression profiles in rehydrated, acclimated plants resemble those of desiccated plants more closely than those of hydrated acclimated plants. Enrichment analysis based on gene ontology was performed to deconvolute the processes that accompanied desiccation tolerance. Transcripts associated with autophagy and α-tocopherol accumulation were found to be activated in both air-dried, acclimated plants and soil-dried non-acclimated plants. Furthermore, transcripts associated with biosynthesis of ascorbic acid, cell wall catabolism, chaperone-assisted protein folding, respiration and macromolecule catabolism were activated and maintained during soil-drying and rehydration. Based on these findings, we hypothesize that activation of these processes leads to the establishment of an optimal physiological and cellular state that enables tolerance during rapid air-drying. Our study provides a novel insight into the transcriptional regulation of critical priming responses to enable survival following rapid dehydration in B. hygrometrica. PMID:25907569

  3. Commonality analysis as a knowledge acquisition problem

    NASA Technical Reports Server (NTRS)

    Yeager, Dorian P.

    1987-01-01

    Commonality analysis is a systematic attempt to reduce costs in a large scale engineering project by discontinuing development of certain components during the design phase. Each discontinued component is replaced by another component that has sufficient functionality to be considered an appropriate substitute. The replacement strategy is driven by economic considerations. The System Commonality Analysis Tool (SCAT) is based on an oversimplified model of the problem and incorporates no knowledge acquisition component. In fact, the process of arriving at a compromise between functionality and economy is quite complex, with many opportunities for the application of expert knowledge. Such knowledge is of two types: general knowledge expressible as heuristics or mathematical laws potentially applicable to any set of components, and specific knowledge about the way in which elements of a given set of components interrelate. Examples of both types of knowledge are presented, and a framework is proposed for integrating the knowledge into a more general and useable tool.

  4. 28. Perimeter acquisition radar building room #302, signal process and ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    28. Perimeter acquisition radar building room #302, signal process and analog receiver room - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  5. CAT-scan analysis in scientific drilling: effective routine data acquisition and processing of whole cores, split cores and u-channels

    NASA Astrophysics Data System (ADS)

    St-Onge, G.; Francus, P.; Labrie, J.; Beauvais, Q.; Velle, J. H.; Fortin, D.; Mix, A. C.; Jaeger, J. M.; Stoner, J. S.; Bahlburg, H.; Forwick, M.; Zolitschka, B.

    2014-12-01

    CAT-scan analysis of sediment cores provides a rapid, high-resolution and non-destructive method to visualise sedimentary structures and coring-induced artefacts, as well as to derive a continuous downcore CT-number profile primarily associated with changes in bulk density. Here, we will briefly overview how we now routinely use CAT-scan analysis for paleoenvironmental and sedimentological purposes. We will present some advances in data processing, as well as a few case studies from lacustrine and marine sedimentary sequences measured using whole cores, split cores or u-channels, in order to highlight the advantages and complementarity of CAT-scan measurements with other continuous downcore high-resolution physical or magnetic measurements. We will also illustrate how effective data acquisition and processing have enabled the use of CAT-scans for the continuous interpretation of long drilled sequences from IODP (Exp. 341 - Gulf of Alaska) and ICDP (PASADO - Laguna Potrok Aike, Southern Patagonia), previously hampered by the large number of core sections and derived images.

  6. Acquisition by Processing Theory: A Theory of Everything?

    ERIC Educational Resources Information Center

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  7. Guidelines for dynamic data acquisition and analysis

    NASA Technical Reports Server (NTRS)

    Piersol, Allan G.

    1992-01-01

    The recommendations concerning pyroshock data presented in the final draft of a proposed military handbook on Guidelines for Dynamic Data Acquisition and Analysis are reviewed. The structural responses produced by pyroshocks are considered to be one of the most difficult types of dynamic data to accurately measure and analyze.

  8. Guidelines for dynamic data acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Piersol, Allan G.

    1992-10-01

    The recommendations concerning pyroshock data presented in the final draft of a proposed military handbook on Guidelines for Dynamic Data Acquisition and Analysis are reviewed. The structural responses produced by pyroshocks are considered to be one of the most difficult types of dynamic data to accurately measure and analyze.

  9. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  10. Acquisition and analysis of accelerometer data

    NASA Technical Reports Server (NTRS)

    Verges, Keith R.

    1990-01-01

    Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10^-6 Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
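The distinction drawn above between power and amplitude spectra, and the role of windowing, can be sketched generically. This is an illustrative sketch, not the processing from the text; the window-gain correction assumes the tone falls on an FFT bin:

```python
import numpy as np

def amplitude_spectrum(x, fs):
    """One-sided amplitude spectrum of a real signal using a Hann window.

    Dividing by the window's coherent gain (its mean) makes a sinusoid of
    amplitude A appear with height ~A at its frequency.
    """
    n = len(x)
    w = np.hanning(n)
    spec = np.fft.rfft(x * w)
    amp = 2.0 * np.abs(spec) / (n * w.mean())
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, amp

def power_spectrum(x, fs):
    """One-sided power spectrum: a sine of amplitude A has power A^2/2."""
    freqs, amp = amplitude_spectrum(x, fs)
    return freqs, amp ** 2 / 2.0
```

The amplitude spectrum reads directly in signal units (e.g. g), while the power spectrum reads in units squared; confusing the two is a classic factor-of-A/2 error in acceleration data reduction.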

  11. Networks for image acquisition, processing and display

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1990-01-01

    The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.

  12. TARA control, data acquisition and analysis system

    SciTech Connect

    Gaudreau, M.P.J.; Sullivan, J.D.; Fredian, T.W.; Karcher, C.A.; Sevillano, E.; Stillerman, J.; Thomas, P.

    1983-12-01

    The MIT tandem mirror (TARA) control, data acquisition and analysis system consists of two major parts: (1) a Gould 584 industrial programmable controller (PC) to control engineering functions; and (2) a VAX 11/750 based data acquisition and analysis system for physics analysis. The PC is designed for use in harsh industrial environments and has proven to be a reliable and cost-effective means for automated control. The PC configuration is dedicated to control tasks on the TARA magnet, vacuum, RF, neutral beam, diagnostics, and utility systems. The data transfer functions are used to download system operating parameters from menu selectable tables. Real time status reports are provided to video terminals and as blocks of data to the host computer for storage. The data acquisition and analysis system for TARA was designed to provide high throughput and ready access to data from earlier runs. The adopted design uses pre-existing software packages in a system which is simple, coherent, fast, and which requires a minimum of new software development. The computer configuration is a VAX 11/750 running VMS with 124 M byte massbus disk and 1.4 G byte unibus disk subsystem.

  13. Second Language Vocabulary Acquisition: A Lexical Input Processing Approach

    ERIC Educational Resources Information Center

    Barcroft, Joe

    2004-01-01

    This article discusses the importance of vocabulary in second language acquisition (SLA), presents an overview of major strands of research on vocabulary acquisition, and discusses five principles for effective second language (L2) vocabulary instruction based on research findings on lexical input processing. These principles emphasize…

  14. System of acquisition and processing of images of dynamic speckle

    NASA Astrophysics Data System (ADS)

Vega, F.; Torres, C.

    2015-01-01

In this paper we show the design and implementation of a system for the capture and analysis of dynamic speckle. The device consists of a USB camera, a light-isolated enclosure for imaging, a 633 nm, 10 mW laser pointer as the coherent light source, a diffuser, and a laptop for video processing. The equipment enables the acquisition and storage of video and computes different statistical descriptors (global activity accumulation vector, activity accumulation matrix, cross-correlation vector, autocorrelation coefficient, Fujii matrix, etc.). The equipment is designed so that it can be taken directly to the site of the biological sample under study, and it is currently being used in research projects within the group.
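Two of the descriptors named in this abstract, the Fujii matrix and the accumulated-activity (generalized-differences) matrix, have standard definitions and can be sketched as below, assuming the video has already been loaded as a NumPy stack of grayscale frames. The function names and synthetic data are illustrative, not from the paper.

```python
import numpy as np

def fujii(stack, eps=1e-9):
    """Fujii matrix of a dynamic-speckle stack (frames, rows, cols):
    frame-to-frame absolute differences, normalized by the local
    intensity of each consecutive pair, accumulated over time."""
    stack = stack.astype(float)
    num = np.abs(np.diff(stack, axis=0))
    den = stack[:-1] + stack[1:] + eps
    return (num / den).sum(axis=0)

def generalized_differences(stack):
    """Accumulated activity: sum of |I_i - I_j| over all frame pairs."""
    stack = stack.astype(float)
    n = stack.shape[0]
    gd = np.zeros(stack.shape[1:])
    for i in range(n):
        for j in range(i + 1, n):
            gd += np.abs(stack[i] - stack[j])
    return gd

# A static region accumulates ~0 activity; a fluctuating one lights up
frames = np.random.rand(16, 32, 32)
frames[:, :16, :] = 0.5            # make the top half static
activity = fujii(frames)
```

High values in the resulting map mark regions of biological activity, which is what such a portable system would inspect in the field.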

  15. ISON Data Acquisition and Analysis Software

    NASA Astrophysics Data System (ADS)

    Kouprianov, Vladimir

    2013-08-01

Since the first days of the ISON project, its success has been strongly based on advanced data analysis techniques and their implementation in software. Space debris studies and optical space surveillance are unique from the point of view of observation techniques and thus impose extremely specific requirements on sensor design and control and on initial data analysis, dictated mostly by the fast apparent motion of the space objects being studied. From the point of view of data acquisition and analysis software, this implies support for sophisticated scheduling, complex tracking, accurate timing, large fields of view, and undersampled CCD images with trailed sources. Here we present the historical outline, major goals, and design concepts of the standard ISON data acquisition and analysis packages, and how they meet these requirements. Among these packages, the most important are the CHAOS telescope control system (TCS), its recent successor FORTE, and Apex II, a platform for astronomical image analysis with a focus on high-precision astrometry and photometry of fast-moving objects and transient phenomena. Development of these packages is supported by ISON, and they are now responsible for most of the raw data produced by the network. They are installed on nearly all sensors and are available to all participants of the ISON collaboration.

  16. Isothermal thermogravimetric data acquisition analysis system

    NASA Technical Reports Server (NTRS)

    Cooper, Kenneth, Jr.

    1991-01-01

    The description of an Isothermal Thermogravimetric Analysis (TGA) Data Acquisition System is presented. The system consists of software and hardware to perform a wide variety of TGA experiments. The software is written in ANSI C using Borland's Turbo C++. The hardware consists of a 486/25 MHz machine with a Capital Equipment Corp. IEEE488 interface card. The interface is to a Hewlett Packard 3497A data acquisition system using two analog input cards and a digital actuator card. The system provides for 16 TGA rigs with weight and temperature measurements from each rig. Data collection is conducted in three phases. Acquisition is done at a rapid rate during initial startup, at a slower rate during extended data collection periods, and finally at a fast rate during shutdown. Parameters controlling the rate and duration of each phase are user programmable. Furnace control (raising and lowering) is also programmable. Provision is made for automatic restart in the event of power failure or other abnormal terminations. Initial trial runs were conducted to show system stability.
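The three-phase collection schedule this abstract describes (rapid sampling during startup, a slower rate during extended collection, rapid again during shutdown, with user-programmable rates and durations) can be modeled as a small data structure. The intervals and durations below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Phase:
    name: str
    interval_s: float   # time between samples within the phase
    duration_s: float   # how long the phase lasts

def sample_times(phases):
    """Expand a phased schedule into absolute sample timestamps,
    mirroring the fast/slow/fast collection described above."""
    times, t = [], 0.0
    for ph in phases:
        n = int(ph.duration_s / ph.interval_s)
        times.extend(t + i * ph.interval_s for i in range(n))
        t += ph.duration_s
    return times

schedule = [
    Phase("startup", interval_s=1.0, duration_s=10.0),    # rapid rate
    Phase("steady", interval_s=60.0, duration_s=600.0),   # slower rate
    Phase("shutdown", interval_s=1.0, duration_s=10.0),   # rapid rate
]
ts = sample_times(schedule)
```

Keeping the schedule as data rather than code is what makes the rates and durations user-programmable, as the system described above provides.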

  17. Towards a Platform for Image Acquisition and Processing on RASTA

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele

    2013-08-01

    This paper presents the architecture of a platform for image acquisition and processing based on commercial hardware and space qualified hardware. The aim is to extend the Reference Architecture Test-bed for Avionics (RASTA) system in order to obtain a Test-bed that allows testing different hardware and software solutions in the field of image acquisition and processing. The platform will allow the integration of space qualified hardware and Commercial Off The Shelf (COTS) hardware in order to test different architectural configurations. The first implementation is being performed on a low cost commercial board and on the GR712RC board based on the Dual Core Leon3 fault tolerant processor. The platform will include an actuation module with the aim of implementing a complete pipeline from image acquisition to actuation, making possible the simulation of a real case scenario involving acquisition and actuation.

  18. Auditory Processing Disorders: Acquisition and Treatment

    ERIC Educational Resources Information Center

    Moore, David R.

    2007-01-01

    Auditory processing disorder (APD) describes a mixed and poorly understood listening problem characterised by poor speech perception, especially in challenging environments. APD may include an inherited component, and this may be major, but studies reviewed here of children with long-term otitis media with effusion (OME) provide strong evidence…

  19. 75 FR 62069 - Federal Acquisition Regulation; Sudan Waiver Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-07

    ... Federal Acquisition Regulation; Sudan Waiver Process AGENCIES: Department of Defense (DoD), General... criteria that an agency must address in a waiver request and a waiver consultation process regarding... Operations in Sudan and Imports from Burma, in the Federal Register at 74 FR 40463 on August 11,...

  20. Optical image acquisition system for colony analysis

    NASA Astrophysics Data System (ADS)

    Wang, Weixing; Jin, Wenbiao

    2006-02-01

Counting of colonies and plaques has a large number of applications, including food, dairy, beverages, hygiene, environmental monitoring, water, toxicology, sterility testing, AMES testing, pharmaceuticals, paints, sterile fluids, and fungal contamination. Recently, many researchers and developers have worked on systems of this kind. Our investigation shows that, as products of a new technology, some existing systems still have problems, one of the main ones being image acquisition. To acquire colony images of good quality, an illumination box was constructed that includes front lighting and back lighting, selectable by the user based on the properties of the colony dishes. With the illumination box, lighting is uniform, and the colony dish can be placed in the same position every time, which simplifies image processing. A digital camera at the top of the box is connected to a PC with a USB cable, and all camera functions are controlled by the computer.

  1. Reading Acquisition Enhances an Early Visual Process of Contour Integration

    ERIC Educational Resources Information Center

    Szwed, Marcin; Ventura, Paulo; Querido, Luis; Cohen, Laurent; Dehaene, Stanislas

    2012-01-01

    The acquisition of reading has an extensive impact on the developing brain and leads to enhanced abilities in phonological processing and visual letter perception. Could this expertise also extend to early visual abilities outside the reading domain? Here we studied the performance of illiterate, ex-illiterate and literate adults closely matched…

  2. Low Cost Coherent Doppler Lidar Data Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Barnes, Bruce W.; Koch, Grady J.

    2003-01-01

    The work described in this paper details the development of a low-cost, short-development time data acquisition and processing system for a coherent Doppler lidar. This was done using common laboratory equipment and a small software investment. This system provides near real-time wind profile measurements. Coding flexibility created a very useful test bed for new techniques.

  3. Understanding the knowledge acquisition process about Earth and Space concepts

    NASA Astrophysics Data System (ADS)

    Frappart, Soren

There exist two main theoretical views concerning the knowledge acquisition process in science, and those views are still debated in the literature. On the one hand, knowledge is considered to be organized into coherent wholes (mental models). On the other hand, knowledge is described as fragmented sets with no links between the fragments. Mental models have predictive and explanatory power and are constrained by universal presuppositions. They follow a universal gradual development in three steps, from initial to synthetic to scientific models. The fragments, on the contrary, are not organised, and development is seen as a situated process in which cultural transmission plays a fundamental role. After presenting these two theoretical positions, we will illustrate them with examples of studies related to the Earth's shape and gravity performed in different cultural contexts, in order to highlight both the differences and the invariant cultural elements. We will show why these questions are important to take into account for space concepts such as gravity, orbits, and weightlessness. Indeed, capturing the processes of acquisition and development of knowledge concerning specific space concepts can give us important information for developing relevant and adapted strategies for instruction. If the process of knowledge acquisition for space concepts is fragmented, then we have to think about how to identify those fragments and help the learner build links between them. If knowledge is organised into coherent mental models, we have to think about how to destabilize a non-relevant model and prevent the development of initial and synthetic models. Moreover, the question of what is universal versus what is culture-dependent in this acquisition process needs to be explored. We will also present some common misconceptions about space concepts.
Indeed, in addition to the previous theoretical considerations, the collection and awareness of

  4. FPGA Based Data Acquisition and Processing for Gamma Ray Tomography

    NASA Astrophysics Data System (ADS)

    Schlaberg, H. Inaki; Li, Donghui; Wu, Yingxiang; Wang, Mi

    2007-06-01

Data acquisition and processing for gamma ray tomography has traditionally been performed with analogue electronic circuitry. Detectors convert the received photons into electrical signals, which are then shaped and conditioned for the next counting stage. An approach using an FPGA (field programmable gate array) based data acquisition and processing system for gamma ray tomography is presented in this paper. With recently introduced low-cost, high-speed analogue-to-digital converters and digital signal processors, the electrical output of the detectors can be converted into the digital domain with only simple analogue signal conditioning. This step can significantly reduce the component count and the size of the instrument, as much of the analogue processing circuitry is eliminated. To count the number of incident photons from the converted electrical signal, a peak detection algorithm can be developed for the DSP (digital signal processor). However, the relatively high sample rate, and the consequently low number of processor cycles available to process each sample, make it more effective to implement the peak detection algorithm on the FPGA. This paper presents the development of the acquisition system hardware and simulation results of the peak detection with previously recorded experimental data from a flow loop.
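A minimal software stand-in for the kind of peak-detection logic discussed above: count photon events as threshold-crossing local maxima in the digitized detector samples. The trace, threshold, and pulse shape below are synthetic assumptions, not taken from the paper, where the algorithm runs on an FPGA rather than in Python.

```python
import numpy as np

def count_peaks(samples, threshold):
    """Count events as local maxima above a fixed threshold: a sample
    that exceeds the threshold, is >= its left neighbor, and is > its
    right neighbor (the tie-break prevents double counting of plateaus)."""
    s = np.asarray(samples, dtype=float)
    above = s[1:-1] > threshold
    local_max = (s[1:-1] >= s[:-2]) & (s[1:-1] > s[2:])
    return int(np.count_nonzero(above & local_max))

# Synthetic detector trace: baseline noise with three injected pulses
rng = np.random.default_rng(0)
trace = 0.05 * rng.standard_normal(1000)
for pos in (200, 500, 800):
    trace[pos - 2:pos + 3] += np.array([0.2, 0.7, 1.0, 0.7, 0.2])
n = count_peaks(trace, threshold=0.5)
```

Because each sample needs only comparisons against its neighbors and a fixed threshold, the logic maps naturally onto FPGA fabric at full ADC rate, which is the motivation given in the abstract.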

  5. Multi-Channel Data Acquisition System for Nuclear Pulse Processing

    SciTech Connect

    Myjak, Mitchell J.; Ma, Ding; Robinson, Dirk J.; La Rue, George S.

    2009-11-13

    We have developed a compact, inexpensive electronics package that can digitize pulse-mode or current-mode data from 32 detector outputs in parallel. The electronics package consists of two circuit boards: a custom acquisition board and an off-the-shelf processing board. The acquisition board features a custom-designed integrated circuit that contains an array of charge-to-pulse-width converters. The processing board contains a field programmable gate array that digitizes the pulse widths, performs event discrimination, constructs energy histograms, and executes any user-defined software. Together, the two boards cost around $1000. The module can transfer data to a computer or operate entirely as a standalone system. The design achieves 0.20% nonlinearity and 0.18% FWHM precision at full scale. However, the overall performance could be improved with some modifications to the integrated circuit.

  6. Advances in GPR data acquisition and analysis for archaeology

    NASA Astrophysics Data System (ADS)

    Zhao, Wenke; Tian, Gang; Forte, Emanuele; Pipan, Michele; Wang, Yimin; Li, Xuejing; Shi, Zhanjie; Liu, Haiyan

    2015-07-01

The primary objective of this study is to evaluate the applicability and the effectiveness of ground-penetrating radar (GPR) to identify a thin burnt soil layer, buried more than 2 m below the topographic surface at the Liangzhu Site, in Southeastern China. The site was chosen for its conditions, relatively challenging for GPR techniques due to electrical conductivity and to the presence of peach tree roots that produced scattering. We completed the data acquisition by using 100 and 200 MHz antennas in TE and TM broadside and cross-polarized configurations. In the data processing and interpretation phase, we used GPR attribute analysis, including instantaneous phase and geometrical attributes. Ground-truthing performed after the geophysical surveys validated the GPR imaging, confirmed the electrical conductivity and relative dielectric permittivity (RDP) measurements performed at different depths, and allowed a reliable quantitative correlation between GPR results and subsurface physical properties. The research demonstrates that multiple antenna configurations in GPR data acquisition combined with attribute analysis can enhance the ability to characterize prehistoric archaeological remains even in complex subsurface conditions.
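The instantaneous-phase attribute mentioned above is a standard complex-trace attribute computed from the analytic signal of each radar trace. A minimal sketch with SciPy follows; the sampling rate and the synthetic Gaussian-windowed wavelet are illustrative, not data from the survey.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_attributes(trace):
    """Complex-trace attributes used in GPR attribute analysis:
    envelope (instantaneous amplitude) and instantaneous phase,
    both derived from the analytic signal."""
    analytic = hilbert(trace)
    envelope = np.abs(analytic)
    phase = np.angle(analytic)
    return envelope, phase

fs = 1e9                                   # 1 GS/s sampling, illustrative
t = np.arange(0, 200e-9, 1 / fs)
# Synthetic trace: 100 MHz wavelet under a Gaussian envelope at 100 ns
trace = np.exp(-((t - 100e-9) / 20e-9) ** 2) * np.cos(2 * np.pi * 1e8 * t)
env, phase = instantaneous_attributes(trace)
```

The phase attribute is amplitude-independent, which is why it can trace weak but continuous reflectors such as a thin burnt layer under conductive soils.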

  7. Mosaic acquisition and processing for optical-resolution photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Shao, Peng; Shi, Wei; Chee, Ryan K. W.; Zemp, Roger J.

    2012-08-01

In optical-resolution photoacoustic microscopy (OR-PAM), data acquisition time is limited by both laser pulse repetition rate (PRR) and scanning speed. Optical scanning offers high speed but a limited field of view, determined by ultrasound transducer sensitivity. In this paper, we propose a hybrid optical and mechanical-scanning OR-PAM system with mosaic data acquisition and processing. The system employs fast-scanning mirrors and a diode-pumped, nanosecond-pulsed, ytterbium-doped, 532-nm fiber laser with a PRR up to 600 kHz. Data from a sequence of image mosaic patches is acquired systematically, at predetermined mechanical scanning locations, with optical scanning. After all imaging locations are covered, a large panoramic scene is generated by stitching the mosaic patches together. Our proposed system is shown to be at least 20 times faster than previously reported OR-PAM systems.
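A toy version of the mosaic assembly step described above, assuming patch positions are known from the mechanical-scan coordinates. Real stitching would also register the patches and blend seams; this sketch simply places patches on a canvas and averages overlaps. All sizes and values are invented for illustration.

```python
import numpy as np

def stitch(patches, offsets, shape):
    """Place each patch at its known stage offset (row, col) on a
    panoramic canvas, averaging intensities where patches overlap."""
    canvas = np.zeros(shape)
    weight = np.zeros(shape)
    for patch, (r, c) in zip(patches, offsets):
        h, w = patch.shape
        canvas[r:r + h, c:c + w] += patch
        weight[r:r + h, c:c + w] += 1
    weight[weight == 0] = 1          # avoid division by zero off-patch
    return canvas / weight

# Four 64x64 patches on a 2x2 grid with a 16-pixel overlap
patches = [np.full((64, 64), v) for v in (1.0, 2.0, 3.0, 4.0)]
offsets = [(0, 0), (0, 48), (48, 0), (48, 48)]
mosaic = stitch(patches, offsets, (112, 112))
```

Acquiring each patch with fast optical scanning and only moving the stage between patches is what lets the hybrid scheme keep the speed of optical scanning over a panoramic field of view.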

  8. The acquisition of integrated science process skills in a web-based learning environment

    NASA Astrophysics Data System (ADS)

    Saat, Rohaida Mohd.

    2004-01-01

Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among grade 5 children. Data were gathered primarily from children's conversations and teacher-student conversations. Analysis of the data revealed that the children acquired the skill in three phases: from the phase of recognition to the phase of familiarization and finally to the phase of automation. Nevertheless, the acquisition of the skill only involved the acquisition of certain subskills of the skill of controlling variables. This progression could be influenced by the web-based instructional material that provided declarative knowledge, concrete visualization and opportunities for practice.

  9. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price. DATES....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use...

  10. NOVA-NREL Optimal Vehicle Acquisition Analysis (Brochure)

    SciTech Connect

    Blakley, H.

    2011-03-01

Federal fleet managers face unique challenges in accomplishing their mission - meeting agency transportation needs while complying with Federal goals and mandates. Included in these challenges are a variety of statutory requirements, executive orders, and internal goals and objectives that typically focus on petroleum consumption and greenhouse gas (GHG) emissions reductions, alternative fuel vehicle (AFV) acquisitions, and alternative fuel use increases. Given the large number of mandates affecting Federal fleets and the challenges faced by all fleet managers in executing day-to-day operations, a primary challenge for agencies and other organizations is ensuring that they are as efficient as possible in using constrained fleet budgets. An NREL Optimal Vehicle Acquisition (NOVA) analysis makes use of a mathematical model with a variety of fleet-related data to create an optimal vehicle acquisition strategy for a given goal, such as petroleum or GHG reduction. The analysis can help fleets develop a vehicle acquisition strategy that maximizes petroleum and greenhouse gas reductions.

  11. The analysis of image acquisition in LabVIEW

    NASA Astrophysics Data System (ADS)

    Xu, Wuni; Zhong, Lanxiang

    2011-06-01

In this paper, four methods of image acquisition in LabVIEW are described, and their realization principles and procedures, in combination with different hardware architectures, are illustrated in the virtual instrument laboratory. Experimental results show that the methods of image acquisition in LabVIEW have many advantages over VB and C++, such as easier configuration, lower complexity, and stronger practicability. The methods are thus better suited to lay the foundation for research in image processing, machine vision, and pattern recognition.

  12. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, and they are characterized on the basis of their disparate time scales. Signal coding, a brief description of frequently used codes, and their limitations are then discussed, and finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.
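One of the autocovariance-based estimators alluded to above is the classic pulse-pair algorithm, which recovers echo power and mean Doppler velocity from the lag-1 autocovariance of the I/Q sample sequence. The sketch below runs it on simulated data; the wavelength, pulse repetition time, sign convention, and velocity are illustrative assumptions.

```python
import numpy as np

def pulse_pair(iq, prt, wavelength):
    """Lag-1 autocovariance (pulse-pair) estimates of echo power and
    mean Doppler velocity from a complex I/Q sample sequence."""
    power = float(np.mean(np.abs(iq) ** 2))
    r1 = np.mean(iq[1:] * np.conj(iq[:-1]))
    velocity = -wavelength / (4 * np.pi * prt) * float(np.angle(r1))
    return power, velocity

# Simulated echo: a Doppler-shifted complex exponential plus weak noise
rng = np.random.default_rng(1)
prt, wavelength = 1e-3, 6.0           # 1 ms PRT; 6 m wavelength (50 MHz)
v_true = 5.0                          # m/s radial velocity to recover
f_d = -2 * v_true / wavelength        # Doppler shift under this convention
n = np.arange(512)
iq = (np.exp(2j * np.pi * f_d * n * prt)
      + 0.001 * (rng.standard_normal(512) + 1j * rng.standard_normal(512)))
p, v = pulse_pair(iq, prt, wavelength)
```

Because it needs only one complex multiply-accumulate per pulse, the pulse-pair estimator suits the tight real-time budgets these radars operate under.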

  13. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... published a proposed rule in the Federal Register at 77 FR 40552 on July 10, 2012, to clarify and pinpoint a... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a...

  14. Development of data acquisition and analysis software for multichannel detectors

    SciTech Connect

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With the help of outstanding graphics capabilities, easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and the efficiency of data analysis. 2 refs., 6 figs.

  15. Mobile digital data acquisition and recording system for geoenergy process monitoring and control

    SciTech Connect

    Kimball, K B; Ogden, H C

    1980-12-01

    Three mobile, general purpose data acquisition and recording systems have been built to support geoenergy field experiments. These systems were designed to record and display information from large assortments of sensors used to monitor in-situ combustion recovery or similar experiments. They provide experimenters and operations personnel with easy access to current and past data for evaluation and control of the process, and provide permanent recordings for subsequent detailed analysis. The configurations of these systems and their current capabilities are briefly described.

  16. MS1 Peptide Ion Intensity Chromatograms in MS2 (SWATH) Data Independent Acquisitions. Improving Post Acquisition Analysis of Proteomic Experiments*

    PubMed Central

    Rardin, Matthew J.; Schilling, Birgit; Cheng, Lin-Yang; MacLean, Brendan X.; Sorensen, Dylan J.; Sahu, Alexandria K.; MacCoss, Michael J.; Vitek, Olga; Gibson, Bradford W.

    2015-01-01

    Quantitative analysis of discovery-based proteomic workflows now relies on high-throughput large-scale methods for identification and quantitation of proteins and post-translational modifications. Advancements in label-free quantitative techniques, using either data-dependent or data-independent mass spectrometric acquisitions, have coincided with improved instrumentation featuring greater precision, increased mass accuracy, and faster scan speeds. We recently reported on a new quantitative method called MS1 Filtering (Schilling et al. (2012) Mol. Cell. Proteomics 11, 202–214) for processing data-independent MS1 ion intensity chromatograms from peptide analytes using the Skyline software platform. In contrast, data-independent acquisitions from MS2 scans, or SWATH, can quantify all fragment ion intensities when reference spectra are available. As each SWATH acquisition cycle typically contains an MS1 scan, these two independent label-free quantitative approaches can be acquired in a single experiment. Here, we have expanded the capability of Skyline to extract both MS1 and MS2 ion intensity chromatograms from a single SWATH data-independent acquisition in an Integrated Dual Scan Analysis approach. The performance of both MS1 and MS2 data was examined in simple and complex samples using standard concentration curves. Cases of interferences in MS1 and MS2 ion intensity data were assessed, as were the differentiation and quantitation of phosphopeptide isomers in MS2 scan data. In addition, we demonstrated an approach for optimization of SWATH m/z window sizes to reduce interferences using MS1 scans as a guide. Finally, a correlation analysis was performed on both MS1 and MS2 ion intensity data obtained from SWATH acquisitions on a complex mixture using a linear model that automatically removes signals containing interferences. This work demonstrates the practical advantages of properly acquiring and processing MS1 precursor data in addition to MS2 fragment ion

  17. Optical nanoscopy: from acquisition to analysis.

    PubMed

    Gould, Travis J; Hess, Samuel T; Bewersdorf, Joerg

    2012-01-01

    Recent advances in far-field microscopy have demonstrated that fluorescence imaging is possible at resolutions well below the long-standing diffraction limit. By exploiting photophysical properties of fluorescent probe molecules, this new class of methods yields a resolving power that is fundamentally diffraction unlimited. Although these methods are becoming more widely used in biological imaging, they must be complemented by suitable data analysis approaches if their potential is to be fully realized. Here we review the basic principles of diffraction-unlimited microscopy and how these principles influence the selection of available algorithms for data analysis. Furthermore, we provide an overview of existing analysis strategies and discuss their application. PMID:22559319

  18. KAM (Knowledge Acquisition Module): A tool to simplify the knowledge acquisition process

    NASA Technical Reports Server (NTRS)

    Gettig, Gary A.

    1988-01-01

    Analysts, knowledge engineers and information specialists are faced with increasing volumes of time-sensitive data in text form, either as free text or highly structured text records. Rapid access to the relevant data in these sources is essential. However, due to the volume and organization of the contents, and limitations of human memory and association, frequently: (1) important information is not located in time; (2) reams of irrelevant data are searched; and (3) interesting or critical associations are missed due to physical or temporal gaps involved in working with large files. The Knowledge Acquisition Module (KAM) is a microcomputer-based expert system designed to assist knowledge engineers, analysts, and other specialists in extracting useful knowledge from large volumes of digitized text and text-based files. KAM formulates non-explicit, ambiguous, or vague relations, rules, and facts into a manageable and consistent formal code. A library of system rules or heuristics is maintained to control the extraction of rules, relations, assertions, and other patterns from the text. These heuristics can be added, deleted or customized by the user. The user can further control the extraction process with optional topic specifications. This allows the user to cluster extracts based on specific topics. Because KAM formalizes diverse knowledge, it can be used by a variety of expert systems and automated reasoning applications. KAM can also perform important roles in computer-assisted training and skill development. Current research efforts include the applicability of neural networks to aid in the extraction process and the conversion of these extracts into standard formats.

  19. Data acquisition and analysis for the energy-subtraction Compton scatter camera for medical imaging

    NASA Astrophysics Data System (ADS)

    Khamzin, Murat Kamilevich

In response to the shortcomings of the Anger camera currently used in conventional SPECT, particularly the trade-off between sensitivity and spatial resolution, a novel energy-subtraction Compton scatter camera, or ESCSC, has been proposed. A successful clinical implementation of the ESCSC could revolutionize the field of SPECT. Features of this camera include utilization of silicon and CdZnTe detectors in primary and secondary detector systems, list-mode time stamping data acquisition, modular architecture, and post-acquisition data analysis. Previous ESCSC studies were based on Monte Carlo modeling. The objective of this work is to test the theoretical framework developed in previous studies by developing the data acquisition and analysis techniques necessary to implement the ESCSC. The camera model working in list-mode with time stamping was successfully built and tested, confirming the potential of the ESCSC predicted in previous simulation studies. The obtained data were processed during the post-acquisition data analysis based on preferred event selection criteria. Along with the construction of a camera model and proving the approach, the post-acquisition data analysis was further extended to include preferred event weighting based on the likelihood of a preferred event to be a true preferred event. While formulated to show ESCSC capabilities, the results of this study are important for any Compton scatter camera implementation as well as for coincidence data acquisition systems in general.

  20. The Logical Syntax of Number Words: Theory, Acquisition and Processing

    ERIC Educational Resources Information Center

    Musolino, Julien

    2009-01-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. "Cognition 93", 1-41; Papafragou, A., Musolino, J. (2003). Scalar implicatures: Scalar…

  1. The Wayside Mechanic: An Analysis of Skill Acquisition in Ghana.

    ERIC Educational Resources Information Center

    McLaughlin, Stephen Douglas

    This study describes and analyzes the nature of the skill acquisition process in one indigenous, informal training system--the apprenticeship of the wayside mechanic workshops in Koforidua, Ghana. Chapter 2 places apprenticeship training in the wider context of artisanship and training. It traces the history of the West African craft shop and its…

  2. Multibeam Sonar Backscatter Data Acquisition and Processing: Guidelines and Recommendations from the GEOHAB Backscatter Working Group

    NASA Astrophysics Data System (ADS)

    Heffron, E.; Lurton, X.; Lamarche, G.; Brown, C.; Lucieer, V.; Rice, G.; Schimel, A.; Weber, T.

    2015-12-01

    Backscatter data acquired with multibeam sonars are now commonly used for remote geological interpretation of the seabed. The systems' hardware, software, and processing methods and tools have grown in number and improved over the years, yet many issues linger: there are no standard procedures for acquisition, calibration is poor or absent, and processing methods remain only partially understood and documented. A workshop organized at the 2013 annual meeting of GeoHab (a community of geoscientists and biologists working on marine habitat mapping) was dedicated to seafloor backscatter data from multibeam sonars and concluded that there was an overwhelming need for better coherence and agreement on the acquisition, processing and interpretation of these data. The GeoHab Backscatter Working Group (BSWG) was subsequently created to document and synthesize the state of the art in sensors and techniques available today and to propose best-practice methods for the acquisition and processing of backscatter data. Two years later, the resulting document, "Backscatter measurements by seafloor-mapping sonars: Guidelines and Recommendations", was completed [1]. The document provides: an introduction to backscatter measurements by seafloor-mapping sonars; a background on the physical principles of sonar backscatter; a discussion of users' needs from a wide spectrum of community end-users; a review of backscatter measurement; an analysis of best practices in data acquisition; a review of data processing principles with details on present software implementations; and, finally, a synthesis and key recommendations. This presentation reviews the BSWG mandate, structure, and the development of this document. It details the chapter contents, the recommendations to sonar manufacturers, operators, data-processing software developers and end-users, and the implications for the marine geology community.
[1] Downloadable at https://www.niwa.co.nz/coasts-and-oceans/research-projects/backscatter-measurement-guidelines

  3. FABIA: factor analysis for bicluster acquisition

    PubMed Central

    Hochreiter, Sepp; Bodenhofer, Ulrich; Heusel, Martin; Mayr, Andreas; Mitterecker, Andreas; Kasim, Adetayo; Khamiakova, Tatsiana; Van Sanden, Suzy; Lin, Dan; Talloen, Willem; Bijnens, Luc; Göhlmann, Hinrich W. H.; Shkedy, Ziv; Clevert, Djork-Arné

    2010-01-01

    Motivation: Biclustering of transcriptomic data groups genes and samples simultaneously. It is emerging as a standard tool for extracting knowledge from gene expression measurements. We propose a novel generative approach for biclustering called ‘FABIA: Factor Analysis for Bicluster Acquisition’. FABIA is based on a multiplicative model, which accounts for linear dependencies between gene expression and conditions, and also captures the heavy-tailed distributions observed in real-world transcriptomic data. The generative framework allows well-founded model selection methods to be used and Bayesian techniques to be applied. Results: On 100 simulated datasets with known, artificially implanted biclusters, FABIA clearly outperformed all 11 competitors. On these datasets, FABIA was able to separate spurious biclusters from true biclusters by ranking biclusters according to their information content. FABIA was also tested on three microarray datasets with known subclusters, where it was twice the best and once the second-best of the compared biclustering approaches. Availability: FABIA is available as an R package on Bioconductor (http://www.bioconductor.org). All datasets, results and software are available at http://www.bioinf.jku.at/software/fabia/fabia.html. Contact: hochreit@bioinf.jku.at. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20418340
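The multiplicative model at the heart of FABIA can be sketched as a sum of sparse outer products plus noise. The sizes, sparsity levels, and noise scale below are illustrative choices, not FABIA's defaults:

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_samples, k = 100, 40, 2  # illustrative dimensions

# Sparse, heavy-tailed (Laplacian) factors: lam over genes, z over samples
lam = rng.laplace(size=(n_genes, k)) * (rng.random((n_genes, k)) < 0.1)
z = rng.laplace(size=(k, n_samples)) * (rng.random((k, n_samples)) < 0.3)

# Each bicluster j is the outer product lam[:, j] z[j, :]; the data matrix
# is their sum plus Gaussian noise: X = sum_j lam_j z_j^T + eps
X = lam @ z + 0.1 * rng.standard_normal((n_genes, n_samples))
```

Rows and columns where both factors are nonzero form the implanted bicluster; inference in FABIA recovers such factors from X alone.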

  4. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND... not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  5. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND... not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  6. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, J.R.

    1997-02-11

    A method and apparatus are disclosed for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register. 15 figs.

  7. Method and apparatus for high speed data acquisition and processing

    DOEpatents

    Ferron, John R.

    1997-01-01

    A method and apparatus for high speed digital data acquisition. The apparatus includes one or more multiplexers for receiving multiple channels of digital data at a low data rate and asserting a multiplexed data stream at a high data rate, and one or more FIFO memories for receiving data from the multiplexers and asserting the data to a real time processor. Preferably, the invention includes two multiplexers, two FIFO memories, and a 64-bit bus connecting the FIFO memories with the processor. Each multiplexer receives four channels of 14-bit digital data at a rate of up to 5 MHz per channel, and outputs a data stream to one of the FIFO memories at a rate of 20 MHz. The FIFO memories assert output data in parallel to the 64-bit bus, thus transferring 14-bit data values to the processor at a combined rate of 40 MHz. The real time processor is preferably a floating-point processor which processes 32-bit floating-point words. A set of mask bits is prestored in each 32-bit storage location of the processor memory into which a 14-bit data value is to be written. After data transfer from the FIFO memories, mask bits are concatenated with each stored 14-bit data value to define a valid 32-bit floating-point word. Preferably, a user can select any of several modes for starting and stopping direct memory transfers of data from the FIFO memories to memory within the real time processor, by setting the content of a control and status register.
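The patent's mask-bit scheme, prestored bits that turn a raw 14-bit sample into a valid 32-bit floating-point word, resembles the classic IEEE-754 exponent-biasing trick. The sketch below illustrates that general trick, not the patent's exact bit layout:

```python
import struct

# Prestored "mask" bits: sign and exponent encoding 2.0**23, mantissa zeroed.
# OR-ing an integer 0 <= x < 2**23 into the mantissa yields the float 2**23 + x,
# so the processor sees a valid 32-bit float with no conversion instruction.
MASK = 0x4B000000

def to_float_word(sample14: int) -> float:
    word = MASK | (sample14 & 0x3FFF)  # concatenate mask bits with 14-bit data
    return struct.unpack(">f", struct.pack(">I", word))[0]

def from_float_word(f: float) -> int:
    return int(f - 2.0**23)            # recover the raw sample value

roundtrip = from_float_word(to_float_word(1234))
```

A floating-point DSP can then subtract the constant 2**23 in one operation per sample instead of performing integer-to-float conversion.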

  8. A knowledge acquisition process to analyse operational problems in solid waste management facilities.

    PubMed

    Dokas, Ioannis M; Panagiotakopoulos, Demetrios C

    2006-08-01

    The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, and few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination in SWM is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how the various events are related to one another and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was applied to the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the likelihood of operational problems, provides advice and suggests solutions. PMID:16941992
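Fault-tree logic of the kind described combines basic-event probabilities through AND/OR gates. A minimal sketch, in which the event names and probabilities are hypothetical, not taken from the paper's 24 landfill problems:

```python
# Minimal fault-tree gate evaluation over independent basic-event probabilities.
def and_gate(probs):
    p = 1.0
    for q in probs:
        p *= q                     # all inputs must occur
    return p

def or_gate(probs):
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)        # probability that no input occurs
    return 1.0 - p_none

# Hypothetical top event: leachate overflow occurs if
# (heavy rain AND pump failure) OR liner tear.
p_top = or_gate([and_gate([0.3, 0.1]), 0.02])
```

Chaining such gates from basic events up to a top event is exactly how a KE's causal chains can be encoded for an expert system to evaluate.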

  9. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  10. 77 FR 2682 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-19

    ... Regulation Supplement; DoD Voucher Processing AGENCY: Defense Acquisition Regulations System, Department of Defense (DoD). ACTION: Proposed rule. SUMMARY: DoD is proposing to amend the Defense Federal Acquisition Regulation Supplement (DFARS) to update DoD's voucher processing procedures and better accommodate the use...

  11. Stable image acquisition for mobile image processing applications

    NASA Astrophysics Data System (ADS)

    Henning, Kai-Fabian; Fritze, Alexander; Gillich, Eugen; Mönks, Uwe; Lohweg, Volker

    2015-02-01

    Today, mobile devices (smartphones, tablets, etc.) are widespread and of high importance to their users, and their performance and versatility increase over time. This creates the opportunity to use such devices for more specific tasks, such as image processing in an industrial context. For image analysis, requirements such as image quality (blur, illumination, etc.) and a defined relative position of the object to be inspected are crucial. Since mobile devices are handheld and used in constantly changing environments, the challenge is to fulfill these requirements. We present an approach that overcomes these obstacles and stabilizes the image capturing process such that image analysis on mobile devices is significantly improved. To this end, image processing methods are combined with sensor fusion concepts. The approach consists of three main parts. First, pose estimation methods guide the user in moving the device to a defined position. Second, the sensor data and the pose information are combined for relative motion estimation. Finally, the image capturing process is automated: it is triggered depending on the alignment of the device and the object, as well as on the image quality achievable under the prevailing motion and environmental effects.
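One simple image-quality check of the kind such a capture pipeline might apply before triggering is a gradient-energy sharpness score. This is a generic sketch, not the authors' actual metric:

```python
import numpy as np

# Gradient-energy sharpness score: blurred images have weaker gradients,
# so a higher score indicates a sharper (less blurred) image.
def sharpness(img):
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(gx**2 + gy**2))

flat = np.ones((8, 8))                       # featureless, zero gradient
edge = np.zeros((8, 8)); edge[:, 4:] = 1.0   # sharp vertical edge

score_flat, score_edge = sharpness(flat), sharpness(edge)
```

A capture controller could defer the shutter until the score exceeds a threshold calibrated for the inspection task.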

  12. Developmental Stages in Receptive Grammar Acquisition: A Processability Theory Account

    ERIC Educational Resources Information Center

    Buyl, Aafke; Housen, Alex

    2015-01-01

    This study takes a new look at the topic of developmental stages in the second language (L2) acquisition of morphosyntax by analysing receptive learner data, a language mode that has hitherto received very little attention within this strand of research (for a recent and rare study, see Spinner, 2013). Looking at both the receptive and productive…

  13. DEVELOPMENT OF MARKETABLE TYPING SKILL--SENSORY PROCESSES UNDERLYING ACQUISITION.

    ERIC Educational Resources Information Center

    WEST, LEONARD J.

    THE PROJECT ATTEMPTED TO PROVIDE FURTHER DATA ON THE DOMINANT HYPOTHESIS ABOUT THE SENSORY MECHANISMS UNDERLYING SKILL ACQUISITION IN TYPEWRITING. IN SO DOING, IT PROPOSED TO FURNISH A BASIS FOR IMPORTANT CORRECTIVES TO SUCH CONVENTIONAL INSTRUCTIONAL PROCEDURES AS TOUCH TYPING. SPECIFICALLY, THE HYPOTHESIS HAS BEEN THAT KINESTHESIS IS NOT…

  14. Fault recognition depending on seismic acquisition and processing for application to geothermal exploration

    NASA Astrophysics Data System (ADS)

    Buness, H.; von Hartmann, H.; Rumpel, H.; Krawczyk, C. M.; Schulz, R.

    2011-12-01

    Fault systems offer a large potential for deep hydrothermal energy extraction, and most existing and planned projects rely on the enhanced permeability assumed to be associated with them. Target depths for hydrothermal exploration in Germany are on the order of 3-5 km, to ensure economic operation despite moderate temperature gradients. 3D seismics is the most appropriate geophysical method to image fault systems at these depths, but also one of the most expensive; it constitutes a significant part of total project costs, so its application was (and is) debated. Costs can in principle be reduced by sparse acquisition. However, the decreased fold inevitably leads to a decreased S/N ratio. To overcome this problem, application of the CRS (Common Reflection Surface) method has been proposed: the CRS stacking operator inherently includes more traces than the conventional NMO/DMO stacking operator, and hence a better S/N ratio can be achieved. We tested this approach using existing 3D seismic datasets from the two most important hydrothermal provinces in Germany, the Upper Rhine Graben (URG) and the German Molasse Basin (GMB). To simulate sparse acquisition, we reduced the amount of data to a quarter and a half, respectively, and reprocessed the data, including new velocity analysis and residual static corrections. In the URG, using the variance cube as the basis for a horizon-bound window amplitude analysis proved successful for detecting small faults that would hardly be recognized in seismic sections. In both regions, CRS processing undoubtedly improved the imaging of small faults in the complete as well as the reduced versions of the datasets. However, CRS processing could not compensate for the loss of resolution caused by the simulated sparse acquisition, and smaller faults therefore became undetectable. The decision for or against sparse acquisition thus depends on the scope of the survey.

  15. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview is presented of the development of a data acquisition and processing system for a pulsed, 2-micron coherent Doppler lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system and its control software are reviewed, and its requirements and unique features are discussed.

  16. Acquisition and Processing of Multi-Fold GPR Data for Characterization of Shallow Groundwater Systems

    NASA Astrophysics Data System (ADS)

    Bradford, J. H.

    2004-05-01

    Most ground-penetrating radar (GPR) data are acquired with a constant transmitter-receiver offset, and investigators often apply little or no processing in generating a subsurface image. This mode of operation can provide useful information, but does not take full advantage of the information the GPR signal can carry. In continuous multi-offset (CMO) mode, one acquires several traces with varying source-receiver separations at each point along the survey. CMO acquisition is analogous to common-midpoint acquisition in exploration seismology and gives rise to improved subsurface characterization through three key features: 1) processes such as stacking and velocity filtering significantly attenuate coherent and random noise, resulting in subsurface images that are easier to interpret; 2) CMO data enable measurement of vertical and lateral velocity variations, which leads to improved understanding of material distribution and more accurate depth estimates; and 3) CMO data enable observation of reflected-wave behaviour (i.e., variations in amplitude and spectrum) at a common reflection point for various travel paths through the subsurface, and quantification of these variations can be a valuable tool in material property characterization. Although there are a few examples in the literature, investigators rarely acquire CMO GPR data. This is, in large part, because CMO acquisition with a single-channel system is labor intensive and time consuming. At present, no multi-channel GPR systems designed for CMO acquisition are commercially available. Over the past 8 years I have designed, conducted, and processed numerous 2D and 3D CMO GPR surveys using a single-channel GPR system. I have developed field procedures that enable a three-man crew to acquire CMO GPR data at a rate comparable to a similar-scale multi-channel seismic reflection survey. Additionally, many recent advances in signal processing developed in the oil and gas industry have yet to see significant
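The velocity analysis referred to above rests on the same hyperbolic moveout as common-midpoint seismic processing; a sketch with an assumed, but physically typical, GPR velocity:

```python
import numpy as np

# Reflection travel time in a common-midpoint/CMO gather:
# t(x) = sqrt(t0**2 + (x / v)**2), with zero-offset time t0 and velocity v.
def reflection_time(offset_m, t0_s, v_mps):
    return np.sqrt(t0_s**2 + (offset_m / v_mps) ** 2)

offsets = np.linspace(0.0, 10.0, 6)                     # antenna separations (m), assumed
t = reflection_time(offsets, t0_s=50e-9, v_mps=1.0e8)   # ~0.1 m/ns, typical moist soil
```

Fitting this hyperbola to a reflection across offsets yields v, which converts two-way times into depths and drives the stacking correction.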

  17. Research and design of portable photoelectric rotary table data-acquisition and analysis system

    NASA Astrophysics Data System (ADS)

    Yang, Dawei; Yang, Xiufang; Han, Junfeng; Yan, Xiaoxu

    2015-02-01

    The photoelectric rotary table is a primary tracking-measurement platform, widely used in shooting ranges and aerospace applications. To meet the demands of laboratory and field use of photoelectric test instruments within photoelectric tracking measurement systems, a portable data acquisition and analysis system for photoelectric rotary tables was researched and designed. The hardware design is based on a Xilinx Virtex-4 series FPGA and its peripheral modules; the host-computer software was developed on the VC++ 6.0 programming platform using the MFC class libraries. The system integrates data acquisition, display and storage, commissioning control, analysis, laboratory waveform playback, transmission, and fault diagnosis, and offers the advantages of small size, embeddability, high speed, portability, and simple operation. Alignment experiments of the system's software and hardware, with a photoelectric tracking turntable as the test object, show that the system acquires, analyzes, and processes data from photoelectric tracking equipment, controls turntable debugging well, and produces accurate, reliable measurements with good maintainability and extensibility. This design is of significance for advancing the debugging, diagnosis, condition monitoring, and fault analysis of photoelectric tracking measurement equipment, as well as for standardizing interfaces and improving equipment maintainability, and has innovative and practical value.

  18. Processing strategies and software solutions for data-independent acquisition in mass spectrometry.

    PubMed

    Bilbao, Aivett; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Hopfgartner, Gérard; Müller, Markus; Lisacek, Frédérique

    2015-03-01

    Data-independent acquisition (DIA) offers several advantages over data-dependent acquisition (DDA) schemes for characterizing complex protein digests analyzed by LC-MS/MS. In contrast to the sequential detection, selection, and analysis of individual ions during DDA, DIA systematically parallelizes the fragmentation of all detectable ions within a wide m/z range regardless of intensity, thereby providing broader dynamic range of detected signals, improved reproducibility for identification, better sensitivity, and accuracy for quantification, and, potentially, enhanced proteome coverage. To fully exploit these advantages, composite or multiplexed fragment ion spectra generated by DIA require more elaborate processing algorithms compared to DDA. This review examines different DIA schemes and, in particular, discusses the concepts applied to and related to data processing. Available software implementations for identification and quantification are presented as comprehensively as possible and examples of software usage are cited. Processing workflows, including complete proprietary frameworks or combinations of modules from different open source data processing packages are described and compared in terms of software availability and usability, programming language, operating system support, input/output data formats, as well as the main principles employed in the algorithms used for identification and quantification. This comparative study concludes with further discussion of current limitations and expectable improvements in the short- and midterm future. PMID:25430050
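The core DIA idea, stepping fixed isolation windows across the precursor m/z range so every detectable ion is fragmented, can be sketched as follows; the window width and m/z range are illustrative, not tied to any particular instrument method:

```python
# Fixed-width DIA isolation windows covering a precursor m/z range.
# Every precursor in [mz_min, mz_max) falls into exactly one window,
# so fragmentation is systematic rather than intensity-driven (as in DDA).
def dia_windows(mz_min, mz_max, width):
    windows = []
    lo = mz_min
    while lo < mz_max:
        windows.append((lo, min(lo + width, mz_max)))
        lo += width
    return windows

cycle = dia_windows(400.0, 1200.0, 25.0)  # one acquisition cycle
```

Because each MS/MS spectrum then contains fragments from all precursors in a window, the composite spectra require the more elaborate deconvolution algorithms the review surveys.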

  19. Possible overlapping time frames of acquisition and consolidation phases in object memory processes: a pharmacological approach.

    PubMed

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when administered within 2 min after the acquisition trial. Likewise, both PDE5-I and PDE4-I reversed the scopolamine deficit model when administered within 2 min after the learning trial. PDE5-I was effective up to 45 min after the acquisition trial and PDE4-I was effective when administered between 3 and 5.5 h after the acquisition trial. Taken together, our study suggests that acetylcholine, cGMP, and cAMP are all involved in acquisition processes and that cGMP and cAMP are also involved in early and late consolidation processes, respectively. Most important, these pharmacological studies suggest that acquisition processes continue for some time after the learning trial where they share a short common time frame with early consolidation processes. Additional brain concentration measurements of the drugs suggest that these acquisition processes can continue up to 4-6 min after learning. PMID:26670184

  20. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates data acquisition and processing, and displays selected wind profile parameters such as Doppler shift, power distribution, and wind directions and velocities. The Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from the wind measurements. A general overview of the system and the DAPS, as well as the coherent Doppler lidar system, is presented in this paper.
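Removing aircraft motion from the measured line-of-sight Doppler velocity amounts to subtracting the projection of the platform velocity onto the beam direction. A sketch of that correction, with illustrative vectors rather than DAWN flight data:

```python
import numpy as np

# Correct line-of-sight (LOS) Doppler velocity for platform motion:
# v_wind_los = v_measured_los - (v_aircraft . beam_unit_vector)
def correct_los(v_measured, v_aircraft, beam_unit):
    return v_measured - float(np.dot(v_aircraft, beam_unit))

v_aircraft = np.array([200.0, 0.0, 0.0])   # m/s from the INS/GPS (illustrative)
beam = np.array([1.0, 0.0, 0.0])           # unit vector along the lidar beam
v_wind = correct_los(195.0, v_aircraft, beam)
```

With the beam pointed along the flight direction, a 195 m/s measured LOS velocity corresponds to a 5 m/s headwind component once the 200 m/s platform motion is removed.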

  1. Superimposed fringe projection for three-dimensional shape acquisition by image analysis

    SciTech Connect

    Sasso, Marco; Chiappini, Gianluca; Palmieri, Giacomo; Amodio, Dario

    2009-05-01

    The aim of this work is the development of an image analysis technique for 3D shape acquisition based on luminous fringe projection. In more detail, the method is based on the simultaneous use of several projectors, which is desirable whenever the surface under inspection has a complex geometry with undercuts or shadow areas. In such cases, the usual fringe projection technique requires several acquisitions, each time moving the projector or using several projectors alternately. Besides the fringe projection and phase calculation procedure, an unwrap algorithm has been developed in order to obtain the continuous phase maps needed in subsequent calculations for shape extraction. With the technique of simultaneous projections, oriented so as to cover the whole surface, it is possible to speed up the acquisition process and avoid the post-processing problems related to the matching of different point clouds.
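The unwrap step that restores a continuous phase map from wrapped values can be illustrated in 1-D; the authors' algorithm operates on 2-D phase maps from multiple projectors, so this shows only the core idea:

```python
import numpy as np

# Wrapped phase lives in (-pi, pi]; unwrapping reinserts the 2*pi offsets
# so the phase map becomes continuous and usable for shape extraction.
true_phase = np.linspace(0.0, 6.0 * np.pi, 200)
wrapped = np.angle(np.exp(1j * true_phase))   # wrap into (-pi, pi]
restored = np.unwrap(wrapped)                 # remove the 2*pi jumps
```

Unwrapping succeeds here because neighboring samples differ by less than pi; real fringe images need more robust 2-D strategies near noise and discontinuities.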

  2. A multiple process solution to the logical problem of language acquisition*

    PubMed Central

    MACWHINNEY, BRIAN

    2006-01-01

    Many researchers believe that there is a logical problem at the center of language acquisition theory. According to this analysis, the input to the learner is too inconsistent and incomplete to determine the acquisition of grammar. Moreover, when corrective feedback is provided, children tend to ignore it. As a result, language learning must rely on additional constraints from universal grammar. To solve this logical problem, theorists have proposed a series of constraints and parameterizations on the form of universal grammar. Plausible alternatives to these constraints include: conservatism, item-based learning, indirect negative evidence, competition, cue construction, and monitoring. Careful analysis of child language corpora has cast doubt on claims regarding the absence of positive exemplars. Using demonstrably available positive data, simple learning procedures can be formulated for each of the syntactic structures that have traditionally motivated invocation of the logical problem. Within the perspective of emergentist theory (MacWhinney, 2001), the operation of a set of mutually supportive processes is viewed as providing multiple buffering for developmental outcomes. However, the fact that some syntactic structures are more difficult to learn than others can be used to highlight areas of intense grammatical competition and processing load. PMID:15658750

  3. A sophisticated, multi-channel data acquisition and processing system for high frequency noise research

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Bridges, James

    1992-01-01

    A sophisticated, multi-channel computerized data acquisition and processing system was developed at the NASA LeRC for use in noise experiments. This technology, which is available for transfer to industry, provides a convenient, cost-effective alternative to analog tape recording for high frequency acoustic measurements. This system provides 32-channel acquisition of microphone signals with an analysis bandwidth up to 100 kHz per channel. Cost was minimized through the use of off-the-shelf components. Requirements to allow for future expansion were met by choosing equipment which adheres to established industry standards for hardware and software. Data processing capabilities include narrow band and 1/3 octave spectral analysis, compensation for microphone frequency response/directivity, and correction of acoustic data to standard day conditions. The system was used successfully in a major wind tunnel test program at NASA LeRC to acquire and analyze jet noise data in support of the High Speed Civil Transport (HSCT) program.
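The 1/3-octave analysis mentioned above bins spectra at center frequencies spaced a factor of 2^(1/3) apart; a base-2 sketch around the 1 kHz reference band (band count is illustrative):

```python
# Base-2 one-third-octave center frequencies around the 1 kHz reference band:
# f_k = f_ref * 2**(k/3), three bands per octave.
def third_octave_centers(n_bands=10, f_ref=1000.0):
    return [f_ref * 2.0 ** (k / 3.0) for k in range(-n_bands, n_bands + 1)]

centers = third_octave_centers()
```

Narrow-band spectral levels are then summed within each band's edges (f_c / 2^(1/6) to f_c * 2^(1/6)) to produce the 1/3-octave spectrum.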

  4. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  5. The Tara control, monitoring, data acquisition and analysis system

    SciTech Connect

    Sullivan, J.D.; Gaudreau, M.P.J.; Blanter, B.; Fredian, T.W.; Irby, J.H.; Karcher, C.A.; Rameriz, R.; Sevillano, E.; Stillerman, J.; Thomas, P.

    1986-09-01

    Experiments at the MIT Tara Tandem Mirror utilize an integrated system for control, monitoring, data acquisition, physics analysis, and archiving. This system consists of two distinct parts with narrowly defined information interchange: one provides automated control and real-time monitoring of engineering functions, and the other acquires, analyzes, and displays data for physics in near real time. Typical machine operation achieves a total cycle time of 3 to 8 minutes, with 5 to 7 Mbytes of data stored and approx. 160 individual signals displayed in hardcopy on approx. 10 pages.

  6. Memory acquisition and retrieval impact different epigenetic processes that regulate gene expression

    PubMed Central

    2015-01-01

    Background: A fundamental question in neuroscience is how memories are stored and retrieved in the brain. Long-term memory formation requires transcription, translation and epigenetic processes that control gene expression. Thus, characterizing genome-wide the transcriptional changes that occur after memory acquisition and retrieval is of broad interest and importance. Genome-wide technologies are commonly used to interrogate transcriptional changes in discovery-based approaches. Their ability to increase scientific insight beyond traditional candidate gene approaches, however, is usually hindered by batch effects and other sources of unwanted variation, which are particularly hard to control in the study of brain and behavior. Results: We examined genome-wide gene expression after contextual conditioning in the mouse hippocampus, a brain region essential for learning and memory, at all the time-points at which inhibiting transcription has been shown to impair memory formation. We show that most of the variance in gene expression is not due to conditioning and that by removing unwanted variance through additional normalization we are able to provide novel biological insights. In particular, we show that genes downregulated by memory acquisition and retrieval impact different functions: chromatin assembly and RNA processing, respectively. Levels of histone 2A variant H2AB are reduced only following acquisition, a finding we confirmed using quantitative proteomics. On the other hand, splicing factor Rbfox1 and NMDA receptor-dependent microRNA miR-219 are only downregulated after retrieval, accompanied by an increase in protein levels of the miR-219 target CAMKIIγ. Conclusions: We provide a thorough characterization of coding and non-coding gene expression during long-term memory formation. We demonstrate that unwanted variance dominates the signal in transcriptional studies of learning and memory and introduce the removal of unwanted variance through normalization as a

  7. Analysis Method for Non-Nominal First Acquisition

    NASA Technical Reports Server (NTRS)

    Sieg, Detlef; Mugellesi-Dow, Roberta

    2007-01-01

    This paper first describes a method by which the launcher trajectory can be modelled for the contingency analysis without much information about the launch vehicle itself. From a dense sequence of state vectors a velocity profile is derived which is sufficiently accurate to enable the Flight Dynamics Team to integrate parts of the launcher trajectory on its own and to simulate contingency cases by modifying the velocity profile. The paper then focuses on the thorough visibility analysis which has to follow the contingency case or burn performance simulations. In the ideal case it is possible to identify a ground station which is able to acquire the satellite independently of the burn performance. The correlations between the burn performance and the pointing at subsequent ground stations are derived with the aim of establishing simple guidelines which can be applied quickly and which significantly improve the chance of acquisition at subsequent ground stations. In the paper the method is applied to the Soyuz/Fregat launch with the MetOp satellite. Overall the paper shows that the launcher trajectory modelling with the simulation of contingency cases, in connection with a ground station visibility analysis, leads to a proper selection of ground stations and acquisition methods. In the MetOp case this ensured successful contact at all ground stations during the first hour after separation without having to rely on any early orbit determination result or state vector update.

  8. Phases of learning: How skill acquisition impacts cognitive processing.

    PubMed

    Tenison, Caitlin; Fincham, Jon M; Anderson, John R

    2016-06-01

    This fMRI study examines the changes in participants' information processing as they repeatedly solve the same mathematical problem. We show that the majority of practice-related speedup is produced by discrete changes in cognitive processing. Because the points at which these changes take place vary from problem to problem, and the underlying information processing steps vary in duration, the existence of such discrete changes can be hard to detect. Using two converging approaches, we establish the existence of three learning phases. When solving a problem in one of these learning phases, participants can go through three cognitive stages: Encoding, Solving, and Responding. Each cognitive stage is associated with a unique brain signature. Using a bottom-up approach combining multi-voxel pattern analysis and hidden semi-Markov modeling, we identify the duration of that stage on any particular trial from participants' brain activation patterns. For our top-down approach we developed an ACT-R model of these cognitive stages and simulated how they change over the course of learning. The Solving stage of the first learning phase is long and involves a sequence of arithmetic computations. Participants transition to the second learning phase when they can retrieve the answer, thereby drastically reducing the duration of the Solving stage. With continued practice, participants then transition to the third learning phase when they recognize the problem as a single unit and produce the answer as an automatic response. The duration of this third learning phase is dominated by the Responding stage. PMID:27018936

  9. Instrumentation for automated acquisition and analysis of TLD glow curves

    NASA Astrophysics Data System (ADS)

    Bostock, I. J.; Kennett, T. J.; Harvey, J. W.

    1991-04-01

    Instrumentation for the automated and complete acquisition of thermoluminescent dosimeter (TLD) data from a Panasonic UD-702E TLD reader is reported. The system that has been developed consists of both hardware and software components and is designed to operate with an IBM-type personal computer. Acquisition of glow curve, timing, and heating data has been integrated with elementary numerical analysis to permit real-time validity and diagnostic assessments to be made. This allows the optimization of critical parameters such as duration of the heating cycles and the time window for the integration of the dosimetry peak. The form of the Li2B4O7:Cu TLD glow curve has been studied and a mathematical representation devised to assist in the implementation of automated analysis. Differences in the shape of the curve can be used to identify dosimetry peaks due to artifacts or to identify failing components. Examples of the use of this system for quality assurance in the TLD monitoring program at McMaster University are presented.
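
    The abstract does not reproduce its mathematical representation of the glow curve, so the sketch below is only a simplified stand-in for the kind of automated analysis described: fit a single synthetic glow peak (here with a Gaussian, not the paper's actual model) and then integrate counts in a time window around the dosimetry peak. All numbers are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, a, t0, sigma, baseline):
    """Simplified single-peak glow-curve model: amplitude, center, width, baseline."""
    return a * np.exp(-0.5 * ((t - t0) / sigma) ** 2) + baseline

# Synthetic glow curve: one dosimetry peak on a small baseline, plus noise
t = np.linspace(0, 10, 500)                      # heating time, s (arbitrary units)
rng = np.random.default_rng(0)
signal = gaussian(t, a=100.0, t0=5.0, sigma=0.8, baseline=2.0)
signal += rng.normal(0, 1.0, t.size)             # detector noise

# Fit the peak, then integrate counts in a +/- 2 sigma window around it
popt, _ = curve_fit(gaussian, t, signal, p0=[80, 4.5, 1.0, 0])
a, t0, sigma, baseline = popt
window = (t > t0 - 2 * abs(sigma)) & (t < t0 + 2 * abs(sigma))
dt = t[1] - t[0]
dose_counts = np.sum(signal[window] - baseline) * dt

print(f"peak center {t0:.2f} s, width {abs(sigma):.2f} s, integrated counts {dose_counts:.1f}")
```

    In the real system the fitted peak shape would additionally be compared against the expected Li2B4O7:Cu curve to flag artifact peaks or failing components.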

  10. Relationships among process skills development, knowledge acquisition, and gender in microcomputer-based chemistry laboratories

    NASA Astrophysics Data System (ADS)

    Krieger, Carla Repsher

    This study investigated how instruction in MBL environments can be designed to facilitate process skills development and knowledge acquisition among high school chemistry students. Ninety-eight college preparatory chemistry students in six intact classes were randomly assigned to one of three treatment groups: MBL with enhanced instruction in Macroscopic knowledge, MBL with enhanced instruction in Microscopic knowledge, and MBL with enhanced instruction in Symbolic knowledge. Each treatment group completed a total of four MBL titrations involving acids and bases. After the first and third titrations, the Macroscopic, Microscopic and Symbolic groups received enhanced instruction in the Macroscopic, Microscopic and Symbolic modes, respectively. During each titration, participants used audiotapes to record their verbal interactions. The study also explored the effects of three potential covariates (age, mathematics background, and computer usage) on the relationships among the independent variables (type of enhanced instruction and gender) and the dependent variables (science process skills and knowledge acquisition). Process skills were measured via gain scores on a standardized test. Analysis of Covariance eliminated age, mathematics background, and computer usage as covariates in this study. Analysis of Variance identified no significant effects on process skills attributable to treatment or gender. Knowledge acquisition was assessed via protocol analysis of statements made by the participants during the four titrations. Statements were categorized as procedural, observational, conceptual/analytical, or miscellaneous. Statement category percentages were analyzed for trends across treatments, genders, and experiments. Instruction emphasizing the Macroscopic mode may have increased percentages of observational and miscellaneous statements and decreased percentages of procedural and conceptual/analytical statements. 
Instruction emphasizing the Symbolic mode may have

  11. Exploitation of realistic computational anthropomorphic phantoms for the optimization of nuclear imaging acquisition and processing protocols.

    PubMed

    Loudos, George K; Papadimitroulas, Panagiotis G; Kagadis, George C

    2014-01-01

    Monte Carlo (MC) simulations play a crucial role in nuclear medical imaging since they can provide the ground truth for clinical acquisitions, by integrating and quantifying all physical parameters that affect image quality. Over the last decade, a number of realistic computational anthropomorphic models have been developed to serve imaging, as well as other biomedical engineering applications. The combination of MC techniques with realistic computational phantoms can provide a powerful tool for pre- and post-processing in imaging, data analysis and dosimetry. This work aims to create a global database for simulated Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) exams; the methodology, as well as its first elements, is presented. Simulations are performed using the well-validated GATE open-source toolkit, standard anthropomorphic phantoms, and activity distributions of various radiopharmaceuticals derived from the literature. The resulting images, projections and sinograms of each study are provided in the database and can be further exploited to evaluate processing and reconstruction algorithms. Patient studies with different characteristics are included in the database, and different computational phantoms were tested for the same acquisitions. These include the XCAT, Zubal and Virtual Family phantoms, some of which are used for the first time in nuclear imaging. The created database will be freely available, and our current work is towards its extension by simulating additional clinical pathologies. PMID:25570355

  12. The acquisition process of musical tonal schema: implications from connectionist modeling

    PubMed Central

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical ‘scale’ sensitivity early and ‘harmony’ sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas vary with the culture-specific music to which listeners are exposed. PMID:26441725

  13. On Accuracy of Knowledge Acquisition for Decision Making Processes Acquiring Subjective Information on the Internet

    NASA Astrophysics Data System (ADS)

    Fujimoto, Kazunori; Yamamoto, Yutaka

    This paper presents a mathematical model for decision making processes where the knowledge for the decision is constructed automatically from subjective information on the Internet. This mathematical model enables us to know the required degree of accuracy of knowledge acquisition for constructing decision support systems using two technologies: automated knowledge acquisition from information on the Internet and automated reasoning about the acquired knowledge. The model consists of three elements: knowledge source, which is a set of subjective information on the Internet; knowledge acquisition, which builds a knowledge base within a computer from the knowledge source; and decision rule, which chooses a set of alternatives by using the knowledge base. One of the important features of this model is that it contains not only decision making processes but also knowledge acquisition processes. This feature makes it possible to analyze decision processes in terms of the sufficiency of knowledge sources and the accuracy of knowledge acquisition methods. Based on the model, decision processes in which the knowledge source and the knowledge base lead to the same choices are characterized, and the required degree of accuracy of knowledge acquisition is quantified as a required accuracy value. To show how the value can be used in designing decision support systems, it is calculated for some examples of knowledge sources and decision rules. This paper also describes the computational complexity of the required accuracy value calculation and shows a computation principle for reducing the complexity to polynomial order in the size of knowledge sources.

  14. Evolutionary analysis of iron (Fe) acquisition system in Marchantia polymorpha.

    PubMed

    Lo, Jing-Chi; Tsednee, Munkhtsetseg; Lo, Ying-Chu; Yang, Shun-Chung; Hu, Jer-Ming; Ishizaki, Kimitsune; Kohchi, Takayuki; Lee, Der-Chuen; Yeh, Kuo-Chen

    2016-07-01

    To acquire appropriate iron (Fe), vascular plants have developed two unique strategies: the reduction-based strategy I of nongraminaceous plants for Fe(2+) and the chelation-based strategy II of graminaceous plants for Fe(3+). However, the mechanism of Fe uptake in bryophytes, the earliest-diverging branch of land plants, which are dominant in the gametophyte generation, is less clear. Fe isotope fractionation analysis demonstrated that the liverwort Marchantia polymorpha uses reduction-based Fe acquisition. Enhanced activities of ferric chelate reductase and proton ATPase were detected under Fe-deficient conditions. However, M. polymorpha showed no mugineic acid family phytosiderophores, the key components of strategy II, or the precursor nicotianamine. Five ZIP (ZRT/IRT-like protein) homologs were identified and speculated to be involved in Fe uptake in M. polymorpha. MpZIP3 knockdown conferred reduced growth under Fe-deficient conditions, and MpZIP3 overexpression increased Fe content under excess Fe. Thus, a nonvascular liverwort, M. polymorpha, uses strategy I for Fe acquisition. This system may have been acquired in the common ancestor of land plants and co-opted from the gametophyte to the sporophyte generation in the evolution of land plants. PMID:26948158

  15. MIRAGE: The data acquisition, analysis, and display system

    NASA Technical Reports Server (NTRS)

    Rosser, Robert S.; Rahman, Hasan H.

    1993-01-01

    Developed for the NASA Johnson Space Center and Life Sciences Directorate by GE Government Services, the Microcomputer Integrated Real-time Acquisition Ground Equipment (MIRAGE) system is a portable ground support system for Spacelab life sciences experiments. The MIRAGE system can acquire digital or analog data. Digital data may be NRZ-formatted telemetry packets or packets from a network interface. Analog signals are digitized and stored in experiment packet format. Data packets from any acquisition source are archived to disk as they are received. Meta-parameters are generated from the data packet parameters by applying mathematical and logical operators. Parameters are displayed in text and graphical form or output to analog devices. Experiment data packets may be retransmitted through the network interface. Data stream definition, experiment parameter format, parameter displays, and other variables are configured using a spreadsheet database. A database can be developed to support virtually any data packet format. The user interface provides menu- and icon-driven program control. The MIRAGE system can be integrated with other workstations to perform a variety of functions. Its generic capabilities, adaptability, and ease of use make MIRAGE a cost-effective solution to many experimental data processing requirements.

  16. Xenbase: Core features, data acquisition, and data processing.

    PubMed

    James-Zorn, Christina; Ponferrada, Virgillio G; Burns, Kevin A; Fortriede, Joshua D; Lotay, Vaneet S; Liu, Yu; Brad Karpinka, J; Karimi, Kamran; Zorn, Aaron M; Vize, Peter D

    2015-08-01

    Xenbase, the Xenopus model organism database (www.xenbase.org), is a cloud-based, web-accessible resource that integrates the diverse genomic and biological data from Xenopus research. Xenopus frogs are one of the major vertebrate animal models used for biomedical research, and Xenbase is the central repository for the enormous amount of data generated using this model tetrapod. The goal of Xenbase is to accelerate discovery by enabling investigators to make novel connections between molecular pathways in Xenopus and human disease. Our relational database and user-friendly interface make these data easy to query and allow investigators to quickly interrogate and link different data types in ways that would otherwise be difficult, time consuming, or impossible. Xenbase also enhances the value of these data through high-quality gene expression curation and data integration, by providing bioinformatics tools optimized for Xenopus experiments, and by linking Xenopus data to other model organisms and to human data. Xenbase draws in data via pipelines that download data, parse the content, and save them into appropriate files and database tables. Furthermore, Xenbase makes these data accessible to the broader biomedical community by continually providing annotated data updates to organizations such as NCBI, UniProtKB, and Ensembl. Here, we describe our bioinformatics, genome-browsing tools, data acquisition and sharing, our community submitted and literature curation pipelines, text-mining support, gene page features, and the curation of gene nomenclature and gene models. PMID:26150211

  17. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of risk analysis for mergers and acquisitions of China-listed firms is provided to demonstrate the feasibility and practicality of the method.
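
    The record gives no formulas, but the standard discrete maximum-entropy construction it refers to can be sketched: maximize entropy subject to a moment constraint, which yields exponential-family weights whose multiplier is found by root-finding. The loss levels and mean below are hypothetical illustrations, not data from the study.

```python
import numpy as np
from scipy.optimize import brentq

def max_entropy_dist(outcomes, mean_constraint):
    """Discrete maximum-entropy distribution with a fixed mean:
    p_i proportional to exp(-lam * x_i), with lam chosen so that E[x] equals
    mean_constraint. This is the classic Lagrange-multiplier solution."""
    x = np.asarray(outcomes, dtype=float)

    def mean_error(lam):
        w = np.exp(-lam * x)
        p = w / w.sum()
        return p @ x - mean_constraint

    lam = brentq(mean_error, -50, 50)   # bracket assumed wide enough for the root
    w = np.exp(-lam * x)
    return w / w.sum()

# Hypothetical loss levels (fraction of deal value) and an assumed mean loss
losses = [0.0, 0.1, 0.2, 0.3, 0.4]
p = max_entropy_dist(losses, mean_constraint=0.15)
risk_entropy = -np.sum(p * np.log(p))   # entropy as an uncertainty/risk measure
print(p.round(3), round(risk_entropy, 3))
```

    The resulting distribution is the least-committed one consistent with the known pre-acquisition moment, which is the sense in which ME avoids assuming information the analyst does not have.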

  18. Design and implementation of photoelectric rotary table data acquisition and analysis system host computer software based on VC++ and MFC

    NASA Astrophysics Data System (ADS)

    Yang, Dawei; Yang, Xiufang; Han, Junfeng; Yan, Xiaoxu

    2015-02-01

    Photoelectric rotary tables are used mainly in defense and military applications, playing an important role in shooting ranges, target tracking, target acquisition, and aerospace. To meet the field-test requirements of range photoelectric measuring equipment, and in combination with a portable photoelectric rotary table data acquisition hardware system, a software platform based on VC++ is presented; a host computer interface built with MFC implements data acquisition, analysis, processing, and debugging control for the photoelectric turntable. The host computer software covers serial communication and its protocol, real-time data acquisition and display, real-time curve drawing, analog acquisition, a debugging guide, and an error analysis program, and the specific design method for each is given. Finally, experiments with the aligned photoelectric rotary table data acquisition hardware show that the host computer software accomplishes data transmission with the lower machine, data acquisition, control, and analysis as intended; the entire software system runs stably and flexibly and offers good practicality, reliability, and scalability.

  19. Learning (Not) to Predict: Grammatical Gender Processing in Second Language Acquisition

    ERIC Educational Resources Information Center

    Hopp, Holger

    2016-01-01

    In two experiments, this article investigates the predictive processing of gender agreement in adult second language (L2) acquisition. We test (1) whether instruction on lexical gender can lead to target predictive agreement processing and (2) how variability in lexical gender representations moderates L2 gender agreement processing. In a…

  20. 76 FR 68037 - Federal Acquisition Regulation; Sudan Waiver Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... Regulation; Sudan Waiver Process AGENCIES: Department of Defense (DoD), General Services Administration (GSA... that conducts restricted business operations in Sudan. The rule also describes the consultation process... Federal Register at 75 FR 62069 on October 7, 2010, to revise FAR 25.702, Prohibition on contracting...

  1. Statistical analysis of target acquisition sensor modeling experiments

    NASA Astrophysics Data System (ADS)

    Deaver, Dawne M.; Moyer, Steve

    2015-05-01

    The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize uncertainty of the physics-based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners which recommend the number and types of test samples required to yield a statistically significant result.

  2. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  3. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  4. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  5. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Short selection process... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 36.602-5 Short selection process...

  6. Lock Acquisition and Sensitivity Analysis of Advanced LIGO Interferometers

    NASA Astrophysics Data System (ADS)

    Martynov, Denis

    The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz - 5 kHz. Direct detection of space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe. The initial phase of LIGO started in 2002, and since then data has been collected during six science runs. Instrument sensitivity improved from run to run due to the efforts of the commissioning team. Initial LIGO reached its design sensitivity during the last science run, which ended in October 2010. In parallel with commissioning and data analysis with the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014. This thesis describes results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. This thesis also discusses new techniques and tools developed at the 40m prototype, including adaptive filtering, estimation of quantization noise in digital filters, and design of isolation kits for ground seismometers. The first part of this thesis is devoted to the description of methods for bringing the interferometer into the linear regime where collection of data becomes possible. The states of longitudinal and angular controls of interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail. Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysical data that should be calibrated to units of meters or strain. The second part of this thesis describes the online calibration technique set up in both observatories to monitor the quality of the collected data in

  7. The acquisition process of musical tonal schema: implications from connectionist modeling.

    PubMed

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-Ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical 'scale' sensitivity early and 'harmony' sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas vary with the culture-specific music to which listeners are exposed. PMID:26441725

  8. A computational model associating learning process, word attributes, and age of acquisition.

    PubMed

    Hidaka, Shohei

    2013-01-01

    We propose a new model-based approach linking word learning to the age of acquisition (AoA) of words; a new computational tool for understanding the relationships among word learning processes, psychological attributes, and word AoAs as measures of vocabulary growth. The computational model developed describes the distinct statistical relationships between three theoretical factors underpinning word learning and AoA distributions. Simply put, this model formulates how different learning processes, characterized by change in learning rate over time and/or by the number of exposures required to acquire a word, likely result in different AoA distributions depending on word type. We tested the model in three respects. The first analysis showed that the proposed model accounts for empirical AoA distributions better than a standard alternative. The second analysis demonstrated that the estimated learning parameters well predicted the psychological attributes, such as frequency and imageability, of words. The third analysis illustrated that the developmental trend predicted by our estimated learning parameters was consistent with relevant findings in the developmental literature on word learning in children. We further discuss the theoretical implications of our model-based approach. PMID:24223699
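
    As a hedged illustration of the kind of model this record describes, where a learning rate and the number of exposures required to acquire a word shape the resulting AoA distribution, the following minimal simulation treats acquisition as reaching an exposure threshold under Poisson encounters. All rates and thresholds are assumptions for illustration, not parameters from the study.

```python
import numpy as np

def simulate_aoa(exposure_rate, n_needed, months=200, rng=None):
    """Simulate age of acquisition (in months): a word is acquired once it has
    been encountered n_needed times; encounters per month are Poisson with the
    word's exposure_rate, a stand-in for word frequency."""
    rng = rng if rng is not None else np.random.default_rng(0)
    counts = rng.poisson(exposure_rate, months).cumsum()
    return int(np.argmax(counts >= n_needed)) if counts[-1] >= n_needed else months

rng = np.random.default_rng(1)
frequent = [simulate_aoa(2.0, 20, rng=rng) for _ in range(500)]
rare = [simulate_aoa(0.5, 20, rng=rng) for _ in range(500)]
print(f"mean AoA, frequent words: {np.mean(frequent):.1f} months")
print(f"mean AoA, rare words:     {np.mean(rare):.1f} months")
```

    Even this toy version reproduces the qualitative link the model formalizes: higher-frequency words yield earlier, tighter AoA distributions, while rarer words are acquired later with greater spread.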

  9. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

    Jackson, G.P.; Elliott, A.

    1992-01-01

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  10. Optical signal acquisition and processing in future accelerator diagnostics

    SciTech Connect

    Jackson, G.P.; Elliott, A.

    1992-12-31

    Beam detectors such as striplines and wall current monitors rely on matched electrical networks to transmit and process beam information. Frequency bandwidth, noise immunity, reflections, and signal to noise ratio are considerations that require compromises limiting the quality of the measurement. Recent advances in fiber optics related technologies have made it possible to acquire and process beam signals in the optical domain. This paper describes recent developments in the application of these technologies to accelerator beam diagnostics. The design and construction of an optical notch filter used for a stochastic cooling system is used as an example. Conceptual ideas for future beam detectors are also presented.

  11. Isolating Intrinsic Processing Disorders from Second Language Acquisition.

    ERIC Educational Resources Information Center

    Lock, Robin H.; Layton, Carol A.

    2002-01-01

    Evaluation of the validity of the Learning Disabilities Diagnostic Inventory with limited-English-proficient (LEP) students in grades 2-7 found that nondisabled LEP students were over-identified as having intrinsic processing deficits. Examination of individual student protocols highlighted the need to train teacher-raters in language acquisition…

  12. Accelerating COTS Middleware Acquisition: The i-Mate Process

    SciTech Connect

    Liu, Anna; Gorton, Ian

    2003-03-05

    Most major organizations now use some commercial-off-the-shelf middleware components to run their businesses. Key drivers behind this growth include ever-increasing Internet usage and the ongoing need to integrate heterogeneous legacy systems to streamline business processes. As organizations do more business online, they need scalable, high-performance software infrastructures to handle transactions and provide access to core systems.

  13. A dual process account of coarticulation in motor skill acquisition.

    PubMed

    Shah, Ashvin; Barto, Andrew G; Fagg, Andrew H

    2013-01-01

    Many tasks, such as typing a password, are decomposed into a sequence of subtasks that can be accomplished in many ways. Behavior that accomplishes subtasks in ways that are influenced by the overall task is often described as "skilled" and exhibits coarticulation. Many accounts of coarticulation use search methods that are informed by representations of objectives that define skilled behavior. While they aid in describing the strategies the nervous system may follow, they are computationally complex and may be difficult to attribute to brain structures. Here, the authors present a biologically-inspired account whereby skilled behavior is developed through 2 simple processes: (a) a corrective process that ensures that each subtask is accomplished, but does not do so skillfully, and (b) a reinforcement learning process that finds better movements using a trial and error search that is not informed by representations of any objectives. We implement our account as a computational model controlling a simulated two-armed kinematic "robot" that must hit a sequence of goals with its hands. Behavior displays coarticulation in terms of which hand was chosen, how the corresponding arm was used, and how the other arm was used, suggesting that the account can participate in the development of skilled behavior. PMID:24116847

  14. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  15. [An image acquisition & processing system of the wireless endoscope based on DSP].

    PubMed

    Zhang, Jin-hua; Peng, Cheng-lin; Zhao, De-chun; Yang-Li

    2006-07-01

    This paper describes an image acquisition and processing system for a capsule-style wireless endoscope. Images sent by the endoscope are compressed and encoded with a digital signal processor (DSP), and the data are saved to the hard disk of a PC for analysis and processing in the image browser workstation. PMID:17039927

  16. Data acquisition and online processing requirements for experimentation at the Superconducting Super Collider

    SciTech Connect

    Lankford, A.J.; Barsotti, E.; Gaines, I.

    1989-07-01

    Differences in scale between data acquisition and online processing requirements for detectors at the Superconducting Super Collider and systems for existing large detectors will require new architectures and technological advances in these systems. Emerging technologies will be employed for data transfer, processing, and recording. 9 refs., 3 figs.

  17. Pulsed laser noise analysis and pump-probe signal detection with a data acquisition card

    NASA Astrophysics Data System (ADS)

    Werley, Christopher A.; Teo, Stephanie M.; Nelson, Keith A.

    2011-12-01

    A photodiode and data acquisition card whose sampling clock is synchronized to the repetition rate of a laser are used to measure the energy of each laser pulse. Simple analysis of the data yields the noise spectrum from very low frequencies up to half the repetition rate and quantifies the pulse energy distribution. When two photodiodes for balanced detection are used in combination with an optical modulator, the technique is capable of detecting very weak pump-probe signals (ΔI/I0 ~ 10^-5 at 1 kHz), with a sensitivity that is competitive with a lock-in amplifier. Detection with the data acquisition card is versatile and offers many advantages including full quantification of noise during each stage of signal processing, arbitrary digital filtering in silico after data collection is complete, direct readout of percent signal modulation, and easy adaptation for fast scanning of delay between pump and probe.
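
    The per-pulse sampling scheme can be sketched as follows (a minimal illustration with synthetic data, not the authors' code): record one energy sample per laser shot, and a discrete Fourier transform of the fractional fluctuations then yields the noise spectrum out to half the repetition rate.

```python
import numpy as np

def pulse_noise_spectrum(energies, rep_rate_hz):
    """Noise spectrum of a per-shot pulse-energy series, out to half the repetition rate."""
    e = np.asarray(energies, dtype=float)
    rel = e / e.mean() - 1.0                       # fractional pulse-energy fluctuation
    window = np.hanning(len(rel))                  # taper to reduce spectral leakage
    spec = np.abs(np.fft.rfft(rel * window))
    freqs = np.fft.rfftfreq(len(rel), d=1.0 / rep_rate_hz)
    return freqs, spec

# synthetic example: 1 kHz repetition rate with a weak 60 Hz ripple on the pulse energy
rate = 1000.0
t = np.arange(4096) / rate
energies = 1.0 + 0.01 * np.sin(2 * np.pi * 60.0 * t)
freqs, spec = pulse_noise_spectrum(energies, rate)
peak_hz = freqs[np.argmax(spec[1:]) + 1]           # skip the DC bin; the ripple shows up near 60 Hz
```

    Because each sample is one laser shot, the frequency axis naturally ends at half the repetition rate, exactly as the abstract describes.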

  18. Is Children's Acquisition of the Passive a Staged Process? Evidence from Six- and Nine-Year-Olds' Production of Passives

    ERIC Educational Resources Information Center

    Messenger, Katherine; Branigan, Holly P.; McLean, Janet F.

    2012-01-01

    We report a syntactic priming experiment that examined whether children's acquisition of the passive is a staged process, with acquisition of constituent structure preceding acquisition of thematic role mappings. Six-year-olds and nine-year-olds described transitive actions after hearing active and passive prime descriptions involving the same or…

  19. Sensor Data Acquisition and Processing Parameters for Human Activity Classification

    PubMed Central

    Bersch, Sebastian D.; Azzi, Djamel; Khusainov, Rinat; Achumba, Ifeyinwa E.; Ries, Jana

    2014-01-01

    It is known that parameter selection for data sampling frequency and segmentation techniques (including different methods and window sizes) has an impact on the classification accuracy. For Ambient Assisted Living (AAL), no clear information to select these parameters exists, hence a wide variety and inconsistency across today's literature is observed. This paper presents an empirical investigation of different data sampling rates, segmentation techniques and segmentation window sizes and their effect on the accuracy of Activity of Daily Living (ADL) event classification and computational load for two different accelerometer sensor datasets. The study is conducted using an ANalysis Of VAriance (ANOVA) based on 32 different window sizes, three different segmentation algorithms (with and without overlap, totaling six different parameter settings) and six sampling frequencies for nine common classification algorithms. The classification accuracy is based on a feature vector consisting of Root Mean Square (RMS), Mean, Signal Magnitude Area (SMA), Signal Vector Magnitude (here SMV), Energy, Entropy, FFTPeak, and Standard Deviation (STD). The results are presented alongside recommendations for parameter selection on the basis of the best-performing parameter combinations, which are identified by means of the corresponding Pareto curve. PMID:24599189
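
    The feature vector named above can be sketched for a single accelerometer window (an illustrative implementation; the paper's exact feature definitions may differ):

```python
import numpy as np

def window_features(acc, fs):
    """Feature vector for one tri-axial accelerometer window (rows = samples, cols = x, y, z)."""
    acc = np.asarray(acc, dtype=float)
    n = len(acc)
    svm = np.linalg.norm(acc, axis=1)              # signal vector magnitude per sample
    spec = np.abs(np.fft.rfft(svm)) ** 2           # power spectrum of the magnitude signal
    p = spec / spec.sum()
    return {
        "rms": np.sqrt(np.mean(svm ** 2)),
        "mean": svm.mean(),
        "sma": np.sum(np.abs(acc)) / n,            # signal magnitude area
        "energy": spec.sum() / n,
        "entropy": -np.sum(p * np.log2(p + 1e-12)),  # spectral entropy
        "fft_peak": np.fft.rfftfreq(n, 1 / fs)[np.argmax(spec[1:]) + 1],
        "std": svm.std(),
    }

# synthetic window: 4 s at 50 Hz, gravity on z plus a 2 Hz oscillation
fs = 50.0
t = np.arange(200) / fs
acc = np.column_stack([np.zeros(200), np.zeros(200),
                       1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t)])
feats = window_features(acc, fs)
```

    Sweeping window size and sampling rate, as the paper does, then amounts to recomputing this vector per configuration and comparing classifier accuracy against computational load.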

  20. A Future Vision of a Data Acquisition: Distributed Sensing, Processing, and Health Monitoring

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Solano, Wanda; Thurman, Charles; Schmalzel, John

    2000-01-01

    This paper presents a vision of a highly enhanced data acquisition and health monitoring system at the NASA Stennis Space Center (SSC) rocket engine test facility. This vision includes the use of advanced processing capabilities in conjunction with highly autonomous distributed sensing and intelligence to monitor and evaluate the health of data in the context of its associated process. This method is expected to significantly reduce data acquisition costs and improve system reliability. A Universal Signal Conditioning Amplifier (USCA) based system, under development at Kennedy Space Center, is being evaluated for adaptation to the SSC testing infrastructure. Kennedy's USCA architecture offers many advantages, including flexible and auto-configuring data acquisition with improved calibration and verifiability. Possible enhancements at SSC may include multiplexing the distributed USCAs to reduce per-channel cost, and the use of IEEE-485 to Allen-Bradley Control Net Gateways for interfacing with the resident control systems.

  1. Chemical process hazards analysis

    SciTech Connect

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  2. Ultimate Attainment in Second Language Acquisition: Near-Native Sentence Processing in Spanish

    ERIC Educational Resources Information Center

    Jegerski, Jill

    2010-01-01

    A study of near-native sentence processing was carried out using the self-paced reading method. Twenty-three near-native speakers of Spanish were identified on the basis of native-like proficiency, age of onset of acquisition after 15 years, and a minimum of three years ongoing residency in Spanish-speaking countries. The sentence comprehension…

  3. Development of a data acquisition and processing system for precision agriculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A data acquisition and processing system for precision agriculture was developed by using MapX5.0 and Visual C 6.0. This system can be used easily and quickly for drawing grid maps in-field, creating parameters for grid-reorganization, guiding in-field data collection, converting data between diffe...

  4. A Problem-Based Learning Model for Teaching the Instructional Design Business Acquisition Process.

    ERIC Educational Resources Information Center

    Kapp, Karl M.; Phillips, Timothy L.; Wanner, Janice H.

    2002-01-01

    Outlines a conceptual framework for using a problem-based learning model for teaching the Instructional Design Business Acquisition Process. Discusses writing a response to a request for proposal, developing a working prototype, orally presenting the solution, and the impact of problem-based learning on students' perception of their confidence in…

  5. Learning and Individual Differences: An Ability/Information-Processing Framework for Skill Acquisition. Final Report.

    ERIC Educational Resources Information Center

    Ackerman, Phillip L.

    A program of theoretical and empirical research focusing on the ability determinants of individual differences in skill acquisition is reviewed. An integrative framework for information-processing and cognitive ability determinants of skills is reviewed, along with principles for ability-skill relations. Experimental manipulations were used to…

  6. Development of a data acquisition and processing system for precision agriculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A data acquisition and processing system for precision agriculture was developed by using MapX5.0 and Visual C6.0. This system can be used easily and quickly for drawing grid maps in-field, making out parameters for grid-reorganization, guiding for in-field data collection, converting data between ...

  7. The Processing Cost of Reference Set Computation: Acquisition of Stress Shift and Focus

    ERIC Educational Resources Information Center

    Reinhart, Tanya

    2004-01-01

    Reference set computation -- the construction of a (global) comparison set to determine whether a given derivation is appropriate in context -- comes with a processing cost. I argue that this cost is directly visible at the acquisition stage: In those linguistic areas in which it has been independently established that such computation is indeed…

  8. Processes of Language Acquisition in Children with Autism: Evidence from Preferential Looking

    ERIC Educational Resources Information Center

    Swensen, Lauren D.; Kelley, Elizabeth; Fein, Deborah; Naigles, Letitia R.

    2007-01-01

    Two language acquisition processes (comprehension preceding production of word order, the noun bias) were examined in 2- and 3-year-old children (n=10) with autistic spectrum disorder and in typically developing 21-month-olds (n=13). Intermodal preferential looking was used to assess comprehension of subject-verb-object word order and the tendency…

  9. Optimizing Federal Fleet Vehicle Acquisitions: An Eleven-Agency FY 2012 Analysis

    SciTech Connect

    Singer, M.; Daley, R.

    2015-02-01

    This report focuses on the National Renewable Energy Laboratory's (NREL) fiscal year (FY) 2012 effort that used the NREL Optimal Vehicle Acquisition (NOVA) analysis to identify optimal vehicle acquisition recommendations for eleven diverse federal agencies. Results of the study show that by following a vehicle acquisition plan that maximizes the reduction in greenhouse gas (GHG) emissions, significant progress is also made toward the mandated complementary goals of acquiring alternative fuel vehicles, petroleum use reduction, and alternative fuel use increase.
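
    An acquisition plan that maximizes GHG reduction under a budget is, at its simplest, a knapsack-style selection problem. The greedy sketch below uses invented vehicles and numbers and is not the NOVA methodology:

```python
def plan_acquisitions(candidates, budget):
    """Greedy pick: highest GHG reduction per dollar first, until the budget is exhausted."""
    ranked = sorted(candidates, key=lambda c: c["ghg_reduction"] / c["cost"], reverse=True)
    plan, spent = [], 0
    for c in ranked:
        if spent + c["cost"] <= budget:
            plan.append(c["vehicle"])
            spent += c["cost"]
    return plan, spent

fleet = [  # costs in dollars, reductions in tons CO2e/year -- illustrative numbers only
    {"vehicle": "sedan->EV", "cost": 35000, "ghg_reduction": 4.1},
    {"vehicle": "pickup->hybrid", "cost": 28000, "ghg_reduction": 2.3},
    {"vehicle": "van->CNG", "cost": 31000, "ghg_reduction": 1.5},
]
plan, spent = plan_acquisitions(fleet, budget=65000)
```

    Prioritizing GHG reduction per dollar is also why such a plan tends to advance the complementary mandates the report mentions: the best-ratio replacements are usually alternative fuel vehicles that cut petroleum use.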

  10. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection processes... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 636.602-5 Short selection processes for contracts not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  11. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection processes... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 636.602-5 Short selection processes for contracts not to exceed the simplified acquisition threshold. The short selection process described in FAR...

  12. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1336.602-5 Short selection process... exceed the simplified acquisition threshold, either or both of the short selection processes set out...

  13. Wireless photoplethysmographic device for heart rate variability signal acquisition and analysis.

    PubMed

    Reyes, Ivan; Nazeran, Homer; Franco, Mario; Haltiwanger, Emily

    2012-01-01

    The photoplethysmographic (PPG) signal has the potential to aid in the acquisition and analysis of the heart rate variability (HRV) signal: a non-invasive quantitative marker of the autonomic nervous system that could be used to assess cardiac health and other physiologic conditions. A low-power wireless PPG device was custom-developed to monitor, acquire and analyze the arterial pulse in the finger. The system consisted of an optical sensor to detect the arterial pulse as variations in reflected light intensity, signal conditioning circuitry to process the reflected light signal, a microcontroller to control PPG signal acquisition, digitization and wireless transmission, and a receiver to collect the transmitted digital data and convert them back to their analog representations. A personal computer was used to further process the captured PPG signals and display them. A MATLAB program was then developed to capture the PPG data, detect the RR peaks, perform spectral analysis of the PPG data, and extract the HRV signal. A user-friendly graphical user interface (GUI) was developed in LabVIEW to display the PPG data and their spectra. The performance of each module (sensing unit, signal conditioning, wireless transmission/reception units, and graphical user interface) was assessed individually, and the device was then tested as a whole. Subsequently, PPG data were obtained from five healthy individuals to test the utility of the wireless system. The device was able to reliably acquire the PPG signals from the volunteers. To validate the accuracy of the MATLAB code, RR peak information from each subject was fed into the Kubios software as a text file. Kubios generated a report sheet with the time-domain and frequency-domain parameters of the acquired data. These features were then compared against those calculated by MATLAB. The preliminary results demonstrate that the prototype wireless device could be used to perform HRV signal acquisition and analysis. PMID:23366333
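
    The RR-peak detection and time-domain HRV metrics described above can be sketched as follows (a simplified illustration on a synthetic pulse wave, not the authors' MATLAB code):

```python
import numpy as np

def detect_peaks(sig, fs, refractory=0.4):
    """Naive pulse-peak picker: local maxima above the mean, separated by a refractory period."""
    sig = np.asarray(sig, dtype=float)
    thresh = sig.mean()
    min_gap = int(refractory * fs)
    peaks, last = [], -min_gap
    for i in range(1, len(sig) - 1):
        if sig[i] > thresh and sig[i] >= sig[i - 1] and sig[i] > sig[i + 1] and i - last >= min_gap:
            peaks.append(i)
            last = i
    return np.array(peaks)

def hrv_time_domain(peaks, fs):
    """Standard time-domain HRV metrics from beat locations."""
    rr = np.diff(peaks) / fs * 1000.0              # inter-beat intervals in ms
    return {"mean_rr": rr.mean(),
            "sdnn": rr.std(ddof=1),                # overall variability
            "rmssd": np.sqrt(np.mean(np.diff(rr) ** 2))}  # beat-to-beat variability

# synthetic 72 beats-per-minute pulse wave, 10 s at 100 Hz
fs = 100.0
t = np.arange(int(10 * fs)) / fs
ppg = np.sin(2 * np.pi * 1.2 * t)
metrics = hrv_time_domain(detect_peaks(ppg, fs), fs)
```

    Validation against Kubios, as in the study, would then compare these time-domain values against the report sheet computed from the same peak list.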

  14. Self-organizing Symbol Acquisition and Motion Generation based on Dynamics-based Information Processing System

    NASA Astrophysics Data System (ADS)

    Okada, Masafumi; Nakamura, Daisuke; Nakamura, Yoshihiko

    Symbol acquisition and manipulation abilities are among the inherent characteristics that distinguish human beings from other creatures. In this paper, based on the recurrent self-organizing map and a dynamics-based information processing system, we propose a dynamics-based self-organizing map (DBSOM). This method enables the design of a topological map from time-sequence data, which allows recognition and generation of robot motion. Using this method, we design a self-organizing symbol acquisition system and a motion generation system for a humanoid robot. By implementing DBSOM on the robot in the real world, we realize symbol acquisition from experimental data and investigate the spatial properties of the obtained DBSOM.
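
    For reference, the conventional self-organizing map update on which DBSOM builds can be sketched as follows (a plain 1-D SOM trained on motion-window vectors; the dynamics-based extensions of the paper are not reproduced here):

```python
import numpy as np

def train_som(data, n_nodes=8, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Plain 1-D self-organizing map; each node's weight vector is one motion-window prototype."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_nodes, data.shape[1])) * 0.1
    for ep in range(epochs):
        lr = lr0 * (1 - ep / epochs)                         # decaying learning rate
        sigma = max(sigma0 * (1 - ep / epochs), 0.5)         # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(w - x, axis=1))   # best-matching unit
            d = np.abs(np.arange(n_nodes) - bmu)             # distance along the map
            h = np.exp(-d ** 2 / (2 * sigma ** 2))           # neighbourhood function
            w += lr * h[:, None] * (x - w)
    return w

# two artificial "motions": windows clustered around opposite corners of feature space
rng = np.random.default_rng(2)
data = np.vstack([rng.normal(+1.0, 0.05, size=(30, 3)),
                  rng.normal(-1.0, 0.05, size=(30, 3))])
som = train_som(data)
# distinct motions map to distinct nodes, i.e. distinct "symbols"
node_a = int(np.argmin(np.linalg.norm(som - np.ones(3), axis=1)))
node_b = int(np.argmin(np.linalg.norm(som + np.ones(3), axis=1)))
```

    In the paper's setting, the node that wins for a given time sequence plays the role of the acquired symbol, and the map's topology supports generating motion from a chosen symbol.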

  15. Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process

    NASA Technical Reports Server (NTRS)

    Macko, Joseph A., Jr.

    2000-01-01

    The United States Army Materiel Command, Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. It has established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide the decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources, and prioritizes pollution prevention (P2) efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as an alternative to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) during the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process to evaluate new chemicals and materials prior to acceptance as an alternative. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC mission.

  16. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1036.602-5 Short selection process... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process... process....

  17. Performance of a VME-based parallel processing LIDAR data acquisition system (summary)

    SciTech Connect

    Moore, K.; Buttler, B.; Caffrey, M.; Soriano, C.

    1995-05-01

    It may be possible to make accurate, real-time, autonomous 2- and 3-dimensional wind measurements remotely with an elastic backscatter Light Detection and Ranging (LIDAR) system by incorporating digital parallel processing hardware into the data acquisition system. In this paper, we report the performance of a commercially available digital parallel processing system in implementing the maximum correlation technique for wind sensing using actual LIDAR data. Timing and numerical accuracy are benchmarked against a standard microprocessor implementation.
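
    The maximum correlation technique estimates displacement by locating the lag that maximizes the cross-correlation between successive backscatter profiles; dividing that lag by the scan interval gives a velocity. A one-dimensional sketch with synthetic data (not the benchmarked implementation):

```python
import numpy as np

def estimate_shift(profile_a, profile_b):
    """Lag (in range bins) that maximizes the cross-correlation of two backscatter profiles."""
    a = profile_a - profile_a.mean()
    b = profile_b - profile_b.mean()
    corr = np.correlate(b, a, mode="full")         # lags run from -(n-1) to n-1
    return int(np.argmax(corr)) - (len(a) - 1)

# an aerosol feature that drifts 3 range bins between successive scans
rng = np.random.default_rng(1)
scan1 = rng.standard_normal(256)
scan2 = np.roll(scan1, 3)
shift = estimate_shift(scan1, scan2)
# wind speed would then be shift * bin_size / scan_interval
```

    The parallel-processing question in the paper is how fast this correlation search can be run over many range gates at once while matching the numerical accuracy of a serial implementation.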

  18. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and their second, for a total of 10 months of instruction. The study aimed to characterize modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps right hemispheric recruitment was observed, with increasing left-lateralization, which is similar to other native signers and L2 learners of spoken language; however, specialization for sign language processing with activation in the inferior parietal lobule (i.e., angular gyrus), even for late learners, was observed. As such, the present study is the first to track L2 acquisition of sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing. PMID:26720258

  19. Advances in diffusion MRI acquisition and processing in the Human Connectome Project.

    PubMed

    Sotiropoulos, Stamatios N; Jbabdi, Saad; Xu, Junqian; Andersson, Jesper L; Moeller, Steen; Auerbach, Edward J; Glasser, Matthew F; Hernandez, Moises; Sapiro, Guillermo; Jenkinson, Mark; Feinberg, David A; Yacoub, Essa; Lenglet, Christophe; Van Essen, David C; Ugurbil, Kamil; Behrens, Timothy E J

    2013-10-15

    The Human Connectome Project (HCP) is a collaborative 5-year effort to map human brain connections and their variability in healthy adults. A consortium of HCP investigators will study a population of 1200 healthy adults using multiple imaging modalities, along with extensive behavioral and genetic data. In this overview, we focus on diffusion MRI (dMRI) and the structural connectivity aspect of the project. We present recent advances in acquisition and processing that allow us to obtain very high-quality in-vivo MRI data, whilst enabling scanning of a very large number of subjects. These advances result from 2 years of intensive efforts in optimising many aspects of data acquisition and processing during the piloting phase of the project. The data quality and methods described here are representative of the datasets and processing pipelines that will be made freely available to the community at quarterly intervals, beginning in 2013. PMID:23702418

  20. Advances in diffusion MRI acquisition and processing in the Human Connectome Project

    PubMed Central

Sotiropoulos, Stamatios N; Jbabdi, Saad; Xu, Junqian; Andersson, Jesper L; Moeller, Steen; Auerbach, Edward J; Glasser, Matthew F; Hernandez, Moises; Sapiro, Guillermo; Jenkinson, Mark; Feinberg, David A; Yacoub, Essa; Lenglet, Christophe; Van Essen, David C; Ugurbil, Kamil; Behrens, Timothy EJ

    2013-01-01

    The Human Connectome Project (HCP) is a collaborative 5-year effort to map human brain connections and their variability in healthy adults. A consortium of HCP investigators will study a population of 1200 healthy adults using multiple imaging modalities, along with extensive behavioral and genetic data. In this overview, we focus on diffusion MRI (dMRI) and the structural connectivity aspect of the project. We present recent advances in acquisition and processing that allow us to obtain very high-quality in-vivo MRI data, while enabling scanning of a very large number of subjects. These advances result from 2 years of intensive efforts in optimising many aspects of data acquisition and processing during the piloting phase of the project. The data quality and methods described here are representative of the datasets and processing pipelines that will be made freely available to the community at quarterly intervals, beginning in 2013. PMID:23702418

  1. Professional identity acquisition process model in interprofessional education using structural equation modelling: 10-year initiative survey.

    PubMed

    Kururi, Nana; Tozato, Fusae; Lee, Bumsuk; Kazama, Hiroko; Katsuyama, Shiori; Takahashi, Maiko; Abe, Yumiko; Matsui, Hiroki; Tokita, Yoshiharu; Saitoh, Takayuki; Kanaizumi, Shiomi; Makino, Takatoshi; Shinozaki, Hiromitsu; Yamaji, Takehiko; Watanabe, Hideomi

    2016-01-01

    The mandatory interprofessional education (IPE) programme at Gunma University, Japan, was initiated in 1999. A questionnaire of 10 items to assess the students' understanding of the IPE training programme has been distributed since then, and factor analysis of the responses revealed that it was categorised into four subscales, i.e. "professional identity", "structure and function of training facilities", "teamwork and collaboration", and "role and responsibilities", and suggested that these subscales may be taken into account in developing an IPE programme with clinical training. The purpose of this study was to examine the professional identity acquisition process (PIAP) model in IPE using structural equation modelling (SEM). Overall, 1,581 respondents of a possible 1,809 students from the departments of nursing, laboratory sciences, physical therapy, and occupational therapy completed the questionnaire. The SEM technique was utilised to construct a PIAP model of the relationships among the four factors. The original PIAP model showed that "professional identity" was predicted by two factors, namely "role and responsibilities" and "teamwork and collaboration". These two factors were in turn predicted by the factor "structure and function of training facilities". The same structure was observed in nursing and physical therapy students' PIAP models, but it was not completely the same in laboratory sciences and occupational therapy students' PIAP models. A parallel but not isolated curriculum on expertise unique to each profession, which may help students understand their professional identity in combination with learning collaboration, may be necessary. PMID:26930464

  2. Micro-MRI-based image acquisition and processing system for assessing the response to therapeutic intervention

    NASA Astrophysics Data System (ADS)

    Vasilić, B.; Ladinsky, G. A.; Saha, P. K.; Wehrli, F. W.

    2006-03-01

    Osteoporosis is the cause of over 1.5 million bone fractures annually. Most of these fractures occur in sites rich in trabecular bone, a complex network of bony struts and plates found throughout the skeleton. The three-dimensional structure of the trabecular bone network significantly determines mechanical strength and thus fracture resistance. Here we present a data acquisition and processing system that allows efficient noninvasive assessment of trabecular bone structure through a "virtual bone biopsy". High-resolution MR images are acquired, from which the trabecular bone network is extracted by estimating the partial bone occupancy of each voxel. A heuristic voxel subdivision increases the effective resolution of the bone volume fraction map and serves as a basis for subsequent analysis of topological and orientational parameters. Semi-automated registration and segmentation ensure selection of the same anatomical location in subjects imaged at different time points during treatment. It is shown, with excerpts from an ongoing clinical study of early post-menopausal women, that a significant reduction in network connectivity occurs in the control group while structural integrity is maintained in the hormone replacement group. The system described should be suited for large-scale studies designed to evaluate the efficacy of therapeutic intervention in subjects with metabolic bone disease.

  3. Meteoceanographic premises for structural design purposes in the Adriatic Sea: Acquisition and processing of data

    SciTech Connect

    Rampolli, M.; Biancardi, A.; Filippi, G. De

    1996-12-31

    In 1993 the leading international standards (ISO, API RP2A) for the design of offshore structures drastically changed the procedure for the definition of hydrodynamic forces. In particular, oil companies are required to have a detailed knowledge of the weather of the areas where they operate if they want to maintain the previous results. Alternatively, more conservative hydrodynamic forces must be considered in the design phase. Such an increase, estimated at 20-30% of the total hydrodynamic force, means heavier platform structures in new projects, and more critical elements to be inspected in existing platforms. In 1992, in order to have more reliable and safe transport to and from the platforms, Agip installed a meteo-marine sensor network in the Adriatic Sea, on 13 of the more than 80 producing platforms. Data collected are sent to shore via radio, and operators can use real-time data or a 12-hour wave forecast obtained by a statistical forecasting model. Taking advantage of these existing instruments, a project was undertaken in 1993 with the purpose of determining the extreme environmental parameters to be used by structural engineers. The network has been upgraded in order to acquire directional information on the waves and to permit short-term analysis. This paper describes the data acquisition system, data processing, and the achieved results.

  4. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants.

    PubMed

    Navarro, Pedro J; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-01-01

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions and trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation. PMID:27164103
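
    Of the three classifiers tested, kNN is the simplest to sketch. The toy example below (invented RGB pixel features, not the paper's dataset) shows the majority-vote rule used for pixel classification:

```python
import numpy as np

def knn_predict(train_X, train_y, query_X, k=3):
    """Minimal k-nearest-neighbour classifier: Euclidean distance, majority vote."""
    preds = []
    for q in query_X:
        dists = np.linalg.norm(train_X - q, axis=1)
        nearest = train_y[np.argsort(dists)[:k]]   # labels of the k closest training pixels
        preds.append(int(np.bincount(nearest).argmax()))
    return preds

# toy (R, G, B) pixel features: greenish = plant (1), brownish soil = background (0)
train_X = np.array([[0.10, 0.60, 0.10], [0.20, 0.70, 0.20], [0.15, 0.65, 0.10],
                    [0.50, 0.40, 0.30], [0.55, 0.45, 0.35], [0.60, 0.50, 0.40]])
train_y = np.array([1, 1, 1, 0, 0, 0])
labels = knn_predict(train_X, train_y,
                     np.array([[0.12, 0.68, 0.15], [0.58, 0.47, 0.33]]))
```

    Run per pixel, such a classifier yields a plant/background segmentation mask; the study's finding is that the best algorithm may differ between RGB and NIR imagery.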

  5. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants

    PubMed Central

    Navarro, Pedro J.; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-01-01

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions and trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation. PMID:27164103

  6. Storage-Retrieval Analysis of Paired-Associate Acquisition.

    ERIC Educational Resources Information Center

    Chechile, Richard A.; Gordon, Tracey

    A study was performed to investigate the storage and retrieval dynamics that occur during paired-associate acquisition by means of the storage-retrieval separation technique discussed recently by Chechile & Meyer (1976). Thirty subjects learned an 18-item paired-associate list to a criterion of three perfect trials. In the test phase of each…

  7. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Short selection process... Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  8. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1336.602-5 Short selection process... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section...

  9. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process... CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 736.602-5 Short selection process for procurements not to exceed the simplified acquisition threshold. References to FAR...

  10. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process... Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-5 Short selection process...

  11. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection process... Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  12. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1336.602-5 Short selection process... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section...

  13. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process... CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 736.602-5 Short selection process for procurements not to exceed the simplified acquisition threshold. References to FAR...

  14. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal... AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 836.602-5 Short selection process...

  15. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  16. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Service 436.602-5 Short selection process for contracts...

  17. Distributed real time data processing architecture for the TJ-II data acquisition system

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; Barrera, E.; López, S.; Machón, D.; Vega, J.; Sánchez, E.

    2004-10-01

    This article describes the performance of a new architecture model developed for the TJ-II data acquisition system in order to increase its real-time data processing capabilities. The current model consists of several PCI eXtensions for Instrumentation (PXI) standard chassis, each with various digitizers. In this architecture, the data processing capability is restricted by the PXI controller's own performance, since the controller must share its CPU resources between data processing and data acquisition tasks. In the new model, a distributed data processing architecture has been developed. The solution adds one or more processing cards to each PXI chassis. This way it is possible to plan how to distribute the processing of all acquired signals among the processing cards and the available resources of the PXI controller. This model allows scalability of the system: processing cards can be added or removed based on the requirements of the system. The processing algorithms are implemented in LabVIEW (from National Instruments), providing efficient and time-saving application development.
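    The planning of how per-signal processing is spread across processing cards can be illustrated, in spirit only, by a host-side Python sketch that farms per-channel work out to a worker pool. The channel data and the RMS computation below are hypothetical stand-ins, not the TJ-II code; a thread pool plays the role of the processing cards.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def process_channel(samples):
    """Stand-in per-channel computation (here: RMS of the signal)."""
    return float(np.sqrt(np.mean(samples ** 2)))

def distribute(channels, n_workers=4):
    """Spread the per-channel work across a pool of workers, mirroring
    how acquired signals could be assigned to separate processing cards."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(process_channel, channels))

# Hypothetical acquisition: 8 channels of 1000 samples each
rng = np.random.default_rng(0)
channels = [rng.standard_normal(1000) for _ in range(8)]
rms_values = distribute(channels)
```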

  18. Airborne Wind Profiling With the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2012-01-01

    A pulsed 2-micron coherent Doppler lidar system from NASA Langley Research Center in Virginia flew on NASA's DC-8 aircraft during the NASA Genesis and Rapid Intensification Processes (GRIP) campaign in the summer of 2010. The participation was part of the Doppler Aerosol Wind Lidar (DAWN) Air project. Selected results of airborne wind profiling are presented and compared with dropsonde data for verification purposes. Panoramic presentations of different wind parameters over a nominal observation time span are also presented for selected GRIP data sets. The real-time data acquisition and analysis software employed during the GRIP campaign is introduced along with its unique features.

  19. Characterization of digital signal processing in the DiDAC data acquisition system

    SciTech Connect

    Parson, J.D.; Olivier, T.L.; Habbersett, R.C.; Martin, J.C.; Wilder, M.E.; Jett, J.H. )

    1993-01-01

    A new-generation data acquisition system for flow cytometers has been constructed. This Digital Data Acquisition and Control (DiDAC) system is based on the VME architecture and uses both the standard VME bus and a private bus for system communication and data transfer. At the front end of the system is a free-running 20 MHz ADC. The output of a detector preamp provides the signal for digitization. The digitized waveform is passed to a custom-built digital signal processing circuit that extracts the height, width, and integral of the waveform. Calculation of these parameters is started (and stopped) when the waveform exceeds (and falls below) a preset threshold value. The free-running ADC is specified to have 10-bit accuracy at 25 MHz. The authors have characterized it and compared the results with those obtained using conventional analog signal processing followed by digitization. Comparisons are made between the two approaches in terms of measurement CV, linearity, and other aspects.
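    A minimal software model of the pulse-parameter extraction the abstract attributes to the DSP circuit (height, width, and integral of the waveform while it is above a preset threshold) can be sketched in NumPy; the sample waveform, threshold, and 50 ns sample period are invented for illustration.

```python
import numpy as np

def pulse_features(waveform, threshold, dt):
    """Extract height, width, and integral of the portion of a digitized
    waveform above a preset threshold, mimicking the described DSP stage."""
    above = waveform > threshold
    if not above.any():
        return None
    idx = np.flatnonzero(above)
    start, stop = idx[0], idx[-1]          # first/last sample over threshold
    height = waveform[start:stop + 1].max()
    width = (stop - start + 1) * dt
    integral = waveform[start:stop + 1].sum() * dt
    return height, width, integral

# Hypothetical 20 MHz digitisation (50 ns samples) of a triangular pulse
dt = 50e-9
waveform = np.array([0.0, 0.1, 0.5, 1.0, 2.0, 1.0, 0.5, 0.1, 0.0])
height, width, integral = pulse_features(waveform, threshold=0.3, dt=dt)
```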

  20. Processes of language acquisition in children with autism: evidence from preferential looking.

    PubMed

    Swensen, Lauren D; Kelley, Elizabeth; Fein, Deborah; Naigles, Letitia R

    2007-01-01

    Two language acquisition processes (comprehension preceding production of word order, the noun bias) were examined in 2- and 3-year-old children (n=10) with autistic spectrum disorder and in typically developing 21-month-olds (n=13). Intermodal preferential looking was used to assess comprehension of subject-verb-object word order and the tendency to map novel words onto objects rather than actions. Spontaneous speech samples were also collected. Results demonstrated significant comprehension of word order in both groups well before production. Moreover, children in both groups consistently showed the noun bias. Comprehension preceding production and the noun bias appear to be robust processes of language acquisition, observable in both typical and language-impaired populations. PMID:17381789

  1. An extended-source spatial acquisition process based on maximum likelihood criterion for planetary optical communications

    NASA Technical Reports Server (NTRS)

    Yan, Tsun-Yee

    1992-01-01

    This paper describes an extended-source spatial acquisition process based on the maximum likelihood criterion for interplanetary optical communications. The objective is to use the sun-lit Earth image as a receiver beacon and point the transmitter laser to the Earth-based receiver to establish a communication path. The process assumes the existence of a reference image. The uncertainties between the reference image and the received image are modeled as additive white Gaussian disturbances. It has been shown that the optimal spatial acquisition requires solving two nonlinear equations to estimate the coordinates of the transceiver from the received camera image in the transformed domain. The optimal solution can be obtained iteratively by solving two linear equations. Numerical results using a sample sun-lit Earth as a reference image demonstrate that sub-pixel resolutions can be achieved in a high disturbance environment. Spatial resolution is quantified by Cramer-Rao lower bounds.
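    The paper's estimator solves nonlinear equations iteratively; as a generic illustration of how sub-pixel resolution can be squeezed out of a sampled correlation surface, the sketch below refines an integer correlation peak with a three-point parabolic fit in 1-D. The Gaussian "beacon", the shift value, and the brute-force correlation are illustrative assumptions, not the paper's maximum-likelihood algorithm.

```python
import numpy as np

def subpixel_peak(corr):
    """Refine the integer argmax of a 1-D correlation curve with a
    three-point parabolic fit, giving a sub-pixel peak position."""
    i = int(np.argmax(corr))
    if i == 0 or i == len(corr) - 1:
        return float(i)
    y0, y1, y2 = corr[i - 1], corr[i], corr[i + 1]
    return i + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)

# Hypothetical 1-D example: a Gaussian 'beacon' offset by a fractional shift
x = np.arange(64, dtype=float)
signal = np.exp(-0.5 * ((x - 30.3) / 2.5) ** 2)      # observed, centred at 30.3
reference = np.exp(-0.5 * ((x - 10.0) / 2.5) ** 2)   # reference, centred at 10

# Cross-correlate via circular shifts of the reference (brute force for clarity)
corr = np.array([np.sum(signal * np.roll(reference, s)) for s in range(64)])
estimate = subpixel_peak(corr)                        # true shift is 20.3
```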

  2. Summary of the activities of the subgroup on data acquisition and processing

    SciTech Connect

    Connolly, P.L.; Doughty, D.C.; Elias, J.E.

    1981-01-01

    A data acquisition and handling subgroup consisting of approximately 20 members met during the 1981 ISABELLE summer study. Discussions were led by members of the BNL ISABELLE Data Acquisition Group (DAG) with lively participation from outside users. Particularly large contributions were made by representatives of BNL experiments 734, 735, and the MPS, as well as the Fermilab Colliding Detector Facility and the SLAC LASS Facility. In contrast to the 1978 study, the subgroup did not divide its activities into investigations of various individual detectors, but instead attempted to review the current state of the art in the data acquisition, trigger processing, and data handling fields. A series of meetings first reviewed individual pieces of the problem, including the status of the Fastbus Project, the Nevis trigger processor, the SLAC 168/E and 3081/E emulators, and efforts within DAG. Additional meetings dealt with the questions involved in specifying and building complete data acquisition systems. For any given problem, a series of possible solutions was proposed by the members of the subgroup. In general, any given solution had both advantages and disadvantages, and there was never any consensus on which approach was best. However, there was agreement that certain problems could only be handled by systems of a given power or greater. What is given here is a review of various solutions with their associated powers, costs, advantages, and disadvantages.

  3. Digital signal processing and data acquisition employing diode lasers for lidar-hygrometer

    NASA Astrophysics Data System (ADS)

    Naboko, Sergei V.; Pavlov, Lyubomir Y.; Penchev, Stoyan P.; Naboko, Vassily N.; Pencheva, Vasilka H.; Donchev, T.

    2003-11-01

    The paper addresses novel aspects of applying laser radar (lidar) to differential absorption spectroscopy and atmospheric gas monitoring, with emphasis on the advantages of the class of powerful pulsed laser diodes. The task of determining atmospheric humidity, a major greenhouse gas, and the measurement demands it sets match the potential of the acquisition system well. The projected system is designed by delegating operations to a Digital Signal Processing (DSP) module, allowing preservation of the informative part of the signal through real-time pre-processing and subsequent post-processing on a personal computer.

  4. Integrating data acquisition and offline processing systems for small experiments at Fermilab

    SciTech Connect

    Streets, J.; Corbin, B.; Taylor, C.

    1995-10-01

    Two small experiments at Fermilab are using the large UNIX central computing facility at Fermilab (FNALU) to analyze data. The data acquisition systems are based on "off the shelf" software packages utilizing VAX/VMS computers and CAMAC readout. As the disk space available on FNALU approaches the size of the raw data sets taken by the experiments (50 Gbytes), we have used the Andrew File System (AFS) to serve the data to experimenters for analysis.

  5. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering the fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  6. Signal Processing, Analysis, & Display

    SciTech Connect

    Lager, Darrell; Azevado, Stephen

    1986-06-01

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
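    Several of SIG's spectral operations (windowing, spectral density) can be sketched with a plain NumPy periodogram. The normalisation convention and the test signal below are illustrative choices, not SIG's actual implementation.

```python
import numpy as np

def periodogram(x, fs, window="hamming"):
    """One-sided power spectral density of a real signal using a
    windowed FFT (the window reduces spectral leakage)."""
    w = np.hamming(len(x)) if window == "hamming" else np.ones(len(x))
    xw = (x - x.mean()) * w                 # remove DC, apply window
    X = np.fft.rfft(xw)
    # Normalise by window power so the PSD scale is window-independent
    psd = (np.abs(X) ** 2) / (fs * np.sum(w ** 2))
    psd[1:-1] *= 2                          # fold negative frequencies
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return freqs, psd

# Hypothetical test signal: 50 Hz sine in white noise, 1 kHz sampling
fs = 1000.0
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
freqs, psd = periodogram(x, fs)
peak_freq = freqs[np.argmax(psd)]
```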

  7. Hardware System for Real-Time EMG Signal Acquisition and Separation Processing during Electrical Stimulation.

    PubMed

    Hsueh, Ya-Hsin; Yin, Chieh; Chen, Yan-Hong

    2015-09-01

    The study aimed to develop a real-time electromyography (EMG) signal acquisition and processing device that can acquire signals during electrical stimulation. Since the electrical stimulation output can affect EMG signal acquisition, integrating the two elements into one system requires modifying the EMG signal transmission and processing method. The whole system was designed in a user-friendly and flexible manner. For EMG signal processing, the system applied an Altera Field Programmable Gate Array (FPGA) as the core to process the real-time hybrid EMG signal instantly and output the isolated signal in a highly efficient way. The system used the power spectral density to evaluate the accuracy of signal processing, and the cross-correlation showed that the delay of real-time processing was only 250 μs. PMID:26210898
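    The delay measurement via cross-correlation can be sketched as follows; the sampling rate, signals, and the injected 4-sample delay are hypothetical, not the device's data.

```python
import numpy as np

def estimate_delay(reference, delayed, fs):
    """Estimate the lag (in seconds) of `delayed` relative to `reference`
    from the peak of their full cross-correlation."""
    corr = np.correlate(delayed, reference, mode="full")
    lag = np.argmax(corr) - (len(reference) - 1)   # centre of 'full' output is lag 0
    return lag / fs

# Hypothetical signals: 4 kHz sampling, 1 ms (4-sample) processing delay
fs = 4000.0
rng = np.random.default_rng(2)
x = rng.standard_normal(1000)
y = np.concatenate([np.zeros(4), x[:-4]])   # x delayed by 4 samples
delay = estimate_delay(x, y, fs)
```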

  8. Age Effects on the Process of L2 Acquisition? Evidence from the Acquisition of Negation and Finiteness in L2 German

    ERIC Educational Resources Information Center

    Dimroth, Christine

    2008-01-01

    It is widely assumed that ultimate attainment in adult second language (L2) learners often differs quite radically from ultimate attainment in child L2 learners. This article addresses the question of whether learners at different ages also show qualitative differences in the process of L2 acquisition. Longitudinal production data from two…

  9. APNEA list mode data acquisition and real-time event processing

    SciTech Connect

    Hogle, R.A.; Miller, P.; Bramblett, R.L.

    1997-11-01

    The LMSC Active Passive Neutron Examinations and Assay (APNEA) Data Logger is a VME-based data acquisition system using commercial off-the-shelf hardware with application-specific software. It receives TTL inputs from eighty-eight ³He detector tubes and eight timing signals. Two data sets are generated concurrently for each acquisition session: (1) a List Mode recording of all detector and timing signals, timestamped to 3-microsecond resolution; (2) Event Accumulations generated in real time by counting events into short (tens of microseconds) and long (seconds) time bins following repetitive triggers. List Mode data sets can be post-processed to: (1) determine the optimum time bins for TRU assay of waste drums, (2) analyze a given data set in several ways to match different assay requirements and conditions, and (3) confirm assay results by examining details of the raw data. Data Logger events are processed and timestamped by an array of 15 TMS320C40 DSPs and delivered to an embedded controller (PowerPC 604) for interim disk storage. Three acquisition modes, corresponding to different trigger sources, are provided. A standard network interface to a remote host system (Windows NT or SunOS) provides for system control, status, and transfer of previously acquired data. 6 figs.
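    The accumulation of events into time bins relative to repetitive triggers can be sketched as a post-processing step on list-mode timestamps. The event times, trigger times, and bin edges below are invented for illustration.

```python
import numpy as np

def bin_events(event_times, trigger_times, bin_edges):
    """Accumulate list-mode event timestamps into time bins measured
    relative to each repetitive trigger, summing over all triggers."""
    counts = np.zeros(len(bin_edges) - 1, dtype=int)
    for trig in trigger_times:
        rel = event_times - trig                       # time since this trigger
        counts += np.histogram(rel, bins=bin_edges)[0]
    return counts

# Hypothetical list-mode data: three events shortly after each trigger
triggers = np.array([0.0, 1.0, 2.0])                   # seconds
events = np.concatenate([t + np.array([1e-5, 3e-5, 8e-5]) for t in triggers])
edges = np.array([0.0, 2e-5, 5e-5, 1e-4])              # short time bins (s)
counts = bin_events(events, triggers, edges)
```

    Re-running the same post-processing with different `edges` is one way the optimum time bins for assay could be searched for offline.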

  10. Automated system for acquisition and image processing for the control and monitoring boned nopal

    NASA Astrophysics Data System (ADS)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for image acquisition and processing to control the removal of thorns from the nopal vegetable (Opuntia ficus indica) in an automated machine that uses pulses of an Nd:YAG laser. The areolas, the areas where thorns grow on the bark of the nopal, are located by applying segmentation algorithms to the images obtained by a CCD. Once the position of the areolas is known, their coordinates are sent to a motor system that steers the laser to interact with all areolas and remove the thorns from the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the tasks of acquisition, preprocessing, segmentation, recognition, and interpretation of the areolas. The system thus identifies the areolas and generates a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
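    A minimal stand-in for the segmentation step that locates areolas and emits a coordinate table: threshold the image, label 4-connected blobs by flood fill, and report each blob's centroid. The toy image and threshold are hypothetical; the machine's actual algorithms are not described at this level of detail.

```python
import numpy as np
from collections import deque

def blob_centroids(mask):
    """Label 4-connected foreground blobs in a boolean mask and return
    the centroid (row, col) of each blob as a coordinate table."""
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                queue, pixels = deque([(r, c)]), []
                visited[r, c] = True
                while queue:                          # BFS flood fill of one blob
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

# Hypothetical thresholded image with two 'areolas'
img = np.zeros((8, 8))
img[1:3, 1:3] = 1.0      # blob A
img[5:7, 4:7] = 1.0      # blob B
coords = blob_centroids(img > 0.5)
```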

  11. Autonomous Closed-Loop Tasking, Acquisition, Processing, and Evaluation for Situational Awareness Feedback

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Dan; Cappelaere, Pat

    2016-01-01

    This presentation describes the closed-loop satellite autonomy methods used to connect users and the assets on Earth Orbiter-1 (EO-1) and similar satellites. The base layer is a distributed architecture based on the Goddard Mission Services Evolution Concept (GMSEC), so each asset remains under independent control. Situational awareness is provided by a middleware layer through a common Application Programmer Interface (API) to GMSEC components developed at GSFC. Users set up their own tasking requests and receive views into immediate past acquisitions in their area of interest and into future acquisition feasibilities across all assets. Automated notifications via pub/sub feeds are returned to users containing published links to image footprints, algorithm results, and full data sets. Theme-based algorithms are available on demand for processing.

  12. A prototype data acquisition and processing system for Schumann resonance measurements

    NASA Astrophysics Data System (ADS)

    Tatsis, Giorgos; Votis, Constantinos; Christofilakis, Vasilis; Kostarakis, Panos; Tritakis, Vasilis; Repapis, Christos

    2015-12-01

    In this paper, a cost-effective prototype data acquisition system specifically designed for Schumann resonance measurements, together with an adequate signal processing method, is described in detail. The implemented system captures the magnetic component of the Schumann resonance signal, using a magnetic antenna, at much higher sampling rates than the Nyquist rate for efficient signal improvement. In order to obtain the characteristics of the individual resonances of the SR spectrum, new and efficient software was developed. The processing techniques used in this software are analyzed thoroughly below. The system's performance and operation are evaluated using preliminary measurements taken in the region of Northwest Greece.
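    One common way to extract resonance characteristics from such records is to pick local maxima of a windowed spectrum; the sketch below does this for a synthetic two-mode signal. The mode frequencies, sampling rate, and peak criterion are illustrative assumptions, not the software described in the paper.

```python
import numpy as np

def spectrum_peaks(x, fs, fmax=40.0):
    """Return frequencies of significant local maxima in the magnitude
    spectrum below `fmax` (a crude stand-in for SR mode extraction)."""
    X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    f = np.fft.rfftfreq(len(x), 1 / fs)
    keep = f <= fmax
    X, f = X[keep], f[keep]
    # bins larger than both neighbours and not negligible
    local = (X[1:-1] > X[:-2]) & (X[1:-1] > X[2:]) & (X[1:-1] > 0.1 * X.max())
    return f[1:-1][local]

# Hypothetical SR-like record: modes near 7.8 and 14.1 Hz, 100 Hz sampling
fs = 100.0
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 7.8 * t) + 0.5 * np.sin(2 * np.pi * 14.1 * t)
peaks = spectrum_peaks(x, fs)
```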

  13. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  14. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1036.602-5 Short selection...

  15. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5... CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 736.602-5...

  16. Big Data Analysis of Manufacturing Processes

    NASA Astrophysics Data System (ADS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
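    As a baseline illustration of automatic anomaly detection on process data (far simpler than the self-learning assistance system described), a rolling z-score detector can flag sudden deviations from recent behaviour. The sensor stream, window, and threshold below are hypothetical.

```python
import numpy as np

def rolling_anomalies(x, window=50, thresh=4.0):
    """Flag samples that deviate more than `thresh` standard deviations
    from the mean of the preceding `window` samples."""
    flags = np.zeros(len(x), dtype=bool)
    for i in range(window, len(x)):
        hist = x[i - window:i]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(x[i] - mu) > thresh * sigma:
            flags[i] = True
    return flags

# Hypothetical sensor stream with one injected fault
rng = np.random.default_rng(3)
signal = rng.standard_normal(500)
signal[300] += 12.0                  # sudden process fault
flags = rolling_anomalies(signal)
```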

  18. Acquisition and Processing of Multi-source Technique Offshore with Different Types of Source

    NASA Astrophysics Data System (ADS)

    Li, L.; Tong, S.; Zhou, H. W.

    2015-12-01

    Multi-source blended offshore seismic acquisition has been developed in recent years. The technology aims to improve the efficiency of acquisition or to enhance image quality through dense spatial sampling. Previous methods usually use several sources of the same type; we propose applying sources with different central frequencies to image multiscale target layers at different depths. A low-frequency seismic source is used to image deep structure but has low resolution at shallow depths, which can be compensated by a high-frequency source. By combining the low- and high-frequency imaging, we obtain high-resolution profiles at both shallow and deep levels. With this in mind, we implemented a 2-D cruise using spark sources with 300 Hz and 2000 Hz central frequencies, fired randomly with a certain delay time. In processing, we separate the blended data by denoising methods, including median filtering and the curvelet transform, and then match the prestack data to obtain the final profiles. The median filter can restrain impulse noise and protect edges, while the curvelet transform has multi-scale characteristics and powerful sparse expression ability; the iterative noise elimination produces good results. Prestack matching filtering is important when integrating the wavelets of the two different spark sources because of their different characteristics, making the data consistent in reflection time, amplitude, frequency, and phase. Compared with profiles produced using a single source type, the blended-acquisition image shows higher resolution at shallow depths and yields more information at deep locations.
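    The role of the median filter in suppressing impulsive interference from the second, randomly fired source can be illustrated on a toy trace; the signal, spike positions, and window width are invented, and the curvelet and matching-filter steps are not shown.

```python
import numpy as np

def median_filter(x, width=5):
    """Sliding-window median; suppresses impulsive interference (such as
    energy from a second, randomly fired source) while preserving edges."""
    half = width // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + width]) for i in range(len(x))])

# Hypothetical trace: smooth reflection signal plus spiky blending noise
t = np.linspace(0, 1, 200)
clean = np.sin(2 * np.pi * 3 * t)
noisy = clean.copy()
noisy[[40, 90, 150]] += 5.0          # interfering-source spikes
denoised = median_filter(noisy, width=5)
```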

  19. Analysis of patient movement during 3D USCT data acquisition

    NASA Astrophysics Data System (ADS)

    Ruiter, N. V.; Hopp, T.; Zapf, M.; Kretzek, E.; Gemmeke, H.

    2016-04-01

    In our first clinical study with a full 3D Ultrasound Computer Tomography (USCT) system, patient data was acquired in eight minutes for one breast. In this paper the patient movement during the acquisition was analyzed quantitatively and, as far as possible, corrected in the resulting images. The movement was tracked in ten successive reflectivity reconstructions of full breast volumes acquired during 10 s intervals at different aperture positions, which were separated by 41 s intervals. The mean distance between initial and final position was 2.2 mm (standard deviation (STD) +/- 0.9 mm, max. 4.1 mm, min. 0.8 mm) and the average sum of all moved distances was 4.9 mm (STD +/- 1.9 mm, max. 8.8 mm, min. 2.7 mm). The tracked movement was corrected by summing successive images transformed according to the detected movement. The contrast of these images increased and additional image content became visible.
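    The correction step (transform each image by its detected movement, then sum) can be sketched for integer-pixel 2-D translations; the frames and shifts below are hypothetical, and the study's transformations act on 3-D volumes with sub-millimetre motion.

```python
import numpy as np

def shift_and_sum(frames, shifts):
    """Undo each frame's detected translation (integer pixels, estimated
    elsewhere) and sum the aligned frames to boost contrast."""
    acc = np.zeros_like(frames[0], dtype=float)
    for frame, (dy, dx) in zip(frames, shifts):
        acc += np.roll(frame, (-dy, -dx), axis=(0, 1))   # move content back
    return acc

# Hypothetical 2-D frames of a bright feature drifting between acquisitions
base = np.zeros((16, 16))
base[8, 8] = 1.0
shifts = [(0, 0), (1, 0), (2, 1)]                        # detected movement
frames = [np.roll(base, s, axis=(0, 1)) for s in shifts]
aligned = shift_and_sum(frames, shifts)                  # feature stacks up
unaligned = sum(frames)                                  # feature is smeared
```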

  20. How to crack nuts: acquisition process in captive chimpanzees (Pan troglodytes) observing a model.

    PubMed

    Hirata, Satoshi; Morimura, Naruki; Houki, Chiharu

    2009-10-01

    Stone tool use for nut cracking consists of placing a hard-shelled nut onto a stone anvil and then cracking the shell open by pounding it with a stone hammer to get to the kernel. We investigated the acquisition of tool use for nut cracking in a group of captive chimpanzees to clarify what kind of understanding of the tools and actions will lead to the acquisition of this type of tool use in the presence of a skilled model. A human experimenter trained a male chimpanzee until he mastered the use of a hammer and anvil stone to crack open macadamia nuts. He was then put in a nut-cracking situation together with his group mates, who were naïve to this tool use; we did not have a control group without a model. The results showed that the process of acquisition could be broken down into several steps, including recognition of applying pressure to the nut, emergence of the use of a combination of three objects, emergence of the hitting action, using a tool for hitting, and hitting the nut. The chimpanzees recognized these different components separately and practiced them one after another. They gradually united these factors in their behavior leading to their first success. Their behavior did not clearly improve immediately after observing successful nut cracking by a peer, but observation of a skilled group member seemed to have a gradual, long-term influence on the acquisition of nut cracking by naïve chimpanzees. PMID:19727866

  1. IECON '87: Signal acquisition and processing; Proceedings of the 1987 International Conference on Industrial Electronics, Control, and Instrumentation, Cambridge, MA, Nov. 3, 4, 1987

    NASA Astrophysics Data System (ADS)

    Niederjohn, Russell J.

    1987-01-01

    Theoretical and applications aspects of signal processing are examined in reviews and reports. Topics discussed include speech processing methods, algorithms, and architectures; signal-processing applications in motor and power control; digital signal processing; signal acquisition and analysis; and processing algorithms and applications. Consideration is given to digital coding of speech algorithms, an algorithm for continuous-time processes in discrete-time measurement, quantization noise and filtering schemes for digital control systems, distributed data acquisition for biomechanics research, a microcomputer-based differential distance and velocity measurement system, velocity observations from discrete position encoders, a real-time hardware image preprocessor, and recognition of partially occluded objects by a knowledge-based system.

  2. Object-oriented programming approach to CCD data acquisition and image processing

    NASA Astrophysics Data System (ADS)

    Naidu, B. Nagaraja; Srinivasan, R.; Shankar, S. Murali

    1997-10-01

    In the recent past, both CCD camera controller hardware and software have undergone dynamic change to keep pace with astronomers' imaging requirements. Conventional data acquisition software is based on menu-driven programs developed using structured high-level languages in a non-window environment. An application under Windows offers several advantages to the users over the non-window approach, such as multitasking, access to large memory, and inter-application communication. Windows also provides many programming facilities to developers, such as device-independent graphics, support for a wide range of input/output devices, menus, icons, and bitmaps. However, programming for the Windows environment under structured programming demands an in-depth knowledge of events, formats, handles, and inner workings. The object-oriented approach simplifies the task of programming for Windows by using object windows, which manage the message-processing behavior and insulate the developer from the details of the inner workings of Windows. As a result, a Windows application can be developed with much less time and effort compared to conventional approaches. We have designed and developed easy-to-use CCD data acquisition and processing software under the Microsoft Windows 3.1 operating environment using Object Pascal for Windows. The acquisition software exploits the advantages of objects to provide custom-specific tool boxes implementing different functions of CCD data acquisition and image processing. In this paper the hierarchy of the software structure and various application functions are presented. The flexibility of the software to handle different CCDs and also mosaic arrangements is illustrated.

  3. Chemical Sensing in Process Analysis.

    ERIC Educational Resources Information Center

    Hirschfeld, T.; And Others

    1984-01-01

    Discusses: (1) rationale for chemical sensors in process analysis; (2) existing types of process chemical sensors; (3) sensor limitations, considering lessons of chemometrics; (4) trends in process control sensors; and (5) future prospects. (JN)

  4. Conceptual design and analysis methodology for knowledge acquisition for expert systems

    SciTech Connect

    Adiga, S.

    1986-01-01

    The field of Artificial Intelligence, particularly expert systems, has been identified by experts as the technology with the most promise for handling the complex information processing needs of modern manufacturing systems. Knowledge acquisition, or the process of building the knowledge base for expert systems, needs precise and well-formulated methods to pass from being an art to a theory. This research is a step in that direction. The approach evolves at the conceptual level from Pask's work on conversation theory, which provides the minimal structural requirement for development and validation of the method. An integrated approach is developed with guidelines for structured knowledge elicitation, analysis, and mapping of the verbal data into well-defined object-oriented generic knowledge structures capable of representing both structural and operational knowledge. The research extends and blends the concepts of protocol analysis, object-oriented design, and semantic data modeling into an integrated framework. This methodology, being domain independent, can theoretically be used to acquire knowledge for any expert performance system.

  5. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    SciTech Connect

    van den Engh, Gerrit J.; Stokdijk, Willem

    1992-01-01

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate.

  6. Parallel pulse processing and data acquisition for high speed, low error flow cytometry

    DOEpatents

    Engh, G.J. van den; Stokdijk, W.

    1992-09-22

    A digitally synchronized parallel pulse processing and data acquisition system for a flow cytometer has multiple parallel input channels with independent pulse digitization and FIFO storage buffer. A trigger circuit controls the pulse digitization on all channels. After an event has been stored in each FIFO, a bus controller moves the oldest entry from each FIFO buffer onto a common data bus. The trigger circuit generates an ID number for each FIFO entry, which is checked by an error detection circuit. The system has high speed and low error rate. 17 figs.
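
    A software analogue of the scheme described in this patent abstract (per-channel FIFOs filled on a common trigger, an event ID stamped on every entry, and a bus controller that pops the oldest entry from each FIFO and checks that the IDs agree) can be sketched as follows; all class and method names are invented for illustration:

```python
from collections import deque

class ParallelAcquisition:
    """Toy model of synchronized multi-channel pulse acquisition."""

    def __init__(self, n_channels):
        self.fifos = [deque() for _ in range(n_channels)]
        self.next_id = 0

    def trigger(self, pulses):
        """On a trigger, digitize one pulse per channel, tagged with an event ID."""
        event_id = self.next_id
        self.next_id += 1
        for fifo, value in zip(self.fifos, pulses):
            fifo.append((event_id, value))

    def read_event(self):
        """Bus controller: pop the oldest entry from each FIFO; verify IDs match."""
        entries = [fifo.popleft() for fifo in self.fifos]
        ids = {event_id for event_id, _ in entries}
        if len(ids) != 1:
            # mismatched IDs mean a channel dropped or duplicated an event
            raise RuntimeError("channel desynchronization detected")
        return [value for _, value in entries]
```

    The ID check is what keeps the error rate low: a dropped digitization on one channel is caught immediately rather than silently misaligning all later events.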

  7. Analysis of phosphate acquisition efficiency in different Arabidopsis accessions.

    PubMed

    Narang, R A; Bruene, A; Altmann, T

    2000-12-01

    The morphological and physiological characteristics of Arabidopsis accessions differing in their phosphate acquisition efficiencies (PAEs) when grown on a sparingly soluble phosphate source (hydroxylapatite) were analyzed. A set of 36 accessions was subjected to an initial PAE evaluation following cultivation on synthetic, agarose-solidified media containing potassium phosphate (soluble) or hydroxylapatite (sparingly soluble). From the five most divergent accessions identified in this way, C24, Co, and Cal exhibited high PAEs, whereas Col-0 and Te exhibited low PAEs. These five accessions were analyzed in detail. Significant differences were found in root morphology, phosphate uptake kinetics, organic acid release, rhizosphere acidification, and the ability of roots to penetrate substrates. Long root hairs at high densities, high uptake per unit root length, and high substrate penetration ability in the efficient accessions C24 and Co mediate their high PAEs. The third accession with high PAE, Cal, exhibits a high shoot-to-root ratio, long roots with long root hairs, and rhizosphere acidification. These results are consistent with previous observations and highlight the suitability of using Arabidopsis accessions to identify and isolate genes determining the PAE in plants. PMID:11115894

  8. Space science technology: In-situ science. Sample Acquisition, Analysis, and Preservation Project summary

    NASA Technical Reports Server (NTRS)

    Aaron, Kim

    1991-01-01

    The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.

  9. Pyroshock recommendations in proposed MIL-HDBK on 'Dynamic Data Acquisition and Analysis'

    NASA Astrophysics Data System (ADS)

    Piersol, Allan G.

    The pyroshock recommendations presented in a military handbook on Guidelines for Dynamic Data Acquisition and Analysis, which is being prepared by the Jet Propulsion Laboratory, are summarized. Numerous comments, including suggestions for modifications and additions to the handbook, are discussed. Particular attention is given to recommendations concerning measurement locations, transducers, signal conditioners, data recorders, data sampling, data editing, and data analysis.

  10. Data acquisition, control, and analysis for the Argonne Advanced Accelerator Test Facility (AATF)

    SciTech Connect

    Schoessow, P.

    1989-01-01

    The AATF has been used to study wakefield acceleration and focusing in plasmas and rf structures. A PC-based system is described which incorporates the functions of beamline control and acquisition, storage, and preliminary analysis of video images from luminescent screen beam diagnostics. General features of the offline analysis of wakefield data are also discussed. 4 refs., 3 figs.

  11. Visual Skills and Chinese Reading Acquisition: A Meta-Analysis of Correlation Evidence

    ERIC Educational Resources Information Center

    Yang, Ling-Yan; Guo, Jian-Peng; Richman, Lynn C.; Schmidt, Frank L.; Gerken, Kathryn C.; Ding, Yi

    2013-01-01

    This paper used meta-analysis to synthesize the relation between visual skills and Chinese reading acquisition based on the empirical results from 34 studies published from 1991 to 2011. We obtained 234 correlation coefficients from 64 independent samples, with a total of 5,395 participants. The meta-analysis revealed that visual skills as a…

  12. Pyroshock recommendations in proposed MIL-HDBK on 'Dynamic Data Acquisition and Analysis'

    NASA Technical Reports Server (NTRS)

    Piersol, Allan G.

    1991-01-01

    The pyroshock recommendations presented in a military handbook on Guidelines for Dynamic Data Acquisition and Analysis, which is being prepared by the Jet Propulsion Laboratory, are summarized. Numerous comments, including suggestions for modifications and additions to the handbook, are discussed. Particular attention is given to recommendations concerning measurement locations, transducers, signal conditioners, data recorders, data sampling, data editing, and data analysis.

  13. An Integrated Data Acquisition / User Request/ Processing / Delivery System for Airborne Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Chapman, B.; Chu, A.; Tung, W.

    2003-12-01

    Airborne science data has historically played an important role in developing the scientific underpinnings for spaceborne missions. When the science community determines the need for new types of spaceborne measurements, airborne campaigns are often crucial in risk mitigation for these future missions. However, full exploitation of the acquired data may be difficult due to its experimental and transitory nature. Externally, the most problematic issue (particularly for those not involved in requesting the data acquisitions) may be the difficulty of searching for, requesting, and receiving the data, or even of knowing that the data exist. This can result in a rather small, insular community of users for these data sets. Internally, the difficulty for the project lies in maintaining a robust processing and archival system during periods of changing mission priorities and evolving technologies. The NASA/JPL Airborne Synthetic Aperture Radar (AIRSAR) has acquired data for a large and varied community of scientists and engineers for 15 years. AIRSAR is presently supporting current NASA Earth Science Enterprise experiments, such as the Soil Moisture EXperiment (SMEX) and the Cold Land Processes experiment (CLPX), as well as experiments conducted as many as 10 years ago. During that time, its processing, data ordering, and data delivery system has undergone evolutionary change as the cost and capability of resources have improved. AIRSAR now has a fully integrated data acquisition/user request/processing/delivery system through which most components of the data fulfillment process communicate via shared information within a database. The integration of these functions has reduced errors and increased throughput of processed data to customers.

  14. A high speed data acquisition and analysis system for transonic velocity, density, and total temperature fluctuations

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1988-01-01

    The high speed Dynamic Data Acquisition System (DDAS) is described, which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in the context of the wind tunnel environment. The DDAS replaces both a recording mechanism and a separate data processing system; the data acquisition and data reduction processes have been combined within DDAS. DDAS receives input from hot wires and anemometers, amplifies and filters the signals with computer-controlled modules, and converts the analog signals to digital with real-time simultaneous digitization followed by digital recording on disk or tape. Automatic acquisition (either from a computer link to an existing wind tunnel acquisition system, or from data acquisition facilities within DDAS) collects the necessary calibration and environment data. The generation of hot wire sensitivities is done in DDAS, as is the application of sensitivities to the hot wire data to generate turbulence quantities. The presentation of the raw and processed data, in terms of root mean square values of velocity, density, and temperature, and the processing of the spectral data are accomplished on demand in near-real-time with DDAS. A comprehensive description of the interface to the DDAS and of the internal mechanisms will be presented, and a summary of operations relevant to the use of the DDAS will be provided.
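
    The turbulence quantities mentioned above reduce, for one digitized channel, to the root mean square of the fluctuating (mean-removed) signal; a minimal sketch of that step, not the actual DDAS code:

```python
import numpy as np

def fluctuation_rms(samples):
    """RMS of the fluctuating component of a digitized signal.

    The mean (steady-state) value is removed first, so the result measures
    only the unsteady part, as is conventional for turbulence quantities.
    """
    samples = np.asarray(samples, dtype=float)
    fluctuation = samples - samples.mean()
    return np.sqrt(np.mean(fluctuation ** 2))
```

    In practice the same reduction is applied per channel after calibration sensitivities convert hot-wire voltages into velocity, density, and temperature fluctuations.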

  15. A CCD/CMOS process for integrated image acquisition and early vision signal processing

    NASA Astrophysics Data System (ADS)

    Keast, Craig L.; Sodini, Charles G.

    The development of a technology that integrates a four-phase, buried-channel CCD into an existing 1.75-micron CMOS process is described. The four-phase clock is employed in the integrated early vision system to minimize process complexity. Signal corruption is minimized and lateral fringing fields are enhanced by burying the channel. The CMOS process enhancements for the CCD are described, highlighting a new double-poly process and the buried channel, and the integration is outlined. The functionality and transfer efficiency of the process enhancement were appraised by measuring CCD shift registers at 100 kHz. CMOS measurement results are presented, including threshold voltages, poly-to-poly capacitor voltage and temperature coefficients, and dark current. A CCD/CMOS processor is described which combines smoothing and segmentation operations. The integrated CCD and CMOS processes are found to function correctly due to the enhancement-compatible design of the CMOS process and the consistent use of the CCD module's baseline process steps.

  16. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling subsystem for the Mars Science Laboratory is a highly mechanized, Rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the Rover. SA/SPaH utilizes 17 actuated degrees-of-freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so that some functionality is retained even with the loss of a degree-of-freedom. Intentional dynamic environments are created to move the sample while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally-created dynamic environment and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.

  17. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality of the System after being on the Surface for Two Years.

    NASA Astrophysics Data System (ADS)

    Beegle, L. W.; Anderson, R. C.; Abbey, W. J.

    2014-12-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system. SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. The Curiosity rover performed several acquisitions and processing operations on solid samples during its first year of operation. Materials were processed and delivered to the two analytical instruments, Chemistry and Mineralogy (CheMin) and Sample Analysis at Mars (SAM), both of which require a specific particle size in the delivered material to perform their analyses of its mineralogical and geochemical content. In this presentation, the functionality of the system will be explained along with the in-situ targets the system has acquired and the samples that were delivered.

  18. Containerless processing technology analysis

    NASA Technical Reports Server (NTRS)

    Rush, J. E.

    1982-01-01

    Research on acoustic levitation, air-jet levitation, and heat transfer from molten samples is reported. The goal was to obtain a better understanding of containerless processing systems and to improve their quality. These systems are applied to the processing of materials in situations in which contact with a container must be avoided, and they have potential application in both ground-based and orbiting laboratories. Containerless processing is reviewed, and the development of glasses from materials that normally crystallize upon cooling is studied.

  19. Medical Knowledge Base Acquisition: The Role of the Expert Review Process in Disease Profile Construction

    PubMed Central

    Giuse, Nunzia Bettinsoli; Bankowitz, Richard A.; Giuse, Dario A.; Parker, Ronnie C.; Miller, Randolph A.

    1989-01-01

    In order to better understand the knowledge acquisition process, we studied the changes which a newly developed “preliminary” QMR disease profile undergoes during the expert review process. Changes in the ten most recently created disease profiles from the INTERNIST-1/QMR knowledge base were analyzed. We classified the changes which occurred during knowledge base construction by the type of change and the reason for the change. Observed changes to proposed findings could be grouped according to whether a change was needed to maintain consistency with the existing knowledge base, or because of disagreement over knowledge content with the domain expert. Out of 987 total proposed findings in the ten profiles, 233 findings underwent 274 changes, approximately one change for each three proposed findings. A total of 43% of the changes were additions or deletions of findings or links compared to the preliminary disease profile, and 33% of the changes were alterations in the numerical value of the evoking strength or frequency. A total of 126 (46%) of changes were required to maintain consistency of the knowledge base, whereas the remaining 148 (54%) changes were altered based on suggestions made by the domain expert based on domain content. The type of change (consistency vs. domain knowledge) was found to correlate both with the class of finding (newly constructed vs. previously used) and with the experience of the profiler (novice vs. experienced). These differences suggest that some but not all aspects of the disease profiling process can be improved upon with experience. Since it is generally agreed that the construction of a knowledge base depends heavily upon the knowledge acquisition process, this study provides some insight into areas of investigation for others interested in the construction of automated tools to aid the process of knowledge base construction. It also provides support for the observation that knowledge base construction has at least some
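
    The tallies reported above can be cross-checked with a few lines of arithmetic (values copied from the abstract):

```python
# Change counts reported for the ten QMR disease profiles
total_changes = 274
consistency_changes = 126   # changes needed for knowledge-base consistency
content_changes = 148       # changes suggested by the domain expert

# The two categories partition the total
assert consistency_changes + content_changes == total_changes

pct_consistency = round(100 * consistency_changes / total_changes)
pct_content = round(100 * content_changes / total_changes)
print(pct_consistency, pct_content)  # matches the reported 46% and 54%
```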

  20. A Rational Analysis of the Acquisition of Multisensory Representations

    ERIC Educational Resources Information Center

    Yildirim, Ilker; Jacobs, Robert A.

    2012-01-01

    How do people learn multisensory, or amodal, representations, and what consequences do these representations have for perceptual performance? We address this question by performing a rational analysis of the problem of learning multisensory representations. This analysis makes use of a Bayesian nonparametric model that acquires latent multisensory…

  1. Space processing applications payload equipment study. Volume 2C: Data acquisition and process control

    NASA Technical Reports Server (NTRS)

    Kayton, M.; Smith, A. G.

    1974-01-01

    The services provided by the Spacelab Information Management System are discussed. The majority of the services are provided by the common-support subsystems in the Support Module furnished by the Spacelab manufacturer. The information processing requirements for the space processing applications (SPA) are identified. The requirements and capabilities for electric power, display and control panels, recording and telemetry, intercom, and closed circuit television are analyzed.

  2. A review of breast tomosynthesis. Part I. The image acquisition process

    PubMed Central

    Sechopoulos, Ioannis

    2013-01-01

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have become possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to eventually replace mammography for breast cancer screening. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper will review the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process. PMID:23298126

  3. A review of breast tomosynthesis. Part I. The image acquisition process

    SciTech Connect

    Sechopoulos, Ioannis

    2013-01-15

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have become possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to eventually replace mammography for breast cancer screening. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper will review the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process.

  4. Acquisition of material properties in production for sheet metal forming processes

    SciTech Connect

    Heingärtner, Jörg; Hora, Pavel; Neumann, Anja; Hortig, Dirk; Rencki, Yasar

    2013-12-16

    In past work a measurement system for the in-line acquisition of material properties was developed at IVP. This system is based on the non-destructive eddy-current principle. Using this system, a 100% control of material properties of the processed material is possible. The system can be used for ferromagnetic materials like standard steels as well as paramagnetic materials like Aluminum and stainless steel. Used as an in-line measurement system, it can be configured as a stand-alone system to control material properties and sort out inapplicable material or as part of a control system of the forming process. In both cases, the acquired data can be used as input data for numerical simulations, e.g. stochastic simulations based on real world data.

  5. Acquisition of material properties in production for sheet metal forming processes

    NASA Astrophysics Data System (ADS)

    Heingärtner, Jörg; Neumann, Anja; Hortig, Dirk; Rencki, Yasar; Hora, Pavel

    2013-12-01

    In past work a measurement system for the in-line acquisition of material properties was developed at IVP. This system is based on the non-destructive eddy-current principle. Using this system, a 100% control of material properties of the processed material is possible. The system can be used for ferromagnetic materials like standard steels as well as paramagnetic materials like Aluminum and stainless steel. Used as an in-line measurement system, it can be configured as a stand-alone system to control material properties and sort out inapplicable material or as part of a control system of the forming process. In both cases, the acquired data can be used as input data for numerical simulations, e.g. stochastic simulations based on real world data.

  6. Real-time multi-camera video acquisition and processing platform for ADAS

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio

    2016-04-01

    The paper presents the design of a real-time and low-cost embedded system for image acquisition and processing in Advanced Driver Assisted Systems (ADAS). The system adopts a multi-camera architecture to provide a panoramic view of the objects surrounding the vehicle. Fish-eye lenses are used to achieve a large Field of View (FOV). Since they introduce radial distortion of the images projected on the sensors, a real-time algorithm for their correction is also implemented in a pre-processor. An FPGA-based hardware implementation, re-using IP macrocells for several ADAS algorithms, allows for real-time processing of input streams from VGA automotive CMOS cameras.
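
    The radial distortion introduced by fish-eye lenses is commonly corrected with a purely radial remapping; the one-parameter polynomial model below is a generic illustration, not the specific algorithm implemented in the paper's pre-processor:

```python
import numpy as np

def undistort_points(points, center, k1):
    """Correct radial (barrel) distortion of 2D pixel coordinates.

    points: (N, 2) array of distorted pixel coordinates
    center: (cx, cy) distortion center
    k1:     first-order radial distortion coefficient (model assumption)

    Uses the simple one-parameter polynomial model
        r_undistorted = r * (1 + k1 * r**2)
    applied per point. Real-time hardware pipelines instead precompute the
    inverse mapping as a lookup table so each output pixel is filled at
    pixel-clock rate.
    """
    c = np.asarray(center, dtype=float)
    pts = np.asarray(points, dtype=float) - c
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts * (1.0 + k1 * r2) + c
```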

  7. Squeezing through the Now-or-Never bottleneck: Reconnecting language processing, acquisition, change, and structure.

    PubMed

    Chater, Nick; Christiansen, Morten H

    2016-01-01

    If human language must be squeezed through a narrow cognitive bottleneck, what are the implications for language processing, acquisition, change, and structure? In our target article, we suggested that the implications are far-reaching and form the basis of an integrated account of many apparently unconnected aspects of language and language processing, as well as suggesting revision of many existing theoretical accounts. With some exceptions, commentators were generally supportive both of the existence of the bottleneck and its potential implications. Many commentators suggested additional theoretical and linguistic nuances and extensions, links with prior work, and relevant computational and neuroscientific considerations; some argued for related but distinct viewpoints; a few, though, felt traditional perspectives were being abandoned too readily. Our response attempts to build on the many suggestions raised by the commentators and to engage constructively with challenges to our approach. PMID:27561252

  8. A review of breast tomosynthesis. Part I. The image acquisition process.

    PubMed

    Sechopoulos, Ioannis

    2013-01-01

    Mammography is a very well-established imaging modality for the early detection and diagnosis of breast cancer. However, since the introduction of digital imaging to the realm of radiology, more advanced, and especially tomographic, imaging methods have become possible. One of these methods, breast tomosynthesis, has finally been introduced to the clinic for routine everyday use, with the potential to eventually replace mammography for breast cancer screening. In this two-part paper, the extensive research performed during the development of breast tomosynthesis is reviewed, with a focus on the research addressing the medical physics aspects of this imaging modality. This first paper will review the research performed on the issues relevant to the image acquisition process, including system design, optimization of geometry and technique, x-ray scatter, and radiation dose. The companion to this paper will review all other aspects of breast tomosynthesis imaging, including the reconstruction process. PMID:23298126

  9. Accelerating Data Acquisition, Reduction, and Analysis at the Spallation Neutron Source

    SciTech Connect

    Campbell, Stuart I; Kohl, James Arthur; Granroth, Garrett E; Miller, Ross G; Doucet, Mathieu; Stansberry, Dale V; Proffen, Thomas E; Taylor, Russell J; Dillow, David

    2014-01-01

    ORNL operates the world's brightest neutron source, the Spallation Neutron Source (SNS). Funded by the US DOE Office of Basic Energy Science, this national user facility hosts hundreds of scientists from around the world, providing a platform for breakthrough research in materials science, sustainable energy, and basic science. While the SNS provides scientists with advanced experimental instruments, the deluge of data generated by these instruments represents both a big data challenge and a big data opportunity. For example, instruments at the SNS can now generate multiple millions of neutron events per second, providing unprecedented experiment fidelity but leaving the user with a dataset that cannot be processed and analyzed in a timely fashion using legacy techniques. To address this big data challenge, ORNL has developed a near real-time streaming data reduction and analysis infrastructure. The Accelerating Data Acquisition, Reduction, and Analysis (ADARA) system provides a live streaming data infrastructure based on a high-performance publish/subscribe system; in situ data reduction, visualization, and analysis tools; and integration with a high-performance computing and data storage infrastructure. ADARA allows users of the SNS instruments to analyze their experiment as it runs, make changes to the experiment in real time, and visualize the results of those changes. In this paper we describe ADARA, provide a high-level architectural overview of the system, and present a set of use cases and real-world demonstrations of the technology.
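
    The essence of such a streaming reduction (histogramming neutron events as they arrive from a publish/subscribe feed, so results are viewable while the run is still in progress) can be sketched as follows; the event format and names are invented for illustration:

```python
import numpy as np

def stream_reduce(event_batches, n_bins, tof_max):
    """Incrementally histogram neutron events by time-of-flight.

    event_batches: iterable of batches, each a list of (detector_id, tof)
        tuples, as would arrive from a publish/subscribe feed
    Yields the running histogram after each batch, so a client can
    visualize partial results while the experiment is still running.
    """
    edges = np.linspace(0.0, tof_max, n_bins + 1)
    hist = np.zeros(n_bins, dtype=np.int64)
    for batch in event_batches:
        tofs = np.array([tof for _, tof in batch])
        hist += np.histogram(tofs, bins=edges)[0]
        yield hist.copy()
```

    The key design point is that reduction is incremental: no step ever needs the complete event file, so the displayed result lags the beamline by only one batch.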

  10. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection processes... CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS Architect-Engineer Services 1436.602-5 Short selection processes... shall be obtained prior to the utilization of either of the short selection processes used for...

  11. Motofit - integrating neutron reflectometry acquisition, reduction and analysis into one, easy to use, package

    NASA Astrophysics Data System (ADS)

    Nelson, Andrew

    2010-11-01

    The efficient use of complex neutron scattering instruments is often hindered by the complex nature of their operating software. This complexity exists at each experimental step: data acquisition, reduction and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful at automated acquisition they often reduce accessibility by novice users and sometimes reduce the efficiency for advanced users. One solution to this is the development of a graphical user interface which allows the user to operate the instrument by a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for analysis of multiple contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of an instrument is integrated into a single, easy to use, program, leading to efficient instrument usage.

  12. Risk of organism acquisition from prior room occupants: a systematic review and meta-analysis.

    PubMed

    Mitchell, B G; Dancer, S J; Anderson, M; Dehn, E

    2015-11-01

    A systematic review and meta-analysis was conducted to determine the risk of pathogen acquisition for patients associated with prior room occupancy. The analysis was also broadened to examine any differences in acquisition risk between Gram-positive and Gram-negative organisms. A search using Medline/PubMed, Cochrane and CINAHL yielded 2577 citations between 1984 and 2014. Reviews were assessed in accordance with the international prospective register of systematic reviews (PROSPERO). Just seven articles met the inclusion criteria, namely: (a) papers were peer reviewed; (b) pathogen acquisition prevalence rates were reported; (c) articles were written in English; and (d) there was minimal or no risk of bias based on the Newcastle-Ottawa Scale (NOS). One study was an extension of a previous study and was discarded. Employing NOS revealed little difference between the studies, with five studies receiving eight-star and two studies receiving seven-star ratings. Overall, the pooled acquisition odds ratio for study pathogens (meticillin-resistant Staphylococcus aureus; vancomycin-resistant enterococcus; Clostridium difficile; acinetobacter; extended-spectrum β-lactamase-producing coliforms; pseudomonas) was 2.14 [95% confidence interval (CI): 1.65-2.77]. When comparing data between Gram-positive and Gram-negative organisms, the pooled acquisition odds ratio was 2.65 (95% CI: 2.02-3.47) for Gram-negatives and 1.89 (95% CI: 1.62-2.21) for Gram-positives. The findings have important implications for infection control professionals, environmental cleaning services and patients, since current practices fail to adequately reduce acquisition risk. Although there may be non-preventable sources of acquisition, revised practices require collaborative work between all responsible staff in order to reduce this risk to a minimum. PMID:26365827
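    The pooled estimates above can be sanity-checked with a standard meta-analysis identity: a Wald-type confidence interval is symmetric on the log-odds scale, so the point estimate is the geometric mean of the CI limits, and the standard error of the log odds ratio can be recovered from the CI width. Only the interval limits come from the abstract; the identities themselves are generic.

```python
import math

def or_from_ci(lower, upper):
    """For a Wald-type CI, log(OR) is the midpoint of the log-limits,
    so OR is the geometric mean of the limits."""
    return math.sqrt(lower * upper)

def se_log_or(lower, upper, z=1.96):
    """Recover the standard error of log(OR) from a 95% CI width."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

# Values reported in the abstract (all pathogens pooled):
print(round(or_from_ci(1.65, 2.77), 2))   # → 2.14, matching the reported OR
print(round(se_log_or(1.65, 2.77), 3))    # → 0.132
```

The same check reproduces the Gram-negative (2.65 from 2.02-3.47) and Gram-positive (1.89 from 1.62-2.21) point estimates, which confirms the intervals were computed on the log scale.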

  13. A Stylistic Approach to Foreign Language Acquisition and Literary Analysis.

    ERIC Educational Resources Information Center

    Berg, William J.; Martin-Berg, Laurey K.

    This paper discusses an approach to teaching third-year college "bridge" courses, showing that students in a course that focuses on language and culture as well as students in an introductory course on literary analysis can benefit from using a stylistic approach to literary texts to understand both form and content. The paper suggests that a…

  14. Time series analysis of knowledge of results effects during motor skill acquisition.

    PubMed

    Blackwell, J R; Simmons, R W; Spray, J A

    1991-03-01

    Time series analysis was used to investigate the hypothesis that during acquisition of a motor skill, knowledge of results (KR) information is used to generate a stable internal referent about which response errors are randomly distributed. Sixteen subjects completed 50 acquisition trials of each of three movements whose spatial-temporal characteristics differed. Acquisition trials were either blocked, with each movement being presented in series, or randomized, with the presentation of movements occurring in random order. Analysis of movement time data indicated that the contextual interference effect reported in previous studies was replicated in the present experiment. Time series analysis of the acquisition trial data revealed that the majority of individual subject response patterns during blocked trials were best described by a model with a temporarily stationary internal reference of the criterion and systematic trial-to-trial variation of response errors. During random trial conditions, response patterns were usually best described by a "white-noise" model. This model predicts a permanently stationary internal reference associated with randomly distributed response errors that are unaffected by KR information. These results are not consistent with previous work using time series analysis to describe motor behavior (Spray & Newell, 1986). PMID:2028084

  15. Digitizing data acquisition and time-of-flight pulse processing for ToF-ERDA

    NASA Astrophysics Data System (ADS)

    Julin, Jaakko; Sajavaara, Timo

    2016-01-01

    A versatile system to capture and analyze signals from microchannel plate (MCP) based time-of-flight detectors and ionization-based energy detectors, such as silicon diodes and gas ionization chambers (GIC), is introduced. The system is based on commercial digitizers and custom software. It forms part of a ToF-ERDA spectrometer, which has to be able to detect recoil atoms of many different species and energies. Compared to the analogue electronics currently in use, the digitizing system provides comparable time-of-flight resolution and improved hydrogen detection efficiency, while allowing the operation of the spectrometer to be studied and optimized after the measurement. The hardware, data acquisition software and digital pulse-processing algorithms suited to this application are described in detail.
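    The abstract does not spell out its pulse-processing algorithms, but a common digital timing method for MCP timing signals is a software constant-fraction discriminator: subtract an attenuated copy of the pulse from a delayed copy and interpolate the zero crossing for sub-sample timing. The sketch below is a generic illustration; the fraction, delay, and triangular test pulse are all invented.

```python
def cfd_crossing(samples, fraction=0.5, delay=3):
    """Digital constant-fraction discriminator: form the CFD signal
    s[i - delay] - fraction * s[i] and return the interpolated sample
    index where it crosses zero from below."""
    n = len(samples)
    cfd = [samples[i - delay] - fraction * samples[i] if i >= delay else 0.0
           for i in range(n)]
    for i in range(delay + 1, n):
        if cfd[i - 1] < 0.0 <= cfd[i]:
            # Linear interpolation between samples gives sub-sample timing,
            # largely independent of pulse amplitude.
            return (i - 1) + (-cfd[i - 1]) / (cfd[i] - cfd[i - 1])
    return None

# A synthetic triangular pulse as the digitizer might sample it:
pulse = [0, 0, 0, 1, 2, 3, 4, 5, 4, 3, 2, 1, 0]
print(round(cfd_crossing(pulse), 2))   # → 7.33
```

Because the crossing point of the CFD signal does not move when the pulse is scaled, this timing estimate is insensitive to pulse height, which is the property that matters for time-of-flight resolution.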

  16. MSL's Widgets: Adding Robustness to Martian Sample Acquisition, Handling, and Processing

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Kennedy, Brett; Lin, Justin; DeGrosse, Patrick; Cady, Ian; Onufer, Nicholas; Sigel, Deborah; Jandura, Louise; Anderson, Robert; Katz, Ira; Slimko, Eric; Limonadi, Daniel

    2013-01-01

    Mars Science Laboratory's (MSL) Sample Acquisition, Sample Processing and Handling (SA-SPaH) system is one of the most ambitious terrain-interaction and manipulation systems ever built and successfully used beyond Earth. Mars has a ruthless environment that has surprised many who have tried to explore it. The robustness widget program was implemented by the MSL project to help ensure the SA-SPaH system would be robust to the surprises of this Martian environment. The program was carried out under extreme schedule pressure and responsibility, but was accomplished with resounding success. This paper is a behind-the-scenes look at MSL's robustness widgets: the particle fun zone, the wind guards, and the portioner pokers.

  17. SAPS—An automated and networked seismological acquisition and processing system

    NASA Astrophysics Data System (ADS)

    Oncescu, Mihnea Corneliu; Rizescu, Mihaela; Bonjer, Klaus-Peter

    1996-02-01

    A PC-based digital data acquisition and processing system was developed and implemented on two PCs linked by a peer-to-peer LAN. Sixteen channels are sampled at a rate of 200 Hz. Acquisition is performed continuously into sequenced files on one PC using the IASPEI-released XRTP software. The length of the elementary files is adjustable; we used 90 s in this application. The second PC runs a program that automatically organizes the following processing steps: (i) moving the raw data from the first to the second PC; (ii) filtering the data and running a 'Rex Allen'-like P-wave picker on each elementary file; (iii) concatenating three consecutive elementary files if the detection criteria are fulfilled; (iv) decoding a fast time code (Lennartz-style); (v) discriminating between local and teleseismic events; (vi) plane-wave location and mb determination for teleseisms; (vii) picking S waves, determining coda duration and locating local events; (viii) converting PC-SUDS into GSE format and 'feeding' a Data Request Manager with phases, locations and waveforms; (ix) sending phases and locations via e-mail minutes after detection, and a 'health status' every hour, to the system manager; (x) plotting the raw data and picks and printing the location results; and (xi) archiving data and results locally and on a remote workstation. The system has been running since April 1994 with data from the telemetered network of the Upper Rhinegraben. Being modular, the system can be extended and upgraded easily. Loss of data is avoided by using large hard disks as temporary data buffers and by file mirroring on different hard disk drives.
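    A 'Rex Allen'-style picker starts by detecting a transient against background noise on a characteristic function of the trace. A minimal STA/LTA-style trigger (short-term over long-term average of the squared signal) captures that first step; the window lengths, threshold, and synthetic trace below are invented for illustration and are far shorter than realistic values.

```python
def sta_lta_trigger(trace, n_sta=3, n_lta=10, threshold=4.0):
    """Return the first sample index at which the short-term average of
    the squared signal exceeds `threshold` times the long-term average
    of the window immediately preceding it, or None if never triggered."""
    for i in range(n_lta + n_sta, len(trace) + 1):
        sta = sum(x * x for x in trace[i - n_sta:i]) / n_sta
        lta = sum(x * x for x in trace[i - n_lta - n_sta:i - n_sta]) / n_lta
        if lta > 0 and sta / lta > threshold:
            return i - n_sta   # trigger fires at the start of the short window
    return None

# Quiet background followed by a high-amplitude onset at sample 20.
# The trigger fires as soon as the short window overlaps the onset.
trace = [0.1, -0.1] * 10 + [2.0, -2.5, 3.0, -2.0, 2.5, -3.0]
print(sta_lta_trigger(trace))   # → 18
```

In a full picker this trigger index would only seed a refinement stage (onset repicking, polarity, quality weighting); it is the detection criterion, not the final pick.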

  18. Web-based data acquisition and management system for GOSAT validation Lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra N.; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2012-11-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). The DAS, written in Perl, acquires AMeDAS ground-level meteorological data, rawinsonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data and GOSAT validation lidar data. The DMS, written in PHP, displays satellite-pass dates and all acquired data.

  19. Acquisition and analysis of primate physiologic data for the Space Shuttle

    NASA Astrophysics Data System (ADS)

    Eberhart, Russell C.; Hogrefe, Arthur F.; Radford, Wade E.; Sanders, Kermit H.; Dobbins, Roy W.

    1988-03-01

    This paper describes the design and prototypes of the Physiologic Acquisition and Telemetry System (PATS), a multichannel system designed for the acquisition, telemetry, storage, and analysis of physiological data from large primates. PATS is expected to acquire data from units implanted in the abdominal cavities of rhesus monkeys that will be flown aboard the Spacelab. The system will telemeter both stored and real-time internal physiologic measurements to an external Flight Support Station (FSS) computer system. The implanted Data Acquisition and Telemetry Subsystem subunit will be externally activated, controlled, and reprogrammed from the FSS.

  20. Spectral analysis for automated exploration and sample acquisition

    NASA Astrophysics Data System (ADS)

    Eberlein, Susan; Yates, Gigi

    1992-05-01

    Future space exploration missions will rely heavily on the use of complex instrument data for determining the geologic, chemical, and elemental character of planetary surfaces. One important instrument is the imaging spectrometer, which collects complete images in multiple discrete wavelengths in the visible and infrared regions of the spectrum. Extensive computational effort is required to extract information from such high-dimensional data. A hierarchical classification scheme allows multispectral data to be analyzed for purposes of mineral classification while limiting the overall computational requirements. The hierarchical classifier exploits the tunability of a new type of imaging spectrometer which is based on an acousto-optic tunable filter. This spectrometer collects a complete image in each wavelength passband without spatial scanning. It may be programmed to scan through a range of wavelengths or to collect only specific bands for data analysis. Spectral classification activities employ artificial neural networks, trained to recognize a number of mineral classes. Analysis of the trained networks has proven useful in determining which subsets of spectral bands should be employed at each step of the hierarchical classifier. The network classifiers are capable of recognizing all mineral types which were included in the training set. In addition, the major components of many mineral mixtures can also be recognized. This capability may prove useful for a system designed to evaluate data in a strange environment where details of the mineral composition are not known in advance.
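    The hierarchical scheme described above, a coarse classification first and then a finer decision using only the band subset relevant to that branch, can be sketched with a toy rule-based stand-in. The paper's classifiers were trained neural networks; every band index, ratio, threshold, and class name below is invented purely to show the control flow that limits computation per pixel.

```python
def classify_mineral(spectrum):
    """Toy two-stage hierarchical classifier over a 4-band spectrum.
    Stage 1 uses one coarse band ratio to pick a branch, so stage 2
    only ever needs the two bands relevant to that branch."""
    b1, b2, b3, b4 = spectrum
    if b3 / b1 > 1.5:                                    # "infrared-bright" branch
        return "carbonate" if b4 > b3 else "silicate"    # uses bands 3, 4 only
    return "oxide" if b2 / b1 > 1.1 else "sulfide"       # uses bands 1, 2 only

print(classify_mineral([0.2, 0.25, 0.4, 0.5]))   # → carbonate
```

The payoff of the hierarchy is that only the bands needed by the active branch must be collected, which is exactly what a programmable acousto-optic tunable filter permits: the instrument can be told to acquire just the passbands the next classification stage requires.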

  1. Spectral analysis for automated exploration and sample acquisition

    NASA Technical Reports Server (NTRS)

    Eberlein, Susan; Yates, Gigi

    1992-01-01

    Future space exploration missions will rely heavily on the use of complex instrument data for determining the geologic, chemical, and elemental character of planetary surfaces. One important instrument is the imaging spectrometer, which collects complete images in multiple discrete wavelengths in the visible and infrared regions of the spectrum. Extensive computational effort is required to extract information from such high-dimensional data. A hierarchical classification scheme allows multispectral data to be analyzed for purposes of mineral classification while limiting the overall computational requirements. The hierarchical classifier exploits the tunability of a new type of imaging spectrometer which is based on an acousto-optic tunable filter. This spectrometer collects a complete image in each wavelength passband without spatial scanning. It may be programmed to scan through a range of wavelengths or to collect only specific bands for data analysis. Spectral classification activities employ artificial neural networks, trained to recognize a number of mineral classes. Analysis of the trained networks has proven useful in determining which subsets of spectral bands should be employed at each step of the hierarchical classifier. The network classifiers are capable of recognizing all mineral types which were included in the training set. In addition, the major components of many mineral mixtures can also be recognized. This capability may prove useful for a system designed to evaluate data in a strange environment where details of the mineral composition are not known in advance.

  2. New acquisition techniques and statistical analysis of bubble size distributions

    NASA Astrophysics Data System (ADS)

    Proussevitch, A.; Sahagian, D.

    2005-12-01

    Various approaches have been taken to solve the long-standing problem of determining the size distributions of objects embedded in an opaque medium. In the case of vesicles in volcanic rocks, the most reliable technique is 3-D imagery by computed X-ray tomography. However, this method is expensive, requires intensive computational resources, and is thus not always available to an investigator. As a cheaper alternative, 2-D cross-sectional data are commonly available, but require stereological analysis for 3-D conversion. Stereology for spherical bubbles is quite robust, but elongated non-spherical bubbles require complicated conversion approaches and large observed populations. We have revised the computational schemes for applying non-spherical stereology to practical analysis of bubble size distributions. The basic idea of this new approach is to exclude from the conversion those classes (bins) of non-spherical bubbles whose cross-section probability distribution exceeds a maximum value that depends on the mean aspect ratio. Thus, in contrast to traditional stereological techniques, larger bubbles are "predicted" from the rest of the population. As a proof of principle, we have compared distributions obtained this way with direct 3-D imagery (X-ray tomography) for non-spherical bubbles from the same samples of vesicular basalts collected from the Colorado Plateau. The comparison demonstrates that in cases where X-ray tomography is impractical, stereology can be used with reasonable reliability, even for non-spherical vesicles.
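    The core stereological difficulty for the spherical case (Wicksell's problem) is easy to demonstrate by Monte Carlo: a random planar cut through a sphere almost always yields a circle smaller than the sphere's true radius, so observed 2-D size distributions are biased low and must be "unfolded". This sketch only illustrates that bias; the non-spherical scheme in the abstract goes well beyond it.

```python
import math
import random

def sample_section_radii(sphere_radius, n, rng):
    """Cut spheres of fixed radius R with random parallel planes: a plane
    at distance d from the center (d uniform on [0, R] for planes that hit
    the sphere) yields a circular section of radius sqrt(R^2 - d^2)."""
    return [math.sqrt(sphere_radius**2 - rng.uniform(0.0, sphere_radius)**2)
            for _ in range(n)]

rng = random.Random(42)
sections = sample_section_radii(1.0, 10000, rng)
mean_section = sum(sections) / len(sections)
# Analytically E[sqrt(1 - d^2)] = pi/4 ≈ 0.785 for d uniform on [0, 1],
# so the observed sections underestimate the true radius of 1.0:
print(round(mean_section, 1))   # ≈ 0.8
```

Real unfolding works in the opposite direction, recovering the 3-D radius distribution from the observed 2-D one; the abstract's contribution is making that inversion stable for elongated, non-spherical bubbles.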

  3. Touch sensing analysis using multi-modal acquisition system

    NASA Astrophysics Data System (ADS)

    King, Jeffrey S.; Pikula, Dragan; Baharav, Zachi

    2013-03-01

    Touch sensing is ubiquitous in consumer electronic products. Users expect to be able to touch the surface of a display with a finger and interact with it. Yet the actual mechanics and physics of the touch process are little known, as they depend on many independent variables, ranging from the structure of the fingertip, composed of ridges, valleys, and pores, through a few layers of skin and flesh, to the bone itself. Moreover, as we will see, sweat glands and wetting are critical as well. As for the mechanics, the pressure with which one touches the screen, and the manner in which the surface responds to this pressure, have a major impact on touch sensing. In addition, different touch sensing methods, such as capacitive or optical, have different dependencies: for example, the color of the finger might affect the latter, whereas the former is insensitive to it. In this paper we describe a system that captures multiple modalities of the touch event and synchronizes them in post-processing. This enables us to look for correlations between various effects and uncover their influence on the performance of touch sensing algorithms. Moreover, investigating these relations allows us to improve various sensing algorithms, as well as find areas where they complement each other. We conclude by pointing to possible future extensions and applications of this system.

  4. The collection and analysis of transient test data using the mobile instrumentation data acquisition system (MIDAS)

    SciTech Connect

    Uncapher, W.L.; Arviso, M.

    1995-12-31

    Packages designed to transport radioactive materials are required to survive exposure to environments defined in the Code of Federal Regulations. Cask designers can investigate package designs through structural and thermal testing of full-scale packages, components, or representative models. The acquisition of reliable response data from instrumentation measurement devices is an essential part of this testing activity. Sandia National Laboratories, under the sponsorship of the US Department of Energy (DOE), has developed the Mobile Instrumentation Data Acquisition System (MIDAS), dedicated to the collection and processing of structural and thermal data from regulatory tests.

  5. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    SciTech Connect

    Castillo, S; Castillo, R; Castillo, E; Pan, T; Ibbott, G; Balter, P; Hobbs, B; Dai, J; Guerrero, T

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over the available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly, by 27% and 28% for the first and second cohorts, respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross-scan average at the 0.05 significance level. Artifact presence for both clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p<0.032 clinical; ρ=0.296, p<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase
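    The abstract does not give the graph construction used with Dijkstra's algorithm, but the general idea of optimized sorting can be sketched generically: one node per candidate image per slice location, with edge costs scoring the mismatch between neighboring selections, so the cheapest path picks a consistent image at every location. The "images" (scalar phase values) and the mismatch metric below are toy stand-ins, not the paper's actual metric.

```python
import heapq
import itertools

def best_image_path(candidates, mismatch):
    """Choose one candidate image per slice location so that the total
    mismatch between consecutive choices is minimal. Dijkstra on a
    layered graph: candidates[k] holds the choices at location k."""
    tiebreak = itertools.count()          # keeps heap comparisons well-defined
    heap = [(0.0, next(tiebreak), -1, None, [])]
    done = set()
    n = len(candidates)
    while heap:
        cost, _, layer, idx, picks = heapq.heappop(heap)
        if (layer, idx) in done:
            continue
        done.add((layer, idx))
        if layer == n - 1:                # first finalized full path is optimal
            return picks, cost
        nxt = layer + 1
        for j, img in enumerate(candidates[nxt]):
            step = 0.0 if layer < 0 else mismatch(candidates[layer][idx], img)
            heapq.heappush(heap, (cost + step, next(tiebreak), nxt, j, picks + [j]))
    return None

# Toy example: each "image" is just a respiratory-phase value, and the
# artifact metric is the phase jump between neighboring slice locations.
candidates = [[0.1, 0.5], [0.4, 0.6], [0.45, 0.9]]
picks, cost = best_image_path(candidates, lambda a, b: abs(a - b))
print(picks, round(cost, 2))   # → [1, 0, 0] 0.15
```

Compared with fixed phase sorting, which must take whichever image falls in a phase bin, a shortest-path selection can trade a locally worse choice for a globally smoother stack, which is how duplicated data reduces artifacts.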

  6. Automated acquisition and analysis of airway surface liquid height by confocal microscopy

    PubMed Central

    Choi, Hyun-Chul; Kim, Christine Seul Ki

    2015-01-01

    The airway surface liquid (ASL) is a thin-liquid layer that lines the luminal side of airway epithelia. ASL contains many molecules that are involved in primary innate defense in the lung. Measurement of ASL height on primary airway cultures by confocal microscopy is a powerful tool that has enabled researchers to study ASL physiology and pharmacology. Previously, ASL image acquisition and analysis were performed manually. However, this process is time and labor intensive. To increase the throughput, we have developed an automatic ASL measurement technique that combines a fully automated confocal microscope with novel automatic image analysis software that was written with image processing techniques derived from the computer science field. We were able to acquire XZ ASL images at the rate of ∼1 image/s in a reproducible fashion. Our automatic analysis software was able to analyze images at the rate of ∼32 ms/image. As proofs of concept, we generated a time course for ASL absorption and a dose response in the presence of SPLUNC1, a known epithelial sodium channel inhibitor, on human bronchial epithelial cultures. Using this approach, we determined the IC50 for SPLUNC1 to be 6.53 μM. Furthermore, our technique successfully detected a difference in ASL height between normal and cystic fibrosis (CF) human bronchial epithelial cultures and detected changes in ATP-stimulated Cl−/ASL secretion. We conclude that our automatic ASL measurement technique can be applied for repeated ASL height measurements with high accuracy and consistency and increased throughput. PMID:26001773
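    A dose-response IC50 determination of the kind described can be sketched with a simple inhibitory Hill model and log-linear interpolation at the half-maximal response. Only the 6.53 μM value comes from the abstract; the model, the dilution series, and the interpolation method are illustrative (the authors' fitting procedure is not specified here).

```python
import math

def hill_response(conc, ic50, hill=1.0):
    """Fractional response for an inhibitor (1 = no effect, 0 = full block)."""
    return 1.0 / (1.0 + (conc / ic50) ** hill)

def estimate_ic50(concs, responses):
    """Interpolate the concentration giving 50% response from a measured
    dose-response series (log-linear between the bracketing points)."""
    for (c1, r1), (c2, r2) in zip(zip(concs, responses),
                                  zip(concs[1:], responses[1:])):
        if r1 >= 0.5 >= r2:
            t = (r1 - 0.5) / (r1 - r2)
            return 10 ** (math.log10(c1) + t * (math.log10(c2) - math.log10(c1)))
    return None   # 50% response not bracketed by the series

# Synthetic dose-response with a "true" IC50 of 6.53 uM, the value the
# abstract reports for SPLUNC1:
concs = [0.1, 1.0, 3.0, 10.0, 30.0, 100.0]
responses = [hill_response(c, 6.53) for c in concs]
print(round(estimate_ic50(concs, responses), 1))   # ≈ 6.5
```

The small discrepancy from 6.53 is the interpolation error of the log-linear step; a full nonlinear fit of the Hill equation would recover the exact value on noiseless data.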

  7. Dynamic analysis of process reactors

    SciTech Connect

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  8. How does the interaction between spelling and motor processes build up during writing acquisition?

    PubMed

    Kandel, Sonia; Perret, Cyril

    2015-03-01

    How do we recall a word's spelling? How do we produce the movements to form the letters of a word? Writing involves several processing levels. Surprisingly, researchers have focused either on spelling or on motor production. However, these processes interact and cannot be studied separately. Spelling processes cascade into movement production. For example, in French, producing the letters PAR in the orthographically irregular word PARFUM (perfume) delays motor production with respect to the same letters in the regular word PARDON (pardon). Orthographic regularity refers to the possibility of spelling a word correctly by applying the most frequent sound-letter conversion rules. The present study examined how the interaction between spelling and motor processing builds up during writing acquisition. French 8-10 year old children participated in the experiment. This is the age at which handwriting skills start to become automatic. The children wrote regular and irregular words that could be frequent or infrequent. They wrote on a digitizer so we could collect data on latency, movement duration and fluency. The results revealed that the interaction between spelling and motor processing was already present at age 8. It became more adult-like at ages 9 and 10. Before starting to write, processing irregular words took longer than processing regular words. This processing load spread into movement production: it increased writing duration and rendered the movements more dysfluent. Word frequency affected latencies and cascaded into production, modulating writing duration but not movement fluency. Writing infrequent words took longer than writing frequent words. The data suggest that orthographic regularity has a stronger impact on writing than word frequency; the two do not cascade to the same extent. PMID:25525970

  9. 77 FR 9617 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ... WorkFlow to process vouchers. DATES: Comments on the proposed rule published January 19, 2012, at 77 FR... clarifying the proposed rule published on January 19, 2012 (77 FR 2682), which proposes to revise... Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054) AGENCY: Defense Acquisition...

  10. On the acquisition and analysis of microscale thermophoresis data.

    PubMed

    Scheuermann, Thomas H; Padrick, Shae B; Gardner, Kevin H; Brautigam, Chad A

    2016-03-01

    A comprehensive understanding of the molecular mechanisms underpinning cellular functions is dependent on a detailed characterization of the energetics of macromolecular binding, often quantified by the equilibrium dissociation constant, KD. While many biophysical methods may be used to obtain KD, the focus of this report is a relatively new method called microscale thermophoresis (MST). In an MST experiment, a capillary tube filled with a solution containing a dye-labeled solute is illuminated with an infrared laser, rapidly creating a temperature gradient. Molecules will migrate along this gradient, causing changes in the observed fluorescence. Because the net migration of the labeled molecules will depend on their liganded state, a binding curve as a function of ligand concentration can be constructed from MST data and analyzed to determine KD. Herein, simulations demonstrate the limits of KD that can be measured in current instrumentation. They also show that binding kinetics is a major concern in planning and executing MST experiments. Additionally, studies of two protein-protein interactions illustrate challenges encountered in acquiring and analyzing MST data. Combined, these approaches indicate a set of best practices for performing and analyzing MST experiments. Software for rigorous data analysis is also introduced. PMID:26739938
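    The binding curve underlying an MST experiment follows a simple 1:1 isotherm when the unlabeled ligand is in large excess over the labeled target: the measured signal is a population-weighted average of the free and bound thermophoresis amplitudes. The sketch below illustrates that mapping; the amplitude values and dilution series are invented, and real analysis must also account for the kinetic and depletion effects the abstract highlights.

```python
def fraction_bound(ligand, kd):
    """Fraction of labeled molecules in the bound state for a 1:1
    interaction with ligand in large excess (simple binding isotherm)."""
    return ligand / (kd + ligand)

def mst_signal(ligand, kd, f_free=800.0, f_bound=700.0):
    """Modeled MST readout: a population-weighted average of the free
    and bound amplitudes (the two amplitude values are illustrative)."""
    f = fraction_bound(ligand, kd)
    return (1.0 - f) * f_free + f * f_bound

# A 16-point twofold serial dilution, as commonly loaded into capillaries:
kd = 2.0   # arbitrary concentration units
ligand_series = [100.0 / (2 ** i) for i in range(16)]
curve = [mst_signal(c, kd) for c in ligand_series]
print(mst_signal(kd, kd))   # at [L] = KD the signal is midway: 750.0
```

Fitting this model to the measured curve (signal versus ligand concentration) yields KD; the simulations in the paper probe where along such a dilution series that fit remains well-determined.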

  11. Acquisition and analysis of angle-beam wavefield data

    SciTech Connect

    Dawson, Alexander J.; Michaels, Jennifer E.; Levine, Ross M.; Chen, Xin; Michaels, Thomas E.

    2014-02-18

    Angle-beam ultrasonic testing is a common practical technique used for nondestructive evaluation to detect, locate, and characterize a variety of material defects and damage. Greater understanding of both the incident wavefield produced by an angle-beam transducer and the subsequent scattering from a variety of defects and geometrical features is anticipated to increase the reliability of data interpretation. The focus of this paper is on acquiring and analyzing propagating waves from angle-beam transducers in simple, defect-free plates as a first step in the development of methods for flaw characterization. Unlike guided waves, which excite the plate throughout its thickness, angle-beam bulk waves bounce back and forth between the plate surfaces, resulting in the well-known multiple "skips" or "V-paths." The experimental setup consists of a laser vibrometer mounted on an XYZ scanning stage, which is programmed to move point-to-point on a rectilinear grid to acquire waveform data. Although laser vibrometry is now routinely used to record guided waves for which the frequency content is below 1 MHz, it is more challenging to acquire higher-frequency bulk waves in the 1-10 MHz range. Signals were recorded on the surface of an aluminum plate after generation by a 5 MHz, 65° refracted angle, shear wave transducer-wedge combination. Data are analyzed directly in the x-t domain, via a slant-stack Radon transform in the τ-p (offset time-slowness) domain, and via a 2-D Fourier transform in the ω-k domain, thereby enabling identification of specific arrivals and modes. Results compare well to those expected from a simple ray tracing analysis except for the unexpected presence of a strong Rayleigh wave.
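    The ω-k analysis mentioned above separates arrivals by frequency and wavenumber: a single propagating mode appears as a localized peak in the 2-D Fourier transform of the space-time data. A naive 2-D DFT on a tiny synthetic plane wave shows the idea (in practice one would use an FFT library on much larger grids; the grid size and wave parameters here are illustrative).

```python
import cmath
import math

def dft2(field):
    """Naive 2-D discrete Fourier transform of a real-valued space-time
    field field[x][t] (O(N^4); small demonstration grids only)."""
    nx, nt = len(field), len(field[0])
    out = [[0j] * nt for _ in range(nx)]
    for k in range(nx):
        for w in range(nt):
            s = 0j
            for x in range(nx):
                for t in range(nt):
                    s += field[x][t] * cmath.exp(
                        -2j * cmath.pi * (k * x / nx + w * t / nt))
            out[k][w] = s
    return out

nx = nt = 8
k0, f0 = 2, 3   # wavenumber and frequency indices of a synthetic plane wave
field = [[math.cos(2 * math.pi * (k0 * x / nx - f0 * t / nt)) for t in range(nt)]
         for x in range(nx)]
spec = dft2(field)
peak = max((abs(spec[k][w]), k, w) for k in range(nx) for w in range(nt))
# The single mode concentrates into the bin (k0, (nt - f0) % nt) and its
# complex-conjugate mirror; everything else is numerically zero:
print(peak[1], peak[2])
```

In the wavefield data of the paper, each arrival (skip, mode conversion, Rayleigh wave) produces its own ridge in ω-k, which is what makes the transform useful for identifying them.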

  12. Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.

  13. How human resource organization can enhance space information acquisition and processing: the experience of the VENESAT-1 ground segment

    NASA Astrophysics Data System (ADS)

    Acevedo, Romina; Orihuela, Nuris; Blanco, Rafael; Varela, Francisco; Camacho, Enrique; Urbina, Marianela; Aponte, Luis Gabriel; Vallenilla, Leopoldo; Acuña, Liana; Becerra, Roberto; Tabare, Terepaima; Recaredo, Erica

    2009-12-01

    Built in cooperation with the P.R. of China, the Bolivarian Republic of Venezuela launched its first telecommunication satellite, VENESAT-1 (Simón Bolívar Satellite), on October 29, 2008. It operates in the C band (covering Central America, the Caribbean region and most of South America), the Ku band (Bolivia, Cuba, Dominican Republic, Haiti, Paraguay, Uruguay, Venezuela) and the Ka band (Venezuela). The launch of VENESAT-1 represents the starting point for Venezuela as an active player in the field of space science and technology. In order to fulfill mission requirements and to guarantee the satellite's health, local professionals must provide continuous monitoring, orbit calculation, maneuver preparation and execution, data preparation and processing, and data base management at the VENESAT-1 Ground Segment, which includes both a primary and a backup site. In summary, data processing and real-time data management are part of the daily activities performed by the personnel at the ground segment. Using published and unpublished information, this paper presents how human resource organization can enhance space information acquisition and processing, by analyzing the proposed organizational structure for the VENESAT-1 Ground Segment. We have found that the proposed units within the organizational structure reflect three key issues for mission management: satellite operations, ground operations, and site maintenance. The proposed organization is simple (3 hierarchical levels and 7 units), and communication channels seem efficient in terms of facilitating information acquisition, processing, storage, flow and exchange. Furthermore, the proposal includes a manual containing a full description of personnel responsibilities and profiles, which efficiently allocates the management and operation of key software for satellite operation such as the Real-time Data Transaction Software (RDTS), Data Management Software (DMS), and Carrier Spectrum Monitoring Software (CSM

  14. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    SciTech Connect

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but cannot be repaired in case of malfunction (its modules are no longer manufactured). The new system (in use since 2010), based on a flash ADC, is more accurate, but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  15. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    DOE PAGESBeta

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but cannot be repaired in case of malfunction (its modules are no longer manufactured). The new system (in use since 2010), based on a flash ADC, is more accurate, but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given, and the results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  16. Reverse engineering of the multiple launch rocket system. Human factors, manpower, personnel, and training in the weapons system acquisition process

    NASA Astrophysics Data System (ADS)

    Arabian, J. M.; Hartel, C. R.; Kaplan, J. D.; Marcus, A.; Promisel, D. M.

    1984-06-01

    In a briefing format, this report on the Multiple Launch Rocket System summarizes an examination of human factors, manpower, personnel and training (HMPT) issues during the system acquisition process. The report is one of four reverse engineering studies prepared at the request of Gen. M. R. Thurman, Army Vice Chief of Staff. The four systems were studied as a representative sample of Army weapons systems and serve as the basis for drawing conclusions about the aspects of the weapons system acquisition process that most affect HMPT considerations. A synthesis of the four system studies appears in the final report of the Reverse Engineering Task Force, U.S. Army Research Institute.

  17. Standardization of infrared breast thermogram acquisition protocols and abnormality analysis of breast thermograms

    NASA Astrophysics Data System (ADS)

    Bhowmik, Mrinal Kanti; Gogoi, Usha Rani; Das, Kakali; Ghosh, Anjan Kumar; Bhattacharjee, Debotosh; Majumdar, Gautam

    2016-05-01

    The non-invasive, painless, radiation-free and cost-effective infrared breast thermography (IBT) can make a significant contribution to improving the survival rate of breast cancer patients by detecting the disease early. This paper presents a set of standard breast thermogram acquisition protocols to improve the potential and accuracy of infrared breast thermograms in early breast cancer detection. By maintaining all of these protocols, an infrared breast thermogram acquisition setup has been established at the Regional Cancer Centre (RCC) of Agartala Government Medical College (AGMC), Tripura, India. Acquisition of a breast thermogram is followed by its interpretation to identify the presence of any abnormality. However, due to the presence of complex vascular patterns, accurate interpretation of breast thermograms is a very challenging task. The bilateral symmetry of the thermal patterns in each breast thermogram is quantitatively computed by statistical feature analysis. A series of statistical features is extracted from a set of 20 thermograms of both healthy and unhealthy subjects. Finally, the extracted features are analyzed for breast abnormality detection. The key contributions of this paper are: (a) the design of a standard protocol suite for accurate acquisition of breast thermograms, (b) the creation of a new breast thermogram dataset acquired under this protocol suite, and (c) statistical analysis of the thermograms for abnormality detection. In doing so, the proposed work can minimize the rate of false findings in breast thermograms and thus increase their utility in early breast cancer detection.

  18. Proposed military handbook for dynamic data acquisition and analysis - An invitation to review

    NASA Technical Reports Server (NTRS)

    Himelblau, Harry; Wise, James H.; Piersol, Allan G.; Grundvig, Max R.

    1990-01-01

    A draft Military Handbook prepared under the sponsorship of the USAF Space Division is presently being distributed throughout the U.S. for review by the aerospace community. This comprehensive document provides recommended guidelines for the acquisition and analysis of structural dynamics and aeroacoustic data, and is intended to reduce the errors and variability commonly found in flight, ground and laboratory dynamic test measurements. In addition to the usual variety of measurement problems encountered in the definition of dynamic loads, the development of design and test criteria, and the analysis of failures, special emphasis is given to certain state-of-the-art topics, such as pyroshock data acquisition and nonstationary random data analysis.

  19. Detailed design of the GOSAT DHF at NIES and data acquisition/processing/distribution strategy

    NASA Astrophysics Data System (ADS)

    Watanabe, Hiroshi; Ishihara, Hironari; Hayashi, Kenji; Kawazoe, Fumie; Kikuchi, Nobuyuki; Eguchi, Nawo; Matsunaga, Tsuneo; Yokota, Tatsuya

    2008-10-01

    The GOSAT Project (GOSAT stands for Greenhouse gases Observation SATellite) is a joint project of MOE (Ministry of the Environment), JAXA (Japan Aerospace Exploration Agency) and NIES (National Institute for Environmental Studies). Data acquired by TANSO-FTS (Fourier Transform Spectrometer) and TANSO-CAI (Cloud and Aerosol Imager) on GOSAT (TANSO stands for Thermal And Near infrared Sensor for carbon Observation) will be collected at the Tsukuba Space Center at JAXA. The Level 1A and 1B data of the FTS (interferograms and spectra, respectively) and the Level 1A data of CAI (uncorrected data) will be generated at JAXA and transferred to the GOSAT Data Handling Facility (DHF) at NIES for further processing. Radiometric and geometric corrections will be applied to CAI L1A data to generate CAI L1B data. From the CAI L1B data, cloud coverage and aerosol information (CAI Level 2 data) will be estimated. FTS data recognized as having low cloud coverage by CAI will be processed to generate column concentrations of carbon dioxide (CO2) and methane (CH4) (FTS Level 2 data). Level 3 data will be global maps of greenhouse gas column concentrations averaged in time and space. Level 4 data will be the global distribution of carbon sources and sinks estimated by an inverse model, together with the re-calculated forward model. The major data flow is also described. The Critical Design Review (CDR) of the DHF was completed in early July 2007 to prepare for the scheduled launch of GOSAT in early 2009. In this manuscript, major changes after the CDR are discussed, along with the FTS data acquisition scenario. The data products can be searched and will be opened to the public through the GOSAT DHF after the data validation process. The data acquisition plan is also discussed, covering lattice-point observation over land and sun-glint observation over water. The Principal Investigators who submitted a proposal for the Research Announcement will have a chance to request the

  20. A Pooled Analysis of the Effect of Condoms in Preventing HSV-2 Acquisition

    PubMed Central

    Martin, Emily T.; Krantz, Elizabeth; Gottlieb, Sami L.; Magaret, Amalia S.; Langenberg, Andria; Stanberry, Lawrence; Kamb, Mary; Wald, Anna

    2010-01-01

    Background The degree of effectiveness of condoms in preventing the transmission of herpes simplex virus type 2 (HSV-2) is uncertain. We performed a large pooled analysis to address this question. Methods We identified prospective studies with individual-level condom use data and laboratory-defined HSV-2 acquisition. Six studies were identified through a review of publications through 2007: three candidate HSV-2 vaccine studies, an HSV-2 drug study, an observational STD incidence study, and a behavioral STD intervention study. Study investigators provided us individual-level data to perform a pooled analysis. The effect of condom use was modeled using a continuous percentage of sex acts during which a condom was used and, alternatively, using the absolute number of unprotected sex acts. Results A total of 5,384 people who were HSV-2 negative at baseline contributed 2,040,894 follow-up days; 415 persons acquired laboratory-documented HSV-2 during follow-up. Consistent (100%) condom users had a 30% lower risk of HSV-2 acquisition compared to those who never used condoms (hazard ratio: 0.70; 95% confidence interval: 0.40, 0.94; p = 0.01). Risk of HSV-2 acquisition increased steadily and significantly with each unprotected sex act (hazard ratio: 1.16; 95% confidence interval: 1.08, 1.25; p < 0.001). Condom effectiveness did not vary by gender. Conclusions This is the largest analysis using prospective data to assess the effect of condoms in preventing HSV-2 acquisition. Although the magnitude of protection was not as large as has been observed with other STIs, we found that condoms offer moderate protection against HSV-2 acquisition in men and women. PMID:19597073
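    The per-act hazard ratio reported above implies a multiplicative (log-linear) risk model, in which each additional unprotected act scales the hazard by the same factor. A minimal sketch of that scaling, purely illustrative and not part of the pooled analysis itself:

```python
def cumulative_relative_hazard(per_act_hr: float, n_acts: int) -> float:
    """Relative hazard after n unprotected acts, assuming the per-act
    hazard ratio multiplies independently (log-linear model)."""
    return per_act_hr ** n_acts

# Reusing the reported per-act hazard ratio of 1.16 for illustration:
hr_per_act = 1.16
print(cumulative_relative_hazard(hr_per_act, 5))
```

    Under this assumption, five unprotected acts roughly double the relative hazard (1.16^5 ≈ 2.1), which is one way to read the "steady increase" reported in the results.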

  1. Identification of phosphorus deficiency responsive proteins in a high phosphorus acquisition soybean (Glycine max) cultivar through proteomic analysis.

    PubMed

    Sha, Aihua; Li, Ming; Yang, Pingfang

    2016-05-01

    As one of the major oil crops, soybean may be seriously affected by phosphorus deficiency in both yield and quality. Understanding the molecular basis of phosphorus uptake and utilization in soybean may help to develop phosphorus (P) efficient cultivars. For this purpose, we conducted a comparative proteomic analysis of a high-P-acquisition soybean cultivar, BX10, under low and high P conditions. A total of 61 unique proteins were identified as putative P-deficiency-responsive proteins. These proteins are involved in carbohydrate metabolism, protein biosynthesis/processing, energy metabolism, cellular processes, environmental defense/interaction, nucleotide metabolism, signal transduction, secondary metabolism and other metabolism-related processes. Several proteins involved in energy metabolism, cellular processes, and protein biosynthesis and processing were found to be up-regulated in both shoots and roots, whereas proteins involved in carbohydrate metabolism appeared to be down-regulated. These proteins are potential candidates for improving P acquisition. These findings provide a useful starting point for further research toward a more comprehensive understanding of the molecular mechanisms through which soybean adapts to P-deficiency conditions. PMID:26853500

  2. Data acquisition and analysis of the UNCOSS underwater explosive neutron sensor

    SciTech Connect

    Carasco, Cedric; Eleon, Cyrille; Perot, Bertrand; Boudergui, Karim; Kondrasovs, Vladimir; Corre, Gwenole; Normand, Stephane; Sannie, Guillaume; Woo, Romuald; Bourbotte, Jean-Michel

    2012-08-15

    The purpose of the FP7 UNCOSS project (Underwater Coastal Sea Surveyor, http://www.uncoss-project.org) is to develop a neutron-based underwater explosive sensor to detect unexploded ordnance lying on the sea bottom. The Associated Particle Technique is used to focus the inspection on a suspicious object located by optical and electromagnetic sensors and to determine whether there is an explosive charge inside. This paper presents the data acquisition electronics and data analysis software developed for this project. A field-programmable gate array that digitizes and processes the signal makes it possible to perform precise time-of-flight and gamma-ray energy measurements. The gamma-ray spectra are unfolded into pure elemental count proportions, mainly C, N, O, Fe, Al, Si, and Ca. The C, N, and O count fractions are converted into chemical proportions, taking into account the gamma-ray production cross sections, as well as neutron and photon attenuation in the different shields between the ROV (Remotely Operated Vehicle) and the explosive, such as the explosive's iron shell, seawater, and the ROV envelope. A two-dimensional (2D) barycentric representation of the C, N, and O proportions is built from their chemical ratios, and a 2D likelihood map is built from the associated statistical and systematic uncertainties. The threat level is evaluated from the best-matching materials of a database including explosives. (authors)
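    A 2D barycentric representation of three proportions maps a normalized (C, N, O) triple to a point inside a triangle whose vertices stand for the pure elements. The sketch below is an assumption-laden illustration of that mapping, not the project's actual coordinate convention: the vertex placement and the TNT (C7H5N3O6) example are invented for demonstration.

```python
import math

# Hypothetical equilateral-triangle vertices for the C, N and O axes
VERTICES = {
    "C": (0.0, 0.0),
    "N": (1.0, 0.0),
    "O": (0.5, math.sqrt(3) / 2),
}

def barycentric_point(c: float, n: float, o: float) -> tuple:
    """Map C/N/O count fractions to a 2D point inside the triangle."""
    total = c + n + o
    c, n, o = c / total, n / total, o / total  # normalize to unit sum
    x = c * VERTICES["C"][0] + n * VERTICES["N"][0] + o * VERTICES["O"][0]
    y = c * VERTICES["C"][1] + n * VERTICES["N"][1] + o * VERTICES["O"][1]
    return (x, y)

# Example: TNT (C7H5N3O6), using only its C, N and O atom counts
print(barycentric_point(7, 3, 6))
```

    Each candidate material occupies a fixed point in this triangle, so an unknown spectrum's (C, N, O) fractions can be compared against a database of such points, which is the intuition behind the likelihood map described above.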

  3. Process analysis using ion mobility spectrometry.

    PubMed

    Baumbach, J I

    2006-03-01

    Ion mobility spectrometry, originally used to detect chemical warfare agents, explosives and illegal drugs, is now frequently applied in the field of process analytics. The method combines high sensitivity (detection limits down to the ng-to-pg-per-liter and ppb(v)/ppt(v) ranges) and relatively low technical expenditure with high-speed data acquisition. In this paper, the working principles of IMS are summarized with respect to the advantages and disadvantages of the technique. Different ionization techniques, sample introduction methods and preseparation methods are considered. Proven applications of different types of ion mobility spectrometer (IMS) used at ISAS are discussed in detail: monitoring of gas-insulated substations, contamination in water, odorization of natural gas, human breath composition and metabolites of bacteria. The example applications relate to purity (gas-insulated substations), ecology (contamination of water resources), plant and personnel safety (odorization of natural gas), food quality control (molds and bacteria) and human health (breath analysis). PMID:16132133
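    The working principle behind drift-tube IMS can be illustrated with the textbook reduced-mobility relation K = L²/(V·t_d), normalized to standard pressure and temperature. This is a generic sketch of that standard formula, not code from the paper; the example tube parameters are invented:

```python
def reduced_mobility(drift_len_cm, drift_volt, drift_time_s,
                     pressure_torr=760.0, temp_k=298.15):
    """Reduced ion mobility K0 in cm^2/(V*s) from drift-tube parameters.

    K = L^2 / (V * t_d), then normalized to 760 Torr and 273.15 K.
    """
    k = drift_len_cm ** 2 / (drift_volt * drift_time_s)
    return k * (pressure_torr / 760.0) * (273.15 / temp_k)

# Hypothetical example: 8 cm tube, 2000 V, 12 ms drift time, ambient conditions
print(round(reduced_mobility(8.0, 2000.0, 0.012), 3))
```

    Because K0 is normalized for pressure and temperature, it is the quantity typically used to identify analytes across instruments and operating conditions.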

  4. Seismic acquisition and processing methodologies in overthrust areas: Some examples from Latin America

    SciTech Connect

    Tilander, N.G.; Mitchel, R.

    1996-08-01

    Overthrust areas represent some of the last frontiers in petroleum exploration today. Billion-barrel discoveries in the Eastern Cordillera of Colombia and the Monagas fold-thrust belt of Venezuela during the past decade have highlighted the potential rewards of overthrust exploration. However, the seismic data recorded in many overthrust areas are disappointingly poor. Challenges such as rough topography, complex subsurface structure, presence of high-velocity rocks at the surface, back-scattered energy and severe migration wavefronting continue to lower data quality and reduce interpretability. Lack of well/velocity control also reduces the reliability of depth estimations and migrated images. Failure to obtain satisfactory pre-drill structural images can easily result in costly wildcat failures. Advances in the methodologies used by Chevron for data acquisition, processing and interpretation have produced significant improvements in seismic data quality in Bolivia, Colombia and Trinidad. In this paper, seismic test results showing various swath geometries will be presented. We will also show recent examples of processing methods which have led to improved structural imaging. Rather than focusing on "black box" methodology, we will emphasize the cumulative effect of step-by-step improvements. Finally, the critical significance and interrelation of velocity measurements, modeling and depth migration will be explored. Pre-drill interpretations must ultimately encompass a variety of model solutions, and error bars should be established which realistically reflect the uncertainties in the data.

  5. A generic model for data acquisition: Connectionist methods of information processing

    NASA Astrophysics Data System (ADS)

    Ehrlich, Jacques

    1993-06-01

    EDDAKS (Event Driven Data Acquisition Kernel System), intended for the quality control of products created in industrial production processes, is proposed. It is capable of acquiring information about discrete-event systems by synchronizing to them via the events. EDDAKS consists of EdObjects, forming a hierarchy, which react to EdEvents and perform processing operations on messages. The hierarchy of EdObjects consists (from the bottom up) of the Sensor, the Phase, the Extracter, the Dynamic Spreadsheet, and EDDAKS itself. The first three levels contribute to building the internal representation: a state vector characterizing a product in the course of production. The Dynamic Spreadsheet is a parameterizable processing structure used to perform calculations on a set of internal representations in order to deliver the external representation to the user. A system intended for quality control of the products delivered by a concrete production plant was generated with EDDAKS and used to validate the approach. Processing methods using the multilayer perceptron model were also considered, and two contributions aimed at improving the performance of this network are proposed. One consists of implementing a conjugate gradient method whose effectiveness depends on the determination of an optimal gradient step, efficiently calculated by a linear search using a secant algorithm. The other is intended to reduce the connectivity of the network by adapting it to the problem to be solved: links having little or no activity are identified and destroyed, where activity is determined by evaluating the covariance between each of the inputs of a cell and its output. An experiment in which nonlinear prediction is applied to a civil engineering problem is described.

  6. Real-time multilevel process monitoring and control of CR image acquisition and preprocessing for PACS and ICU

    NASA Astrophysics Data System (ADS)

    Zhang, Jianguo; Wong, Stephen T. C.; Andriole, Katherine P.; Wong, Albert W. K.; Huang, H. K.

    1996-05-01

    The purpose of this paper is to present a control theory and a fault tolerance algorithm developed for real-time monitoring and control of the acquisition and preprocessing of computed radiographs for PACS and Intensive Care Unit operations. This monitoring and control system uses an event-driven, multilevel processing approach to remove computational bottlenecks and to improve system reliability. Its computational performance and processing reliability are evaluated and compared with those of the traditional single-level processing approach.

  7. Vocational Education Operations Analysis Process.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento. Vocational Education Services.

    This manual on the vocational education operations analysis process is designed to provide vocational administrators/coordinators with an internal device to collect, analyze, and display vocational education performance data. The first section describes the system and includes the following: analysis worksheet, data sources, utilization, system…

  8. Bacterial vaginosis and HIV acquisition: A meta-analysis of published studies

    PubMed Central

    Atashili, Julius; Poole, Charles; Ndumbe, Peter M; Adimora, Adaora A.; Smith, Jennifer S.

    2009-01-01

    Objectives To assess and summarize the published literature on the extent to which bacterial vaginosis (BV) may increase the risk of HIV acquisition. Design Meta-analysis of published studies. Methods MEDLINE and other electronic databases were systematically searched for eligible publications. The association between BV and incident HIV was separately analyzed from that between BV and prevalent HIV. The latter were further analyzed stratified by BV diagnostic method, HIV risk profile of the study population and whether or not adjusted estimates were presented. Results Twenty-three eligible publications were identified, including a total of 30,739 women. BV was associated with an increased risk of HIV acquisition in HIV-incidence studies (relative risk = 1.6, 95% CI: 1.2, 2.1). All but one of 21 HIV-prevalence studies reported estimates above the null. The latter results were heterogeneous and showed some evidence of funnel plot asymmetry, precluding the estimation of a single summary measure. The association between BV and HIV in prevalence studies appeared stronger for women without high-risk sexual behavior. Conclusions BV was consistently associated with an increased risk of HIV infection. High BV prevalence may result in a high number of HIV infections being attributable to BV. More prospective studies are needed to accurately evaluate the role of BV in HIV acquisition in low versus high risk women. Furthermore, randomized clinical trials may be worth considering to determine the effect of BV control measures on HIV acquisition. PMID:18614873
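    Summary measures in a meta-analysis like this are commonly obtained by fixed-effect inverse-variance pooling of log relative risks, weighting each study by the inverse of its variance. The sketch below illustrates that generic method with hypothetical study estimates; it is not the paper's data or its exact model:

```python
import math

def pooled_log_rr(log_rrs, ses):
    """Fixed-effect inverse-variance pooling of log relative risks.

    Returns the pooled log RR and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, se_pooled

# Three hypothetical study estimates: (relative risk, SE of log RR)
log_rrs = [math.log(1.5), math.log(1.8), math.log(1.4)]
ses = [0.20, 0.25, 0.15]
pooled, se = pooled_log_rr(log_rrs, ses)
lo, hi = math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
print(f"pooled RR = {math.exp(pooled):.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

    When, as the authors note, the study results are heterogeneous, a single fixed-effect summary like this is inappropriate, which is why they declined to report one for the prevalence studies.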

  9. In-Depth Analysis of Computer Memory Acquisition Software for Forensic Purposes.

    PubMed

    McDown, Robert J; Varol, Cihan; Carvajal, Leonardo; Chen, Lei

    2016-01-01

    Comparison studies on random access memory (RAM) acquisition tools are either limited in metrics or cover tools designed for older operating systems. Therefore, this study evaluates seven widely used shareware or freeware/open-source RAM acquisition forensic tools that are compatible with the latest 64-bit Windows operating systems. The tools' user interface capabilities, platform limitations, reporting capabilities, total execution time, shared and proprietary DLLs, modified registry keys, and invoked files during processing were compared. We observed that Windows Memory Reader and Belkasoft's Live Ram Capturer leave the fewest fingerprints in memory when loaded. On the other hand, ProDiscover and FTK Imager perform poorly in memory usage, processing time, DLL usage, and unwanted artifacts introduced to the system. While Belkasoft's Live Ram Capturer is the fastest at obtaining an image of the memory, ProDiscover takes the longest to do the same job. PMID:27405017

  10. Developmental trends in auditory processing can provide early predictions of language acquisition in young infants.

    PubMed

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R; Shao, Jie; Lozoff, Betsy

    2013-03-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with both Auditory Brainstem Response (ABR) and language assessments. At 6 weeks and/or 9 months of age, the infants underwent ABR testing using both a standard hearing screening protocol with 30 dB clicks and a second protocol using click pairs separated by 8, 16, and 64-ms intervals presented at 80 dB. We evaluated the effects of interval duration on ABR latency and amplitude elicited by the second click. At 9 months, language development was assessed via parent report on the Chinese Communicative Development Inventory - Putonghua version (CCDI-P). Wave V latency z-scores of the 64-ms condition at 6 weeks showed strong direct relationships with Wave V latency in the same condition at 9 months. More importantly, shorter Wave V latencies at 9 months showed strong relationships with the CCDI-P composite consisting of phrases understood, gestures, and words produced. Likewise, infants who had greater decreases in Wave V latencies from 6 weeks to 9 months had higher CCDI-P composite scores. Females had higher language development scores and shorter Wave V latencies at both ages than males. Interestingly, when the ABR Wave V latencies at both ages were taken into account, the direct effects of gender on language disappeared. In conclusion, these results support the importance of low-level auditory processing capabilities for early language acquisition in a population of typically developing young infants. Moreover, the auditory brainstem response in this paradigm shows promise as an electrophysiological marker to predict individual differences in language development in young children. PMID:23432827
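    The latency-language relationships reported above rest on correlating Wave V latency measures with CCDI-P scores. A minimal sketch of such a correlation with entirely hypothetical data (the values below are invented, not the study's):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Hypothetical data: shorter Wave V latencies pairing with higher CCDI-P scores
latency_ms = [6.1, 6.3, 6.0, 6.5, 5.9]
ccdi_score = [52, 45, 58, 40, 60]
print(round(pearson_r(latency_ms, ccdi_score), 2))
```

    A strongly negative coefficient, as in this toy example, is the pattern the study describes: faster brainstem responses accompany higher language scores.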

  11. Developmental Trends in Auditory Processing Can Provide Early Predictions of Language Acquisition in Young Infants

    PubMed Central

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R.; Shao, Jie; Lozoff, Betsy

    2012-01-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with both Auditory Brainstem Response (ABR) and language assessments. At 6 weeks and/or 9 months of age, the infants underwent ABR testing using both a standard hearing screening protocol with 30 dB clicks and a second protocol using click pairs separated by 8, 16, and 64-ms intervals presented at 80 dB. We evaluated the effects of interval duration on ABR latency and amplitude elicited by the second click. At 9 months, language development was assessed via parent report on the Chinese Communicative Development Inventory – Putonghua version (CCDI-P). Wave V latency z-scores of the 64-ms condition at 6 weeks showed strong direct relationships with Wave V latency in the same condition at 9 months. More importantly, shorter Wave V latencies at 9 months showed strong relationships with the CCDI-P composite consisting of phrases understood, gestures, and words produced. Likewise, infants who had greater decreases in Wave V latencies from 6 weeks to 9 months had higher CCDI-P composite scores. Females had higher language development scores and shorter Wave V latencies at both ages than males. Interestingly, when the ABR Wave V latencies at both ages were taken into account, the direct effects of gender on language disappeared. In conclusion, these results support the importance of low-level auditory processing capabilities for early language acquisition in a population of typically developing young infants. Moreover, the auditory brainstem response in this paradigm shows promise as an electrophysiological marker to predict individual differences in language development in young children. PMID:23432827

  12. An underground tale: contribution of microbial activity to plant iron acquisition via ecological processes

    PubMed Central

    Jin, Chong Wei; Ye, Yi Quan; Zheng, Shao Jian

    2014-01-01

    Background Iron (Fe) deficiency in crops is a worldwide agricultural problem. Plants have evolved several strategies to enhance Fe acquisition, but increasing evidence has shown that the intrinsic plant-based strategies alone are insufficient to avoid Fe deficiency in Fe-limited soils. Soil micro-organisms also play a critical role in plant Fe acquisition; however, the mechanisms behind their promotion of Fe acquisition remain largely unknown. Scope This review focuses on the possible mechanisms underlying the promotion of plant Fe acquisition by soil micro-organisms. Conclusions Fe-deficiency-induced root exudates alter the microbial community in the rhizosphere by modifying the physicochemical properties of soil, and/or by their antimicrobial and/or growth-promoting effects. The altered microbial community may in turn benefit plant Fe acquisition via production of siderophores and protons, both of which improve Fe bioavailability in soil, and via hormone generation that triggers the enhancement of Fe uptake capacity in plants. In addition, symbiotic interactions between micro-organisms and host plants could also enhance plant Fe acquisition, possibly including: rhizobium nodulation enhancing plant Fe uptake capacity and mycorrhizal fungal infection enhancing root length and the nutrient acquisition area of the root system, as well as increasing the production of Fe3+ chelators and protons. PMID:24265348

  13. Multifunctional data acquisition and analysis and optical sensors: a Bonneville Power Administration (BPA) update

    NASA Astrophysics Data System (ADS)

    Erickson, Dennis C.; Donnelly, Matt K.

    1995-04-01

    The authors present a design concept describing a multifunctional data acquisition and analysis architecture for advanced power system monitoring. The system is tailored to take advantage of the salient features of low energy sensors, particularly optical types. The discussion of the system concept and optical sensors is based on research at BPA and PNL and on progress made at existing BPA installations and other sites in the western power system.

  14. [Design of hand-held heart rate variability acquisition and analysis system].

    PubMed

    Li, Kaiyuan; Wang, Buqing; Wang, Weidong

    2012-07-01

    A design for a handheld heart rate variability acquisition and analysis system is proposed. The system collects and stores the patient's ECG every five minutes through electrodes touched by both hands, and then uploads the data to a PC through a USB port. Software written in LabVIEW analyzes the heart rate variability parameters; the parameter-calculation functions are programmed in MATLAB and generated as components. PMID:23189641
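    The abstract does not detail which HRV parameters the MATLAB components compute; typical time-domain candidates are SDNN and RMSSD. A minimal, language-neutral sketch of those two standard formulas, with an invented RR-interval series:

```python
import math

def hrv_time_domain(rr_intervals_ms):
    """Standard time-domain HRV parameters from an RR-interval series (ms):
    SDNN (overall variability) and RMSSD (beat-to-beat variability)."""
    n = len(rr_intervals_ms)
    mean_rr = sum(rr_intervals_ms) / n
    sdnn = math.sqrt(sum((rr - mean_rr) ** 2 for rr in rr_intervals_ms) / (n - 1))
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d ** 2 for d in diffs) / len(diffs))
    return {"mean_rr": mean_rr, "sdnn": sdnn, "rmssd": rmssd}

# Invented RR series (ms) standing in for a short five-minute recording
rr = [812, 830, 795, 840, 825, 810]
print({k: round(v, 1) for k, v in hrv_time_domain(rr).items()})
```

    SDNN uses the sample standard deviation of all intervals, while RMSSD uses successive differences, so the two capture slow and fast variability respectively.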

  15. Fluorescence2D: Software for Accelerated Acquisition and Analysis of Two-Dimensional Fluorescence Spectra

    PubMed Central

    Kovrigin, Evgenii L.

    2014-01-01

    Fluorescence2D is free software for the analysis of two-dimensional fluorescence spectra obtained using accelerated “triangular” acquisition schemes. The software is a combination of Python- and MATLAB-based programs that perform conversion of the triangular data, display of the two-dimensional spectra, extraction of 1D slices at different wavelengths, and output in various graphic formats. PMID:24984078

  16. Toxic School Sites in Los Angeles: Weaknesses in the Site Acquisition Process. Special Report of the Joint Legislative Audit Committee.

    ERIC Educational Resources Information Center

    California State Legislature, Sacramento. Joint Legislative Audit Committee.

    A special report of the California Legislature's Joint Legislative Audit Committee addresses the school site acquisition process to attempt to discern how the system has allowed a minimum of nine Los Angeles public schools to be built on toxic lands. The report examines two such sites, the Jefferson Middle School (JMS) and the combined elementary…

  17. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  18. Combining Contextual and Morphemic Cues Is Beneficial during Incidental Vocabulary Acquisition: Semantic Transparency in Novel Compound Word Processing

    ERIC Educational Resources Information Center

    Brusnighan, Stephen M.; Folk, Jocelyn R.

    2012-01-01

    In two studies, we investigated how skilled readers use contextual and morphemic information in the process of incidental vocabulary acquisition during reading. In Experiment 1, we monitored skilled readers' eye movements while they silently read sentence pairs containing novel and known English compound words that were either semantically…

  19. The Comparative Effects of Processing Instruction and Dictogloss on the Acquisition of the English Passive by Speakers of Turkish

    ERIC Educational Resources Information Center

    Uludag, Onur; Vanpatten, Bill

    2012-01-01

    The current study presents the results of an experiment investigating the effects of processing instruction (PI) and dictogloss (DG) on the acquisition of the English passive voice. Sixty speakers of Turkish studying English at university level were assigned to three groups: one receiving PI, the other receiving DG and the third serving as a…

  20. The Effects of Word Exposure Frequency and Elaboration of Word Processing on Incidental L2 Vocabulary Acquisition through Reading

    ERIC Educational Resources Information Center

    Eckerth, Johannes; Tavakoli, Parveneh

    2012-01-01

    Research on incidental second language (L2) vocabulary acquisition through reading has claimed that repeated encounters with unfamiliar words and the relative elaboration of processing these words facilitate word learning. However, so far both variables have been investigated in isolation. To help close this research gap, the current study…

  1. Sensor Acquisition for Water Utilities: Survey, Down Selection Process, and Technology List

    SciTech Connect

    Alai, M; Glascoe, L; Love, A; Johnson, M; Einfeld, W

    2005-06-29

    The early detection of the biological and chemical contamination of water distribution systems is a necessary capability for securing the nation's water supply. Current and emerging early-detection technology capabilities and shortcomings need to be identified and assessed to provide government agencies and water utilities with an improved methodology for assessing the value of installing these technologies. The Department of Homeland Security (DHS) has tasked a multi-laboratory team to evaluate current and future needs to protect the nation's water distribution infrastructure by supporting an objective evaluation of current and new technologies. The LLNL deliverable from this Operational Technology Demonstration (OTD) was to assist in the development of a technology acquisition process for a water distribution early warning system. The technology survey includes a review of previous sensor surveys and current test programs, together with a compiled database of relevant technologies. In the survey paper we discuss previous efforts by governmental agencies, research organizations, and private companies, and survey previous sensor studies on the use of Early Warning Systems (EWS), covering earlier surveys, testing programs, and response studies. The list of sensor technologies was ultimately developed to assist in the recommendation of candidate technologies for laboratory and field testing. A set of recommendations for future sensor selection efforts has been appended to this document, as has a down-selection example for a hypothetical water utility.

  2. Parameter identification of process simulation models as a means for knowledge acquisition and technology transfer

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Ifanti, Konstantina

    2012-12-01

    Process simulation models are usually empirical, and therefore have inherent difficulty serving as carriers for knowledge acquisition and technology transfer, since their parameters have no physical meaning that would facilitate verification of their dependence on production conditions; in such a case, a 'black box' regression model or a neural network might be used to simply connect input-output characteristics. In several cases, scientific/mechanistic models may prove valid, in which case parameter identification is required to find out which independent/explanatory variables and parameters each model parameter depends on. This is a difficult task, since the phenomenological level at which each parameter is defined is different. In this paper, we have developed a methodological framework in the form of an algorithmic procedure to solve this problem. The main parts of this procedure are: (i) stratification of relevant knowledge in discrete layers immediately adjacent to the layer to which the initial model under investigation belongs, (ii) design of the ontology corresponding to these layers, (iii) elimination of the less relevant parts of the ontology by thinning, (iv) retrieval of the stronger interrelations between the remaining nodes within the revised ontological network, and (v) parameter identification taking into account the most influential interrelations revealed in (iv). The functionality of this methodology is demonstrated with two representative case examples on wastewater treatment.

  3. Experimental studies on remanence acquisition processes and regional geomagnetic field variability from archeointensity studies

    NASA Astrophysics Data System (ADS)

    Mitra, Ritayan

    The dissertation comprises two separate topics. Chapters 2 and 3 are experimental studies on remanence acquisition processes. Chapters 4 and 5 investigate the geomagnetic field variability in Africa and India between 1000 BCE and 1000 CE. Chapter 2 is a study in which the role of flocculation in sedimentary magnetization is analyzed with the help of laboratory redeposition experiments and a simple numerical model. At small floc sizes, DRM acquisition is likely to be non-linear, but it may record directions with higher fidelity. In environments with bigger flocs, the sediments are likely to record either intensities or directions with high fidelity, but not both. Flocculation may also inhibit a large fraction of magnetic grains from contributing to the net remanence, which might have consequences for intensity normalization in sediments. Chapter 3 presents a fresh perspective on the long-standing debate over the nature of magnetocrystalline anisotropy in Mid-Ocean Ridge Basalts (MORBs). A new parameter, IRAT, defined as the ratio of the isothermal remanences in antiparallel directions, is used to differentiate between uniaxial single-domain grains (IRAT ≈ 1) and multiaxial single-domain grains (IRAT < 1). The theoretical predictions were first validated with standard samples, and then multiple MORB samples were analyzed. The observed IRAT ratios indicate a dominant non-uniaxial anisotropy in the MORBs. Chapters 4 and 5 are archeointensity studies from two data-poor regions of the world, viz. Africa and India. With stringent data selection criteria and well-established archeological constraints, these datasets provide important constraints on the field intensity from 1000 BCE to 1000 CE in Africa and 500 BCE to 1000 CE in India. The African dataset has a higher age resolution than the Indian dataset; it matches well with the global CALS3k.4 model and shows significant non-axial-dipolar contribution in the region. The Indian dataset is not of a similar…
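
    The IRAT parameter described in Chapter 3 can be illustrated with a toy calculation. The measurement values and the classification cutoff below are hypothetical, not taken from the dissertation:

```python
def irat(m_first, m_antiparallel):
    """IRAT: ratio of isothermal remanence magnitudes acquired in antiparallel directions."""
    return abs(m_antiparallel) / abs(m_first)

# Uniaxial single-domain grains reverse completely when the field is reversed,
# so the two remanences match in magnitude and IRAT ~ 1; multiaxial anisotropy
# leaves part of the moment unreversed, giving IRAT < 1.
ratio = irat(1.00, 0.82)            # hypothetical MORB measurement
dominant_uniaxial = ratio > 0.95    # illustrative cutoff, not from the study
```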

  4. Explaining the "Natural Order of L2 Morpheme Acquisition" in English: A Meta-Analysis of Multiple Determinants

    ERIC Educational Resources Information Center

    Goldschneider, Jennifer M.; DeKeyser, Robert M.

    2005-01-01

    This meta-analysis pools data from 25 years of research on the order of acquisition of English grammatical morphemes by students of English as a second language (ESL). Some researchers have posited a "natural" order of acquisition common to all ESL learners, but no single cause has been shown for this phenomenon. Our study investigated whether a…

  5. Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.

    PubMed

    Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar

    2016-02-01

    In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [(18)F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real-time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼ 500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5 × higher for the LS- and ML-based CPEEA.
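
    The COG-based position estimation mentioned above can be illustrated with a toy example: the scintillation position is estimated as the count-weighted centroid of the pixel coordinates. The pixel grid and counts below are hypothetical, not from the evaluated detector:

```python
def cog_position(counts):
    """Center-of-gravity estimate of the scintillation position on a pixel grid.

    counts[y][x] is the photon count of the detector pixel at (x, y).
    """
    total = sum(sum(row) for row in counts)
    x_cog = sum(x * c for row in counts for x, c in enumerate(row)) / total
    y_cog = sum(y * c for y, row in enumerate(counts) for c in row) / total
    return x_cog, y_cog

# Hypothetical 3x3 tile with the light spot centered on the middle pixel
counts = [[1, 2, 1],
          [2, 8, 2],
          [1, 2, 1]]
position = cog_position(counts)
```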

  6. System design, development, and production process modeling: A versatile and powerful acquisition management decision support tool

    SciTech Connect

    Rafuse, H.E.

    1996-12-31

    A series of studies have been completed on the manufacturing operations of light, medium, and heavy tactical vehicle system producers to facilitate critical system acquisition resource decisions by the United States Army Program Executive Officer, Tactical Wheeled Vehicles. The principal programs were the Family of Medium Tactical Vehicles (FMTV) production programs at Stewart & Stevenson Services, Inc.; the heavy TWV production programs at the Oshkosh Truck Corporation in Oshkosh, Wisconsin; and the light TWV and 2.5 ton remanufacturing production programs at the AM General Corporation in South Bend, Indiana. Each contractor's production scenarios were analyzed and modeled to accurately quantify the relationship between production rates and unit costs. Specific objectives included identifying (1) Minimum Sustaining Rates to support current and future budgetary requirements and resource programming for potential follow-on procurements, (2) thresholds where production rate changes significantly affect unit costs, and (3) critical production program factors and their impacts to production rate versus unit cost relationships. Two different techniques were utilized initially in conducting the analyses. One technique principally focused on collecting and analyzing applicable historical production program information, where available, to develop a statistical predictive model. A second and much more exhaustive technique focused on a detailed modeling of each contractor's production processes, flows, and operations. A standard architecture of multiple linked functional modules was used for each process model. Using the standard architecture, the individual modules were tailored to specific contractor operations. Each model contains detailed information on manpower, burden rates, material, material price/quantity relationships, capital, manufacturing support, program management, and all related direct and indirect costs applicable to the production programs.

  7. A software surety analysis process

    SciTech Connect

    Trauth, S.; Tempel, P.

    1995-11-01

    As part of the High Consequence System Surety project, this work was undertaken to explore one approach to conducting a surety theme analysis for a software-driven system. Originally, plans were to develop a theoretical approach to the analysis and then to validate and refine this process by applying it to the software being developed for the Weight and Leak Check System (WALS), an automated nuclear weapon component handling system. As with the development of the higher-level High Consequence System Surety Process, this work was not completed due to changes in funding levels. This document describes the software analysis process, discusses its application in a software environment, and outlines next steps that could be taken to further develop and apply the approach to future projects.

  8. Using predictive uncertainty analysis to optimise tracer test design and data acquisition

    NASA Astrophysics Data System (ADS)

    Wallis, Ilka; Moore, Catherine; Post, Vincent; Wolf, Leif; Martens, Evelien; Prommer, Henning

    2014-07-01

    …processes, followed by methane. Temperature data was assessed as the least informative of the solute tracers. However, taking the costs of data acquisition into account, it could be shown that temperature data, when used in conjunction with other tracers, was a valuable and cost-effective marker species due to temperature's low cost-to-worth ratio. In contrast, the high cost of acquiring methane data, compared with its muted worth, highlighted methane's unfavourable return on investment. Areas of optimal monitoring bore position, as well as optimal numbers of bores for the investigated injection site, were also established. The proposed tracer test optimisation is carried out through the application of commonly used groundwater flow and transport models in conjunction with publicly available tools for predictive uncertainty analysis, providing modellers and practitioners with a powerful yet efficient and cost-effective tool which is generally applicable and easily transferable from the present study to many applications beyond the case study of injection of treated CSG produced water.

  9. Video-task acquisition in rhesus monkeys (Macaca mulatta) and chimpanzees (Pan troglodytes): a comparative analysis

    NASA Technical Reports Server (NTRS)

    Hopkins, W. D.; Washburn, D. A.; Hyatt, C. W.; Rumbaugh, D. M. (Principal Investigator)

    1996-01-01

    This study describes video-task acquisition in two nonhuman primate species. The subjects were seven rhesus monkeys (Macaca mulatta) and seven chimpanzees (Pan troglodytes). All subjects were trained to manipulate a joystick which controlled a cursor displayed on a computer monitor. Two criterion levels were used: one based on conceptual knowledge of the task and one based on motor performance. Chimpanzees and rhesus monkeys attained criterion in a comparable number of trials using a conceptually based criterion. However, using a criterion based on motor performance, chimpanzees reached criterion significantly faster than rhesus monkeys. Analysis of error patterns and latency indicated that the rhesus monkeys had a larger asymmetry in response bias and were significantly slower in responding than the chimpanzees. The results are discussed in terms of the relation between object manipulation skills and video-task acquisition.

  10. Hospital integration and vertical consolidation: an analysis of acquisitions in New York State.

    PubMed

    Huckman, Robert S

    2006-01-01

    While prior studies tend to view hospital integration through the lens of horizontal consolidation, I provide an analysis of its vertical aspects. I examine the effect of hospital acquisitions in New York State on the distribution of market share for major cardiac procedures across providers in target markets. I find evidence of benefits to acquirers via business stealing, with the resulting redistribution of volume across providers having small effects, if any, on total welfare with respect to cardiac care. The results of this analysis -- along with similar assessments for other services -- can be incorporated into future studies of hospital consolidation. PMID:16325946

  11. The Earthscope USArray Array Network Facility (ANF): Evolution of Data Acquisition, Processing, and Storage Systems

    NASA Astrophysics Data System (ADS)

    Davis, G. A.; Battistuz, B.; Foley, S.; Vernon, F. L.; Eakins, J. A.

    2009-12-01

    Since April 2004 the Earthscope USArray Transportable Array (TA) network has grown to over 400 broadband seismic stations that stream multi-channel data in near real-time to the Array Network Facility in San Diego. In total, over 1.7 terabytes per year of 24-bit, 40-samples-per-second seismic and state-of-health data are recorded from the stations. The ANF provides analysts access to real-time and archived data, as well as state-of-health data, metadata, and interactive tools for station engineers and the public via a website. Additional processing and recovery of missing data from on-site recorders (balers) at the stations is performed before the final data is transmitted to the IRIS Data Management Center (DMC). Assembly of the final data set requires additional storage and processing capabilities to combine the real-time data with baler data. The infrastructure supporting these diverse computational and storage needs currently consists of twelve virtualized Sun Solaris Zones executing on nine physical server systems. The servers are protected against failure by redundant power, storage, and networking connections. Storage needs are provided by a hybrid iSCSI and Fibre Channel Storage Area Network (SAN) with access to over 40 terabytes of RAID 5 and 6 storage. Processing tasks are assigned to systems based on parallelization and floating-point calculation needs. On-site buffering at the data-loggers provides protection in case of short-term network or hardware problems, while backup acquisition systems at the San Diego Supercomputer Center and the DMC protect against catastrophic failure of the primary site. Configuration management and monitoring of these systems is accomplished with open-source (Cfengine, Nagios, Solaris Community Software) and commercial tools (Intermapper). In the evolution from a single server to multiple virtualized server instances, Sun Cluster software was evaluated and found to be unstable in our environment. Shared filesystem…

  12. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    SciTech Connect

    Schickert, Martin

    2015-03-31

    Ultrasonic testing systems using transducer arrays and the SAFT (Synthetic Aperture Focusing Technique) reconstruction allow for imaging the internal structure of concrete elements. At one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting detection and localization of objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized which differ in terms of the measuring and computational effort and the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm which is implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (TFM/FMC), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained at test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS is used for the measurements which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results for the parameters image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of application characteristics of the SAFT variants.
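
    The single-channel SAFT reconstruction described above is, at its core, a delay-and-sum: for each image point, every A-scan is sampled at the round-trip travel time from its transducer position to that point, and the samples are summed so that echoes from a real reflector add coherently. A minimal 2-D Python sketch (geometry, wave speed, and sampling rate are illustrative assumptions):

```python
import math

def saft_pixel(ascans, positions, c, fs, pixel):
    """Delay-and-sum SAFT amplitude at one image point (single-channel scheme)."""
    px, pz = pixel
    acc = 0.0
    for ascan, (tx, tz) in zip(ascans, positions):
        t = 2.0 * math.hypot(px - tx, pz - tz) / c  # round-trip travel time
        k = round(t * fs)                           # nearest recorded sample
        if 0 <= k < len(ascan):
            acc += ascan[k]
    return acc

# Synthetic example: point reflector 0.10 m below the surface, 9 scan positions
c, fs = 2600.0, 1.0e6              # assumed wave speed (m/s) and sampling rate (Hz)
reflector = (0.0, 0.10)
positions = [(-0.04 + 0.01 * i, 0.0) for i in range(9)]
ascans = []
for tx, tz in positions:
    t = 2.0 * math.hypot(reflector[0] - tx, reflector[1] - tz) / c
    a = [0.0] * 200
    a[round(t * fs)] = 1.0         # idealized echo pulse
    ascans.append(a)

focus = saft_pixel(ascans, positions, c, fs, reflector)    # coherent sum
away = saft_pixel(ascans, positions, c, fs, (0.03, 0.05))  # defocused point
```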

  13. Proceedings of the XIIIth IAGA Workshop on Geomagnetic Observatory Instruments, Data Acquisition, and Processing

    USGS Publications Warehouse

    Love, Jeffrey J.

    2009-01-01

    The thirteenth biennial International Association of Geomagnetism and Aeronomy (IAGA) Workshop on Geomagnetic Observatory Instruments, Data Acquisition and Processing was held in the United States for the first time on June 9-18, 2008. Hosted by the U.S. Geological Survey's (USGS) Geomagnetism Program, the workshop's measurement session was held at the Boulder Observatory and the scientific session was held on the campus of the Colorado School of Mines in Golden, Colorado. More than 100 participants came from 36 countries and 6 continents. Preparation for the workshop began when the USGS Geomagnetism Program agreed, at the close of the twelfth workshop in Belsk, Poland, in 2006, to host the next workshop. Working under the leadership of Alan Berarducci, who served as the chairman of the local organizing committee, and Tim White, who served as co-chairman, preparations began in 2007. The Boulder Observatory was extensively renovated and additional observation piers were installed. Meeting space on the Colorado School of Mines campus was arranged, and considerable planning was devoted to managing the many large and small issues that accompany an international meeting. Without the devoted efforts of both Alan and Tim, other Geomagnetism Program staff, and our partners at the Colorado School of Mines, the workshop simply would not have occurred. We express our thanks to Jill McCarthy, the USGS Central Region Geologic Hazards Team Chief Scientist; Carol A. Finn, the Group Leader of the USGS Geomagnetism Program; the USGS International Office; and Melody Francisco of the Office of Special Programs and Continuing Education of the Colorado School of Mines. We also thank the student employees that the Geomagnetism Program has had over the years and leading up to the time of the workshop. For preparation of the proceedings, thanks go to Eddie and Tim. And, finally, we thank our sponsors, the USGS, IAGA, and the Colorado School of Mines.

  14. Three-dimensional ultrasonic imaging of concrete elements using different SAFT data acquisition and processing schemes

    NASA Astrophysics Data System (ADS)

    Schickert, Martin

    2015-03-01

    Ultrasonic testing systems using transducer arrays and the SAFT (Synthetic Aperture Focusing Technique) reconstruction allow for imaging the internal structure of concrete elements. At one-sided access, three-dimensional representations of the concrete volume can be reconstructed in relatively great detail, permitting detection and localization of objects such as construction elements, built-in components, and flaws. Different SAFT data acquisition and processing schemes can be utilized which differ in terms of the measuring and computational effort and the reconstruction result. In this contribution, two methods are compared with respect to their principle of operation and their imaging characteristics. The first method is the conventional single-channel SAFT algorithm which is implemented using a virtual transducer that is moved within a transducer array by electronic switching. The second method is the Combinational SAFT algorithm (C-SAFT), also named Sampling Phased Array (SPA) or Full Matrix Capture/Total Focusing Method (TFM/FMC), which is realized using a combination of virtual transducers within a transducer array. Five variants of these two methods are compared by means of measurements obtained at test specimens containing objects typical of concrete elements. The automated SAFT imaging system FLEXUS is used for the measurements which includes a three-axis scanner with a 1.0 m × 0.8 m scan range and an electronically switched ultrasonic array consisting of 48 transducers in 16 groups. On the basis of two-dimensional and three-dimensional reconstructed images, qualitative and some quantitative results for the parameters image resolution, signal-to-noise ratio, measurement time, and computational effort are discussed in view of application characteristics of the SAFT variants.

  15. Data acquisition and analysis of the UNCOSS underwater explosive neutron sensor

    SciTech Connect

    Carasco, C.; Eleon, C.; Perot, B.; Boudergui, K.; Kondrasovs, V.; Corre, G.; Normand, S.; Sannie, G.; Woo, R.; Bourbotte, J. M.

    2011-07-01

    The purpose of the FP7 UNCOSS project (Underwater Coastal Sea Surveyor, http://www.uncoss-project.org) is to develop a neutron-based underwater explosive sensor to detect unexploded ordnance lying on the sea bottom. The Associated Particle Technique is used to focus the inspection on a suspicious object located by optical and electromagnetic sensors and to determine if there is an explosive charge inside. This paper presents the data acquisition electronics and data analysis software which have been developed for this project. The electronics digitize and process the signal in real time based on a field programmable gate array structure to perform precise time-of-flight and gamma-ray energy measurements. UNCOSS software offers the basic tools to analyze the time-of-flight and energy spectra of the interrogated object. It allows the gamma-ray spectrum to be unfolded into pure elemental count proportions, mainly C, N, O, Fe, Al, Si, and Ca. The C, N, and O count fractions are converted into chemical proportions by taking into account the gamma-ray production cross sections, as well as neutron and photon attenuation in the different shields between the ROV (Remotely Operated Vehicle) and the explosive, such as the explosive iron shell, seawater, and ROV envelope. These chemical ratios are plotted in a two-dimensional (2D) barycentric representation to position the measured point with respect to common explosives. The systematic uncertainties due to the above attenuation effects and counting statistical fluctuations are combined with a Monte Carlo method to provide a 3D uncertainty area in a barycentric plot, which allows the most probable detected materials to be determined in order to make a decision about the presence of explosive. (authors)
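
    The two-dimensional barycentric representation of the C, N, and O count fractions can be sketched with the standard ternary-plot mapping. The vertex convention below is an assumption; the C:N:O atomic ratio of TNT (C7H5N3O6) is used only as a reference composition:

```python
import math

def barycentric_to_xy(c, n, o):
    """Map C/N/O count fractions onto a 2-D ternary plot.

    Vertex convention (an assumption, not from the paper): C at (0, 0),
    N at (1, 0), O at the apex (0.5, sqrt(3)/2). Fractions are normalized first.
    """
    total = c + n + o
    c, n, o = c / total, n / total, o / total
    return n + 0.5 * o, o * math.sqrt(3.0) / 2.0

# Atomic C:N:O ratio of TNT (C7H5N3O6) as a reference point in the plot
x, y = barycentric_to_xy(7, 3, 6)
```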

  16. Analysis and decision document in support of acquisition of steam supply for the Hanford 200 Area

    SciTech Connect

    Brown, D.R.; Daellenbach, K.K.; Hendrickson, P.L.; Kavanaugh, D.C.; Reilly, R.W.; Shankle, D.L.; Smith, S.A.; Weakley, S.A.; Williams, T.A. ); Grant, T.F. )

    1992-02-01

    The US Department of Energy (DOE) is now evaluating its facility requirements in support of its cleanup mission at Hanford. One of the early findings is that the 200-Area steam plants, constructed in 1943, will not meet future space heating and process needs. Because the 200 Area will serve as the primary area for waste treatment and long-term storage, a reliable steam supply is a critical element of Hanford operations. This Analysis and Decision Document (ADD) is a preliminary review of the steam supply options available to the DOE. The ADD contains a comprehensive evaluation of the two major acquisition options: line item versus privatization. It addresses the life-cycle costs associated with each alternative, as well as factors such as contracting requirements and the impact of market, safety, security, and regulatory issues. Specifically, this ADD documents current and future steam requirements for the 200 Area, describes alternatives available to DOE for meeting these requirements, and compares the alternatives across a number of decision criteria, including life-cycle cost. DOE has currently limited the ADD evaluation alternatives to replacing central steam plants rather than expanding the study to include alternative heat sources, such as a distributed network of boilers or heat pumps. Thirteen project alternatives were analyzed in the ADD. One of the alternatives was the rehabilitation of the existing 200-East coal-fired facility. The other twelve alternatives are combinations of (1) coal- or gas-fueled plants, (2) steam-only or cogeneration facilities, (3) primary or secondary cogeneration of electricity, and (4) public or private ownership.

  17. Logistics Process Analysis Tool

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, with the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  18. A real time dynamic data acquisition and processing system for velocity, density, and total temperature fluctuation measurements

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1991-01-01

    The real-time Dynamic Data Acquisition and Processing System (DDAPS) is described, which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in the context of the wind tunnel environment. The DDAPS replaces both a recording mechanism and a separate data processing system. DDAPS receives input from hot wire anemometers. Amplifiers and filters condition the signals with computer-controlled modules. The analog signals are simultaneously digitized and digitally recorded on disk. Automatic acquisition collects necessary calibration and environment data. Hot wire sensitivities are generated and applied to the hot wire data to compute fluctuations. The presentation of the raw and processed data is accomplished on demand. The interface to DDAPS is described along with the internal mechanisms of DDAPS. A summary of operations relevant to the use of the DDAPS is also provided.
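
    Applying hot-wire sensitivities to compute fluctuations can be viewed as solving a small linear system: each wire's normalized voltage fluctuation is modeled as a weighted sum of the velocity, density, and total-temperature fluctuations, with weights given by calibration sensitivities. A sketch under that assumption (the sensitivity matrix and voltage values are invented for illustration):

```python
def solve3(a, b):
    """Solve a 3x3 linear system a @ x = b by Cramer's rule."""
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det3(a)
    return [det3([[a[i][k] if k != j else b[i] for k in range(3)]
                  for i in range(3)]) / d
            for j in range(3)]

# Rows: one hot wire each; columns: sensitivities to (u', rho', Tt')
S = [[0.9, 0.5, -0.4],   # invented calibration values
     [0.3, 0.8, -0.6],
     [0.1, 0.2, -0.9]]
e = [0.012, 0.010, -0.005]          # measured voltage fluctuations (invented)
u_f, rho_f, tt_f = solve3(S, e)     # recovered fluctuation components
```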

  19. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  20. Improvement of web-based data acquisition and management system for GOSAT validation lidar data analysis

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Takubo, Shoichiro; Kawasaki, Takeru; Abdullah, Indra Nugraha; Uchino, Osamu; Morino, Isamu; Yokota, Tatsuya; Nagai, Tomohiro; Sakai, Tetsu; Maki, Takashi; Arai, Kohei

    2013-01-01

    A web-based data acquisition and management system for GOSAT (Greenhouse gases Observing SATellite) validation lidar data analysis has been developed. The system consists of a data acquisition sub-system (DAS) and a data management sub-system (DMS). The DAS, written in Perl, acquires AMeDAS (Automated Meteorological Data Acquisition System) ground-level local meteorological data, GPS radiosonde upper-air meteorological data, ground-level oxidant data, skyradiometer data, skyview camera images, meteorological satellite IR image data, and GOSAT validation lidar data. The DMS, written in PHP, displays satellite-pass dates and all acquired data. In this article, we briefly describe improvements made for higher performance and data usability. The DAS now automatically calculates molecular number density profiles from the GPS radiosonde upper-air meteorological data and the U.S. Standard Atmosphere model. Predicted ozone density profile images above Saga city are also calculated, using the Meteorological Research Institute (MRI) chemistry-climate model version 2, for comparison with the actual ozone DIAL data.
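The number-density calculation mentioned above reduces, in the simplest case, to the ideal gas law applied to a model atmosphere. A minimal sketch, assuming the 1976 U.S. Standard Atmosphere troposphere (below 11 km) rather than the system's actual radiosonde-driven code:

```python
import math

# Sketch of a molecular number density profile using the U.S. Standard
# Atmosphere troposphere model and the ideal gas law n = P / (k_B * T).
# Constants follow the 1976 standard; this is an illustration only.

K_B = 1.380649e-23            # Boltzmann constant, J/K
T0, P0 = 288.15, 101325.0     # sea-level temperature (K) and pressure (Pa)
LAPSE = 0.0065                # tropospheric lapse rate, K/m
G, M, R = 9.80665, 0.0289644, 8.31446  # gravity, molar mass of air, gas constant

def number_density(h_m):
    """Molecular number density (molecules/m^3) at altitude h (m), h < 11 km."""
    t = T0 - LAPSE * h_m
    p = P0 * (t / T0) ** (G * M / (R * LAPSE))
    return p / (K_B * t)

for h in (0, 5000, 10000):
    print(h, number_density(h))
```

A radiosonde-driven version would simply substitute the measured pressure and temperature profile for the model values.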

  1. Studies of the reproducibility, acquisition and analysis of gastric emptying studies in pediatric population

    SciTech Connect

    Yoo, J.H.K.; Rosen, P.R.

    1984-01-01

    The acquisition, analysis, and reproducibility of gastric emptying data in a pediatric population were evaluated by obtaining data simultaneously with anterior and posterior gamma camera detectors, by repeating studies in patients, and by applying power exponential analysis in addition to conventional monoexponential methodology. Thirteen patients with a variety of gastroesophageal pathologies were studied with simultaneous anterior and posterior gamma camera data acquisition. Excluding 4 subjects with substantial emesis, there was no statistical difference between data obtained anteriorly and posteriorly. The anterior scan in general revealed more rapid initial emptying than the posterior scan, resulting in a smaller shape factor (S) when power exponential function analysis was employed. T1/2, using either simple monoexponential or power exponential calculations, showed no difference between data obtained anteriorly and posteriorly. T3/4 showed larger values in posteriorly obtained data than in anteriorly obtained data. Seven patients had repetitive studies performed at intervals of 1-9 days; the data so obtained showed no statistical difference in T1/2, T3/4, or S, derived either by single exponential or power exponential analysis. The authors therefore conclude that gastric emptying data in a pediatric age group appear to be reproducible in repetitive studies, and that there appears to be no difference between data acquired anteriorly and posteriorly. Power exponential analysis of gastric emptying data may augment the description of the data by providing a quantitative expression of a multiexponential function.
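The power exponential model referenced above can be sketched briefly. Assuming the common Elashoff-type form, the fraction of activity retained is y(t) = exp(-(k·t)^s), where s is the shape factor (s = 1 reduces to the simple monoexponential); the convention below that T3/4 is the time for 75% of the activity to empty is an illustrative assumption, as are the parameter values.

```python
import math

# Illustrative power exponential gastric emptying model (not this study's
# code): fraction retained y(t) = exp(-(k*t)**s), s = shape factor.

def retained(t, k, s):
    """Fraction of activity still in the stomach at time t."""
    return math.exp(-((k * t) ** s))

def time_to_empty(fraction_emptied, k, s):
    """Time at which the given fraction of activity has left the stomach."""
    return (-math.log(1.0 - fraction_emptied)) ** (1.0 / s) / k

k, s = 0.02, 1.0                          # example rate (1/min) and shape factor
t_half = time_to_empty(0.50, k, s)        # T1/2
t_three_quarters = time_to_empty(0.75, k, s)  # T3/4 (75% emptied, by assumption)
print(t_half, t_three_quarters)
```

With s = 1 this reproduces the monoexponential T1/2 = ln 2 / k; a smaller s (as seen in the anterior scans) stretches the early part of the curve.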

  2. The Probabilistic Analysis of Language Acquisition: Theoretical, Computational, and Experimental Analysis

    ERIC Educational Resources Information Center

    Hsu, Anne S.; Chater, Nick; Vitanyi, Paul M. B.

    2011-01-01

    There is much debate over the degree to which language learning is governed by innate language-specific biases, or acquired through cognition-general principles. Here we examine the probabilistic language acquisition hypothesis on three levels: We outline a novel theoretical result showing that it is possible to learn the exact "generative model"…

  3. Models, Processes, Principles, and Strategies: Second Language Acquisition in and out of the Classroom.

    ERIC Educational Resources Information Center

    Andersen, Roger W.

    1988-01-01

    A discussion of research on naturalistic second language acquisition (SLA) focuses on its relationship to the foreign language classroom context. It is argued that to attempt to relate natural SLA to classroom foreign language learning (FLL), a coherent and consistent theoretical framework is needed. The Cognitive-Interactionist Model is developed…

  4. The Representation and Processing of Familiar Faces in Dyslexia: Differences in Age of Acquisition Effects

    ERIC Educational Resources Information Center

    Smith-Spark, James H.; Moore, Viv

    2009-01-01

    Two under-explored areas of developmental dyslexia research, face naming and age of acquisition (AoA), were investigated. Eighteen dyslexic and 18 non-dyslexic university students named the faces of 50 well-known celebrities, matched for facial distinctiveness and familiarity. Twenty-five of the famous people were learned early in life, while the…

  5. Directed Blogging with Community College ESL Students: Its Effects on Awareness of Language Acquisition Processes

    ERIC Educational Resources Information Center

    Johnson, Cathy

    2012-01-01

    English as a Second Language (ESL) students often have problems progressing in their acquisition of the language and frequently do not know how to solve this dilemma. Many of them think of their second language studies as just another school subject that they must pass in order to move on to the next level, so few of them realize the metacognitive…

  6. Optionality in Second Language Acquisition: A Generative, Processing-Oriented Account

    ERIC Educational Resources Information Center

    Truscott, John

    2006-01-01

    The simultaneous presence in a learner's grammar of two features that should be mutually exclusive (optionality) typifies second language acquisition. But generative approaches have no good means of accommodating the phenomenon. The paper proposes one approach, based on Truscott and Sharwood Smith's (2004) MOGUL framework. In this framework,…

  7. Laser velocimeter data acquisition and real time processing using a microcomputer

    NASA Technical Reports Server (NTRS)

    Meyers, James F.

    1988-01-01

    An evolutionary data acquisition system for laser velocimeter applications is presented. The system uses a laser velocimeter (autocovariance) buffer interface to acquire the data, a WORM optical disk for storage, and a high-speed microcomputer for real time statistical computations.

  8. Input-Based Tasks and the Acquisition of Vocabulary and Grammar: A Process-Product Study

    ERIC Educational Resources Information Center

    Shintani, Natsuko

    2012-01-01

    The study reported in this article investigated the use of input-based tasks with young, beginner learners of English as a second language by examining both learning outcomes (i.e. acquisition) and the interactions that resulted from implementing the tasks. The participants were 15 learners, aged six, with no experience of second language (L2)…

  9. Transition of NOAA's GPS-Met Data Acquisition and Processing System to the Commercial Sector

    NASA Astrophysics Data System (ADS)

    Jackson, M. E.; Holub, K.; Callahan, W.; Blatt, S.

    2014-12-01

    In April of 2014, NOAA/OAR/ESRL Global Systems Division (GSD) and Trimble, in collaboration with Earth Networks, Inc. (ENI), signed a Cooperative Research and Development Agreement (CRADA) to transfer the existing NOAA GPS-Met Data Acquisition and Processing System (GPS-Met DAPS) technology to a commercial Trimble/ENI partnership. NOAA's GPS-Met DAPS is currently operated in a pseudo-operational mode but has proven highly reliable, running at over 95% uptime. The DAPS uses the GAMIT software to ingest dual-frequency carrier phase GPS/GNSS observations and ancillary information, such as real-time satellite orbits, to estimate zenith tropospheric delays (ZTD) and, where surface MET data are available, retrieve integrated precipitable water vapor (PWV). The NOAA data and products are made available to end users in near real time. The Trimble/ENI partnership will use the Trimble Pivot™ software with the Atmosphere App to calculate zenith tropospheric delay (ZTD), tropospheric slant delays, and integrated precipitable water vapor (PWV). Evaluation of the Trimble software is underway, starting with a comparison of ZTD and PWV values determined from GPS stations located near NOAA Radiosonde Observation (Upper-Air Observation) launch sites. A success metric was established that requires Trimble's PWV estimates to match ESRL/GSD's to within 1.5 mm 95% of the time, which corresponds to a ZTD uncertainty of less than 10 mm 95% of the time. Initial results indicate that the Trimble/ENI data meet and exceed the ZTD metric, but for some stations PWV estimates are out of specification. These discrepancies are primarily due to how offsets between MET and GPS stations are handled and are easily resolved. Additional test networks are proposed that include low terrain/high moisture variability stations, high terrain/low moisture variability stations, and high terrain/high moisture variability stations. We will present results from further testing along with a timeline
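The ZTD-to-PWV retrieval compared in this evaluation follows a standard recipe: subtract a modeled zenith hydrostatic delay from the total delay, then scale the remaining wet delay by a dimensionless factor of roughly 0.15. A hedged sketch, using the textbook Saastamoinen hydrostatic model and Bevis-style conversion constants rather than the actual ESRL/GSD or Trimble processing:

```python
import math

# Sketch of a standard ZTD -> PWV retrieval (illustrative constants, not
# the operational systems' values).

def zhd_saastamoinen(p_hpa, lat_deg, h_m):
    """Zenith hydrostatic delay (m) from surface pressure, latitude, height."""
    return 0.0022768 * p_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg)) - 2.8e-7 * h_m
    )

def pwv_from_ztd(ztd_m, p_hpa, lat_deg, h_m, tm_k):
    """Precipitable water vapor (mm) from zenith total delay (m)."""
    zwd = ztd_m - zhd_saastamoinen(p_hpa, lat_deg, h_m)   # wet delay, m
    k2p, k3 = 0.221, 3.739e3        # refractivity constants, K/Pa and K^2/Pa
    rho_w, r_v = 1000.0, 461.5      # water density, water-vapor gas constant
    pi = 1.0e6 / (rho_w * r_v * (k2p + k3 / tm_k))  # conversion factor, ~0.15
    return pi * zwd * 1000.0        # mm

print(pwv_from_ztd(2.45, 1013.0, 40.0, 50.0, 270.0))
```

The sensitivity of the wet delay to surface pressure is one reason MET/GPS station offsets, as noted above, can push PWV out of specification while ZTD stays within it.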

  10. Proteomic Analysis of Embryogenesis and the Acquisition of Seed Dormancy in Norway Maple (Acer platanoides L.)

    PubMed Central

    Staszak, Aleksandra Maria; Pawłowski, Tomasz Andrzej

    2014-01-01

    The proteome of zygotic embryos of Acer platanoides L. was analyzed via high-resolution 2D-SDS-PAGE and MS/MS in order to: (1) identify significant physiological processes associated with embryo development; and (2) identify changes in the proteome of the embryo associated with the acquisition of seed dormancy. Seventeen spots were identified as associated with morphogenesis at 10 to 13 weeks after flowering (WAF). Thirty-three spots were associated with maturation of the embryo at 14 to 22 WAF. The greatest changes in protein abundance occurred at 22 WAF, when seeds become fully mature. Overall, the stage of morphogenesis was characterized by changes in the abundance of proteins (tubulins and actin) associated with the growth and development of the embryo. Enzymes related to energy supply were especially elevated, most likely due to the energy demand associated with rapid growth and cell division. The stage of maturation is crucial to the establishment of seed dormancy and is associated with a higher abundance of proteins involved in genetic information processing, energy and carbon metabolism and cellular and antioxidant processes. Results indicated that a glycine-rich RNA-binding protein and proteasome proteins may be directly involved in dormancy acquisition control, and future studies are warranted to verify this association. PMID:24941250

  11. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.

  12. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables for the command and control process and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  13. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  14. Data acquisition and analysis for the Fermilab Collider RunII

    SciTech Connect

    Paul L. G. Lebrun et al.

    2004-07-07

    Operating and improving the understanding of the Fermilab Accelerator Complex for the colliding beam experiments requires advanced software methods and tools. The Shot Data Acquisition and Analysis (SDA) system has been developed to fulfill this need. SDA takes a standard set of critical data at relevant stages during the complex series of beam manipulations leading to √s ≈ 2 TeV collisions. Data are stored in a relational database and served to programs and users via Web-based tools. Summary tables are systematically generated during and after a store. Written entirely in Java, SDA supports both interactive tools and application interfaces used for in-depth analysis. In this talk, we present the architecture and describe some of our analysis tools. We also present some results on recent Tevatron performance as illustrations of the capabilities of SDA.

  15. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight to overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.

  16. Technical drilling data acquisition and processing with an integrated computer system

    SciTech Connect

    Chevallier, J.J.; Quetier, F.P.; Marshall, D.W.

    1986-04-01

    Sedco Forex has developed an integrated computer system to enhance the technical performance of the company at various operational levels and to increase the understanding and knowledge of the drill crews. This paper describes the system and how it is used for recording and processing drilling data at the rig site, for associated technical analyses, and for well design, planning, and drilling performance studies at the operational centers. Some capabilities related to the statistical analysis of the company's operational records are also described, and future development of rig computing systems for drilling applications and management tasks is discussed.

  17. DIADEM--a system for the interactive data acquisition and processing in an analytical laboratory.

    PubMed

    Peters, F; Teschner, W

    1979-09-01

    A conversational program for the acquisition of experimental data in a multi-user, multi-instrument computer system is described. It assists the researcher when recording on-line data. Due to the simple structure of the dialogue, no special knowledge of computer handling is required of the experimenter. Whereas the experimental methods are versatile, a uniform concept of the dialogue and the file structure is realized. PMID:487779

  18. Hippocampal Context Processing during Acquisition of a Predictive Learning Task Is Associated with Renewal in Extinction Recall.

    PubMed

    Lissek, Silke; Glaubitz, Benjamin; Schmidt-Wilcke, Tobias; Tegenthoff, Martin

    2016-05-01

    Renewal is defined as the recovery of an extinguished response if the extinction and retrieval contexts differ. The context dependency of extinction, as demonstrated by renewal, has important implications for extinction-based therapies. Persons showing renewal (REN) exhibit higher hippocampal activation during extinction in associative learning than those without renewal (NOREN), demonstrating hippocampal context processing, and recruit ventromedial pFC in retrieval. Apart from these findings, the brain processes generating renewal remain largely unknown. Conceivably, processing differences in task-relevant brain regions that ultimately lead to renewal may occur already in the initial acquisition of associations. Therefore, in two fMRI studies, we investigated overall brain activation and hippocampal activation in REN and NOREN during acquisition of an associative learning task in response to presentation of a context alone or combined with a cue. The results of the two studies demonstrated significant activation differences between the groups: In Study 1, a support vector machine classifier correctly assigned participants' brain activation patterns to the REN and NOREN groups, respectively. In Study 2, REN and NOREN showed similar hippocampal involvement during context-only presentation, suggesting processing of novelty, whereas overall hippocampal activation to the context-cue compound, suggesting compound encoding, was higher in REN. Positive correlations between hippocampal activation and renewal level indicated more prominent hippocampal processing in REN. The results suggest that hippocampal processing of the context-cue compound, rather than of the context only, during initial learning is related to a subsequent renewal effect. Presumably, REN participants use distinct encoding strategies during acquisition of context-related tasks, which are reflected in their brain activation patterns and contribute to a renewal effect. PMID:26807840

  19. Design and demonstrate the performance of cryogenic components representative of space vehicles: Start basket liquid acquisition device performance analysis

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The objective was to design, fabricate and test an integrated cryogenic test article incorporating both fluid and thermal propellant management subsystems. A 2.2 m (87 in) diameter aluminum test tank was outfitted with multilayer insulation, helium purge system, low-conductive tank supports, thermodynamic vent system, liquid acquisition device and immersed outflow pump. Tests and analysis performed on the start basket liquid acquisition device and studies of the liquid retention characteristics of fine mesh screens are discussed.

  20. Instrumenting the Intelligence Analysis Process

    SciTech Connect

    Hampson, Ernest; Cowley, Paula J.

    2005-05-02

    The Advanced Research and Development Activity initiated the Novel Intelligence from Massive Data (NIMD) program to develop advanced analytic technologies and methodologies. In order to support this objective, researchers and developers need to understand what analysts do and how they do it. In the past, this knowledge generally was acquired through subjective feedback from analysts. NIMD established the innovative Glass Box Analysis (GBA) Project to instrument a live intelligence mission and unobtrusively capture and objectively study the analysis process. Instrumenting the analysis process requires tailor-made software hooks that grab data from a myriad of disparate application operations and feed it into a complex relational database and hierarchical file store to collect, store, retrieve, and distribute analytic data in a manner that maximizes researchers' understanding. A key to success is determining the correct data to collect and aggregating low-level data into meaningful analytic events. This paper will examine how the GBA team solved some of these challenges, continues to address others, and supports a growing user community in establishing their own GBA environments and/or studying the data generated by GBA analysts working in the Glass Box.

  1. Computerized data acquisition and analysis for measuring thermal diffusivity. [in thermoelectric space applications materials

    NASA Technical Reports Server (NTRS)

    Chmielewski, A.; Wood, C.; Vandersande, J.

    1985-01-01

    JPL has been leading a concentrated effort to develop improved thermoelectric materials for space applications. Thermoelectric generators are an attractive source of electrical energy for space power because of their lack of moving parts and slow degradation of performance. A thermoelectric material is characterized by its Seebeck coefficient, electrical resistivity, and thermal conductivity. Measuring thermal conductivity at high temperature is experimentally very difficult; however, it can be calculated from the specific heat and thermal diffusivity, which are easier to measure at high temperatures, especially using the flash method. Data acquisition and analysis for this experiment were automated at JPL using inexpensive microcomputer equipment. This approach is superior to tedious and less accurate manual analysis of data, and it is preferable to previously developed systems utilizing expensive minicomputers or mainframes.
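The reduction described above is compact enough to sketch. Assuming the classic Parker flash-method relation (α = 0.1388 L²/t½, with L the sample thickness and t½ the half-rise time of the rear-face temperature), conductivity then follows from k = α·ρ·cp; all numeric values below are illustrative, not measurements from this work.

```python
# Sketch of a flash-method data reduction (illustrative, not JPL's code).

def flash_diffusivity(thickness_m, t_half_s):
    """Thermal diffusivity (m^2/s): alpha = 0.1388 * L**2 / t_half."""
    return 0.1388 * thickness_m ** 2 / t_half_s

def conductivity(alpha, density, specific_heat):
    """Thermal conductivity (W/m/K) from diffusivity, density (kg/m^3), cp (J/kg/K)."""
    return alpha * density * specific_heat

alpha = flash_diffusivity(2.0e-3, 0.050)   # 2 mm sample, 50 ms half-rise time
k = conductivity(alpha, 5000.0, 600.0)     # example material properties
print(alpha, k)
```

Automating exactly this arithmetic, plus the curve fit that extracts t½ from the digitized rear-face signal, is what the microcomputer system replaces manual analysis with.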

  2. Reliability analysis for the facility data acquisition interface system upgrade at TA-55

    SciTech Connect

    Turner, W.J.; Pope, N.G.; Brown, R.E.

    1995-05-01

    Because replacement parts for the existing facility data acquisition interface system at TA-55 have become scarce and are no longer being manufactured, reliability studies were conducted to assess various possible replacement systems. A new control system, based on Allen-Bradley Programmable Logic Controllers (PLCs), was found to have a likely reliability 10 times that of the present system, if the existing Continuous Air Monitors (CAMs) were used. Replacement of the old CAMs with new CAMs will result in even greater reliability as they are gradually phased in. The new PLC-based system would provide hot standby processors, redundant communications paths, and redundant power supplies, and would be expandable and easily maintained, as well as much more reliable. TA-55 is the Plutonium Processing Facility, which processes and recovers Pu-239 from scrap materials.

  3. Radar data processing and analysis

    NASA Technical Reports Server (NTRS)

    Ausherman, D.; Larson, R.; Liskow, C.

    1976-01-01

    Digitized four-channel radar images corresponding to particular areas from the Phoenix and Huntington test sites were generated in conjunction with prior experiments performed to collect X- and L-band synthetic aperture radar imagery of these two areas. The methods for generating this imagery are documented. A secondary objective was the investigation of digital processing techniques for extraction of information from the multiband radar image data. Following the digitization, the remaining resources permitted a preliminary machine analysis to be performed on portions of the radar image data. The results, although necessarily limited, are reported.

  4. Structural analysis of vibroacoustical processes

    NASA Technical Reports Server (NTRS)

    Gromov, A. P.; Myasnikov, L. L.; Myasnikova, Y. N.; Finagin, B. A.

    1973-01-01

    The method of automatic identification of acoustical signals by means of segmentation was used to investigate noises and vibrations in machines and mechanisms for cybernetic diagnostics. The structural analysis consists of presenting a noise or vibroacoustic signal as a sequence of segments, determined by time quantization, in which each segment is characterized by specific spectral characteristics. The structural spectrum is plotted as a histogram of the segments, i.e., as the probability density of occurrence of each segment type. It is assumed that the conditions of ergodic processes are maintained.

  5. DigiFract: A software and data model implementation for flexible acquisition and processing of fracture data from outcrops

    NASA Astrophysics Data System (ADS)

    Hardebol, N. J.; Bertotti, G.

    2013-04-01

    This paper presents the development and use of our new DigiFract software, designed for acquiring fracture data from outcrops more efficiently and more completely than is possible with other methods. Fracture surveys often aim at measuring spatial information (such as spacing) directly in the field. Instead, DigiFract focuses on collecting geometries and attributes and derives spatial information through subsequent analyses. Our primary development goal was to support field acquisition in a systematic digital format optimized for a varied range of (spatial) analyses. DigiFract is developed using the programming interface of the Quantum Geographic Information System (GIS), with versatile functionality for spatial raster and vector data handling. Among other features, this includes spatial referencing of outcrop photos and tools for digitizing geometries and assigning attribute information through a graphical user interface. While a GIS typically operates in map view, DigiFract collects features on a surface of arbitrary orientation in 3D space. This surface is overlain with an outcrop photo and serves as the reference frame for digitizing geologic features. Data is managed through a data model and stored in shapefiles or in a spatial database system. Fracture attributes, such as spacing or length, are intrinsic information of the digitized geometry and become explicit through follow-up data processing. Orientation statistics and scan-line or scan-window analyses can be performed from the graphical user interface or obtained through flexible Python scripts that directly access the fractdatamodel and analysisLib core modules of DigiFract. This workflow has been applied in various studies and has enabled faster collection of larger and more accurate fracture datasets. The studies delivered a better characterization of fractured reservoir analogues in terms of fracture orientation and intensity distributions. Furthermore, the data organisation and analyses provided more
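The scan-line post-processing mentioned above can be illustrated with a minimal sketch: given the positions where digitized fractures intersect a scan-line, spacing and linear intensity (P10) fall out directly. This is an illustration only; the actual DigiFract modules (fractdatamodel, analysisLib) are more general.

```python
# Minimal scan-line analysis sketch (illustrative, not DigiFract code):
# derive spacing statistics and fracture intensity from intersection
# positions (m) along a scan-line of known length.

def scanline_stats(positions, line_length):
    xs = sorted(positions)
    spacings = [b - a for a, b in zip(xs, xs[1:])]     # gaps between fractures
    mean_spacing = sum(spacings) / len(spacings) if spacings else None
    intensity = len(xs) / line_length                  # fractures per meter (P10)
    return mean_spacing, intensity

mean_spacing, p10 = scanline_stats([0.4, 1.1, 1.9, 2.6, 3.8], 5.0)
print(mean_spacing, p10)
```

This is the sense in which spacing is "intrinsic" to the digitized geometry: it is never measured in the field, only derived afterwards.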

  6. Hormonal Contraception and the Risk of HIV Acquisition: An Individual Participant Data Meta-analysis

    PubMed Central

    Morrison, Charles S.; Chen, Pai-Lien; Kwok, Cynthia; Baeten, Jared M.; Brown, Joelle; Crook, Angela M.; Van Damme, Lut; Delany-Moretlwe, Sinead; Francis, Suzanna C.; Friedland, Barbara A.; Hayes, Richard J.; Heffron, Renee; Kapiga, Saidi; Karim, Quarraisha Abdool; Karpoff, Stephanie; Kaul, Rupert; McClelland, R. Scott; McCormack, Sheena; McGrath, Nuala; Myer, Landon; Rees, Helen; van der Straten, Ariane; Watson-Jones, Deborah; van de Wijgert, Janneke H. H. M.; Stalter, Randy; Low, Nicola

    2015-01-01

    Background: Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with women not using HC. Methods and Findings: Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15–49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies, comprising 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24–1.83) for DMPA use, 1.24 (95% CI 0.84–1.82) for NET-EN use, and 1.03 (95% CI 0.88–1.20) for COC use. Between-study heterogeneity was mild (I² < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23–1.67) and NET-EN use (aHR 1.32, 95% CI 1.08–1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99–1.50; for NET-EN use 0.67, 95% CI 0.47–0.96; and for COC use 0.91, 95% CI 0.73–1.41) compared to those at higher risk of bias (p for interaction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC–HIV relationship. Conclusions: This IPD meta-analysis found no evidence that COC or NET-EN use increases women's risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe
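The second stage of the two-stage random effects approach used above can be sketched compactly: study-specific log hazard ratios are pooled by inverse-variance weighting with a between-study variance estimate. The sketch below assumes the common DerSimonian-Laird estimator, and the input values are illustrative, not the study's data.

```python
import math

# Hedged sketch of second-stage random-effects pooling of log hazard ratios
# (DerSimonian-Laird between-study variance; illustrative inputs).

def pool_log_hr(log_hrs, variances):
    w = [1.0 / v for v in variances]                        # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_hrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_hrs))  # Cochran's Q
    df = len(log_hrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                           # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]            # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_hrs)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    # back-transform to a hazard ratio with a 95% CI
    return math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)

hr, lo, hi = pool_log_hr([0.41, 0.34, 0.47], [0.02, 0.03, 0.04])
print(hr, lo, hi)
```

In an IPD analysis the per-study log hazard ratios and variances would come from Cox models fit to each study's participant-level data, adjusted for the covariates listed above.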

  7. Development and application of a model for the analysis of trades between space launch system operations and acquisition costs

    NASA Astrophysics Data System (ADS)

    Nix, Michael B.

    2005-12-01

    Early design decisions in the development of space launch systems determine the costs to acquire and operate those systems. Some sources indicate that as much as 90% of life cycle costs are fixed by the end of the critical design review phase. System characteristics determined by these early decisions are major factors in the acquisition cost of flight hardware elements and facilities, and they influence operations costs through the amount of maintenance and support labor required to sustain system function. Operations costs also depend on post-development management decisions regarding how much labor will be deployed to meet requirements of market demand and ownership profit. The ability to perform early trade-offs between these costs is vital to the development of systems that have the necessary capacity to provide service and are profitable to operate. An Excel-based prototype model was developed for making early analyses of trade-offs between the costs to operate a space launch system and to acquire the assets necessary to meet a given set of operational requirements. The model, integrating input from existing models and adding missing capability, allows the user to make such trade-offs across a range of operations concepts (required flight rates, staffing levels, shifts per workday, workdays per week and per year, unreliability, wearout, and depot maintenance) and the number, type, and capability of assets (flight hardware elements, processing and supporting facilities, and infrastructure). The costs and capabilities of hypothetical launch systems can be modeled as a function of interrelated turnaround times and labor resource levels, and of asset loss and retirement. The number of flight components and facilities required can be calculated and the operations and acquisition costs compared for a specified scenario. Findings, based on the analysis of a hypothetical two-stage-to-orbit, reusable, unmanned launch system, indicate that the model is suitable for the
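One core trade captured by such a model, fleet size as a function of flight rate, turnaround time, and attrition, can be illustrated with a toy Little's-law style sizing. This is a hypothetical illustration, not the prototype model's actual logic, and all parameter values are made up.

```python
import math

# Toy fleet-sizing sketch: vehicles simultaneously in the turnaround
# pipeline equal flight rate times turnaround time, plus expected
# attrition spares per year (illustrative, not the Excel model).

def vehicles_required(flights_per_year, turnaround_days, loss_probability=0.0):
    in_flow = flights_per_year * turnaround_days / 365.0   # vehicles tied up
    expected_losses = flights_per_year * loss_probability  # attrition per year
    return math.ceil(in_flow + expected_losses)

print(vehicles_required(12, 60))           # fast turnaround
print(vehicles_required(12, 120, 0.01))    # slower turnaround plus attrition
```

Even this toy version shows why turnaround time, a function of maintenance labor, is a first-order driver of acquisition cost: doubling it here forces extra airframes to sustain the same flight rate.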

  8. Resource Prospector Instrumentation for Lunar Volatiles Prospecting, Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Captain, J.; Elphic, R.; Colaprete, A.; Zacny, K.; Paz, A.

    2016-01-01

    Data gathered from lunar missions within the last two decades have significantly enhanced our understanding of the volatile resources available on the lunar surface, specifically in the polar regions. Several orbiting missions such as Clementine and Lunar Prospector have suggested the presence of volatile ices and enhanced hydrogen concentrations in the permanently shadowed regions of the moon. The Lunar Crater Observation and Sensing Satellite (LCROSS) mission was the first to provide a direct measurement of water ice in a permanently shadowed region. These missions, together with other orbiting assets, have laid the groundwork for the next step in the exploration of the lunar surface: providing ground-truth data on the volatiles by mapping their distribution and processing lunar regolith for resource extraction. This next step is the robotic mission Resource Prospector (RP). Resource Prospector is a lunar mission to investigate 'strategic knowledge gaps' (SKGs) for in-situ resource utilization (ISRU). The mission is proposed to land in the lunar south pole region near a permanently shadowed crater. The landing site will be chosen by the science team, with input from the broader international community, to be near traversable terrain with a high potential of containing elevated concentrations of volatiles such as water, while maximizing mission duration. A rover will host the Regolith & Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) payload for resource mapping and processing. The science instruments on the payload include a 1-meter drill, a neutron spectrometer, a near infrared spectrometer, an operations camera, and a reactor with a gas chromatograph-mass spectrometer for volatile analysis. After the RP lander safely delivers the rover to the lunar surface, the science team will guide the rover team on the first traverse plan. The neutron spectrometer (NS) and near infrared (NIR) spectrometer instruments will be used as prospecting tools to guide

  9. Hyperspectral image acquisition and analysis of cultured bacteria for the discrimination of urinary tract infections.

    PubMed

    Turra, Giovanni; Conti, Nicola; Signoroni, Alberto

    2015-08-01

    Because of their widespread diffusion and impact on human health, early identification of pathogens responsible for urinary tract infections (UTI) is one of the main challenges of clinical microbiology. Currently, culturing bacteria on chromogenic plates is widely adopted for UTI detection because of its readily interpretable visual outcomes. However, the search for alternative solutions can be highly attractive, especially in the rapidly developing context of bacteriology laboratory automation and digitization, as long as they can improve cost-effectiveness or allow early discrimination. In this work, we consider and develop hyperspectral image acquisition and analysis solutions to verify the feasibility of a "virtual chromogenic agar" approach, based on the acquisition of spectral signatures from bacterial colonies growing on blood agar plates and their interpretation by means of machine learning solutions. We implemented and tested two classification approaches (PCA+SVM and RSIMCA) that showed good capability to discriminate among five selected UTI bacteria. For its better performance, robustness and suitability for an expanding set of pathogens, we conclude that the RSIMCA-based approach warrants further investigation from a clinical-usage perspective. PMID:26736373
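
    The PCA-based discrimination stage can be illustrated with a small numpy sketch on synthetic spectra. Everything here — the data dimensions, the noise model, and a nearest-centroid classifier standing in for the paper's SVM and RSIMCA stages — is a simplified assumption, not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectral signatures": 5 bacterial classes, 40 colonies each,
# 64 spectral bands (a stand-in for real hyperspectral measurements).
n_classes, n_per, n_bands = 5, 40, 64
means = rng.normal(size=(n_classes, n_bands))
X = np.vstack([m + 0.3 * rng.normal(size=(n_per, n_bands)) for m in means])
y = np.repeat(np.arange(n_classes), n_per)

# PCA by SVD of the mean-centred data; keep 10 principal components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:10].T

# Nearest-centroid classifier in PCA space (SVM stage simplified)
centroids = np.array([scores[y == c].mean(axis=0) for c in range(n_classes)])
pred = np.argmin(((scores[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()   # near-perfect on this well-separated toy data
```

    With real colony spectra the classes overlap far more, which is why the paper compares a discriminative (SVM) and a class-modelling (RSIMCA) approach.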

  10. Metabolome analysis of Arabidopsis thaliana roots identifies a key metabolic pathway for iron acquisition.

    PubMed

    Schmidt, Holger; Günther, Carmen; Weber, Michael; Spörlein, Cornelia; Loscher, Sebastian; Böttcher, Christoph; Schobert, Rainer; Clemens, Stephan

    2014-01-01

    Fe deficiency compromises both human health and plant productivity. Thus, it is important to understand plant Fe acquisition strategies for the development of crop plants which are more Fe-efficient under Fe-limited conditions, such as alkaline soils, and have higher Fe density in their edible tissues. Root secretion of phenolic compounds has long been hypothesized to be a component of the reduction strategy of Fe acquisition in non-graminaceous plants. We therefore subjected roots of Arabidopsis thaliana plants grown under Fe-replete and Fe-deplete conditions to comprehensive metabolome analysis by gas chromatography-mass spectrometry and ultra-pressure liquid chromatography electrospray ionization quadrupole time-of-flight mass spectrometry. Scopoletin and other coumarins were found among the metabolites showing the strongest response to two different Fe-limited conditions, the cultivation in Fe-free medium and in medium with an alkaline pH. A coumarin biosynthesis mutant defective in ortho-hydroxylation of cinnamic acids was unable to grow on alkaline soil in the absence of Fe fertilization. Co-cultivation with wild-type plants partially rescued the Fe deficiency phenotype indicating a contribution of extracellular coumarins to Fe solubilization. Indeed, coumarins were detected in root exudates of wild-type plants. Direct infusion mass spectrometry as well as UV/vis spectroscopy indicated that coumarins are acting both as reductants of Fe(III) and as ligands of Fe(II). PMID:25058345

  11. Metabolome Analysis of Arabidopsis thaliana Roots Identifies a Key Metabolic Pathway for Iron Acquisition

    PubMed Central

    Schmidt, Holger; Günther, Carmen; Weber, Michael; Spörlein, Cornelia; Loscher, Sebastian; Böttcher, Christoph; Schobert, Rainer; Clemens, Stephan

    2014-01-01

    Fe deficiency compromises both human health and plant productivity. Thus, it is important to understand plant Fe acquisition strategies for the development of crop plants which are more Fe-efficient under Fe-limited conditions, such as alkaline soils, and have higher Fe density in their edible tissues. Root secretion of phenolic compounds has long been hypothesized to be a component of the reduction strategy of Fe acquisition in non-graminaceous plants. We therefore subjected roots of Arabidopsis thaliana plants grown under Fe-replete and Fe-deplete conditions to comprehensive metabolome analysis by gas chromatography-mass spectrometry and ultra-pressure liquid chromatography electrospray ionization quadrupole time-of-flight mass spectrometry. Scopoletin and other coumarins were found among the metabolites showing the strongest response to two different Fe-limited conditions, the cultivation in Fe-free medium and in medium with an alkaline pH. A coumarin biosynthesis mutant defective in ortho-hydroxylation of cinnamic acids was unable to grow on alkaline soil in the absence of Fe fertilization. Co-cultivation with wild-type plants partially rescued the Fe deficiency phenotype indicating a contribution of extracellular coumarins to Fe solubilization. Indeed, coumarins were detected in root exudates of wild-type plants. Direct infusion mass spectrometry as well as UV/vis spectroscopy indicated that coumarins are acting both as reductants of Fe(III) and as ligands of Fe(II). PMID:25058345

  12. SIG. Signal Processing, Analysis, & Display

    SciTech Connect

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.
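
    The kind of operation SIG chains together — simulate a sine, apply a Hamming or Kaiser-Bessel window, transform, inspect the spectrum — can be mirrored in a short numpy sketch. This is an analogous computation, not SIG code.

```python
import numpy as np

def windowed_spectrum(x, fs, window="hamming"):
    """Magnitude spectrum of a time signal after a Hamming or
    Kaiser-Bessel style window (illustrative only)."""
    n = len(x)
    w = np.hamming(n) if window == "hamming" else np.kaiser(n, 8.6)
    X = np.fft.rfft(x * w)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, np.abs(X)

# Simulated 50 Hz sine (SIG's simulation operations include sine generation)
fs, n = 1000.0, 1024
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 50.0 * t)

freqs, mag = windowed_spectrum(x, fs)
peak_hz = freqs[np.argmax(mag)]   # within one FFT bin of 50 Hz
```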

  13. SIG. Signal Processing, Analysis, & Display

    SciTech Connect

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG: a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a `repeat` sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  14. SIG. Signal Processing, Analysis, & Display

    SciTech Connect

    Hernandez, J.; Lager, D.; Azevedo, S.

    1992-01-22

    SIG is a general-purpose signal processing, analysis, and display program. Its main purpose is to perform manipulations on time- and frequency-domain signals. However, it has been designed to ultimately accommodate other representations for data such as multiplexed signals and complex matrices. Two user interfaces are provided in SIG - a menu mode for the unfamiliar user and a command mode for more experienced users. In both modes errors are detected as early as possible and are indicated by friendly, meaningful messages. An on-line HELP package is also included. A variety of operations can be performed on time- and frequency-domain signals including operations on the samples of a signal, operations on the entire signal, and operations on two or more signals. Signal processing operations that can be performed are digital filtering (median, Bessel, Butterworth, and Chebychev), ensemble average, resample, auto and cross spectral density, transfer function and impulse response, trend removal, convolution, Fourier transform and inverse, window functions (Hamming, Kaiser-Bessel), simulation (ramp, sine, pulsetrain, random), and read/write signals. User-definable signal processing algorithms are also featured. SIG has many options including multiple commands per line, command files with arguments, commenting lines, defining commands, and automatic execution for each item in a repeat sequence. Graphical operations on signals and spectra include: x-y plots of time signals; real, imaginary, magnitude, and phase plots of spectra; scaling of spectra for continuous or discrete domain; cursor zoom; families of curves; and multiple viewports.

  15. Automated ground data acquisition and processing system for calibration and performance assessment of the EO-1 Advanced Land Imager

    NASA Astrophysics Data System (ADS)

    Viggh, Herbert E. M.; Mendenhall, Jeffrey A.; Sayer, Ronald W.; Stuart, J. S.; Gibbs, Margaret D.

    1999-09-01

    The calibration and performance assessment of the Earth Observing-1 (EO-1) Advanced Land Imager (ALI) required a ground data system for acquiring and processing ALI data. In order to meet tight schedule and budget requirements, an automated system was developed that could be run by a single operator. This paper describes the overall system and the individual Electrical Ground Support Equipment (EGSE) and computer components used. The ALI Calibration Control Node (ACCN) serves as a test executive with a single graphical user interface to the system, controlling calibration equipment and issuing data acquisition and processing requests to the other EGSE and computers. EGSE1, a custom data acquisition system, collects ALI science data and also passes ALI commanding and housekeeping telemetry collection requests to EGSE2 and EGSE3, which are implemented on an ASIST workstation. The performance assessment machine stores and processes collected ALI data, automatically displaying quick-look processing results. The custom communications protocol developed to interface these various machines and to automate their interactions is described, including the various modes of operation needed to support spatial, radiometric, spectral, and functional calibration and performance assessment of the ALI.

  16. Fast nearly ML estimation of Doppler frequency in GNSS signal acquisition process.

    PubMed

    Tang, Xinhua; Falletti, Emanuela; Lo Presti, Letizia

    2013-01-01

    It is known that signal acquisition in the Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on a simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario with no noise or other nuisances, the frequency estimation error can theoretically be reduced to zero. In the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB), which is derived as a benchmark. For comparison, a least-squares (LS) method is proposed. It is shown that the proposed solution achieves the same performance as LS but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when the signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both the time and frequency domains. PMID:23628761
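
    The idea of refining a coarse-grid Doppler estimate from the shape of the CAF peak can be sketched generically. The sketch below uses parabolic interpolation of the peak and its two neighbours — a standard refinement, not the paper's closed-form CAF-based algorithm — on a toy complex exponential with made-up sampling parameters.

```python
import numpy as np

fs, n = 32000.0, 32            # ~1 ms coherent window (toy values)
t = np.arange(n) / fs
f_true = 1234.6                # Doppler shift to recover (Hz)
x = np.exp(2j * np.pi * f_true * t)

# Coarse search: correlate against carriers on a 500 Hz grid (half a bin)
step = 500.0
grid = np.arange(0.0, 5000.0, step)
caf = np.abs([np.sum(x * np.exp(-2j * np.pi * f * t)) for f in grid])
k = int(np.argmax(caf))        # coarse estimate: grid[k]

# Refine with a parabola through the CAF peak and its neighbours
a, b, c = caf[k - 1], caf[k], caf[k + 1]
delta = 0.5 * (a - c) / (a - 2 * b + c)
f_hat = grid[k] + delta * step # error drops from ~235 Hz to a few Hz
```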

  17. Fast Nearly ML Estimation of Doppler Frequency in GNSS Signal Acquisition Process

    PubMed Central

    Tang, Xinhua; Falletti, Emanuela; Presti, Letizia Lo

    2013-01-01

    It is known that signal acquisition in the Global Navigation Satellite System (GNSS) field provides a rough maximum-likelihood (ML) estimate based on a peak search in a two-dimensional grid. In this paper, the theoretical mathematical expression of the cross-ambiguity function (CAF) is exploited to analyze the grid and improve the accuracy of the frequency estimate. Based on a simple equation derived from this mathematical expression of the CAF, a family of novel algorithms is proposed to refine the Doppler frequency estimate with respect to that provided by a conventional acquisition method. In an ideal scenario with no noise or other nuisances, the frequency estimation error can theoretically be reduced to zero. In the presence of noise, the new algorithm almost reaches the Cramer-Rao Lower Bound (CRLB), which is derived as a benchmark. For comparison, a least-squares (LS) method is proposed. It is shown that the proposed solution achieves the same performance as LS but requires a dramatically reduced computational burden. An averaging method is proposed to mitigate the influence of noise, especially when the signal-to-noise ratio (SNR) is low. Finally, the influence of the grid resolution in the search space is analyzed in both the time and frequency domains. PMID:23628761

  18. The rate of acquisition of formal operational schemata in adolescence: A secondary analysis

    NASA Astrophysics Data System (ADS)

    Eckstein, Shulamith G.; Shemesh, Michal

    A theoretical model of cognitive development is applied to the study of the acquisition of formal operational schemata by adolescents. The model predicts that the proportion of adolescents who have not yet acquired the ability to perform a specific Piagetian-like task is an exponentially decreasing function of age. The model has been used to analyze the data of two large-scale studies performed in the United States and in Israel. The functional dependence upon age was found to be the same in both countries for tasks which are used to assess the following formal operations: proportional reasoning, probabilistic reasoning, correlations, and combinatorial analysis. A different functional dependence was found for tasks assessing conservation, control of variables, and propositional logic. These results support the unity hypothesis of cognitive development, that is, the hypothesis that the various schemata of formal thought appear simultaneously.
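
    The exponential model can be made concrete with a short fit: if p(t) = exp(-k(t - t0)) is the proportion not yet mastering a task at age t, then ln p is linear in t and both parameters fall out of a least-squares line. The ages, rate constant, and onset age below are invented for illustration; with noise-free data the fit recovers them exactly.

```python
import numpy as np

# Model: p(t) = exp(-k * (t - t0)) for t >= t0, so ln p = -k*t + k*t0
ages = np.array([12.0, 13.0, 14.0, 15.0, 16.0])
k_true, t0_true = 0.5, 11.0
p = np.exp(-k_true * (ages - t0_true))        # synthetic, noise-free data

# Log-linear least squares
A = np.vstack([ages, np.ones_like(ages)]).T
slope, intercept = np.linalg.lstsq(A, np.log(p), rcond=None)[0]
k_hat = -slope                 # decay rate
t0_hat = intercept / k_hat     # age at which acquisition begins
```

    The unity hypothesis then amounts to the different schemata sharing the same fitted k and t0.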

  19. Morphological Awareness in Literacy Acquisition of Chinese Second Graders: A Path Analysis.

    PubMed

    Zhang, Haomin

    2016-02-01

    The present study tested a path diagram regarding the contribution of morphological awareness (MA) to early literacy acquisition among Chinese-speaking second graders ([Formula: see text]). Three facets of MA were addressed, namely derivational awareness, compound awareness and compound structure awareness. The model aimed to test a theory of causal order among measures of MA and literacy outcomes. Drawing upon multivariate path analysis, direct and indirect effects of MA were analyzed to identify their role in literacy performance among young children. Results revealed that all three facets of MA made significant contributions to lexical inference ability. In addition, compound awareness showed a unique and significant contribution to vocabulary knowledge. It was also observed that lexical inference ability had a mediating effect predictive of both vocabulary knowledge and reading comprehension. Moreover, vocabulary knowledge mediated the effect of MA on reading comprehension. However, no significant contribution of MA to reading comprehension was found after controlling for lexical inference ability and vocabulary knowledge. PMID:25308872
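
    The mediation logic the path model tests — an indirect effect equal to the product of the component path coefficients — can be sketched with synthetic standardized data. The variable names, path values, and sample size are assumptions for illustration, not the study's data or model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Synthetic mean-zero variables mimicking the path structure:
# MA -> lexical inference -> vocabulary
ma = rng.normal(size=n)                          # morphological awareness
lex = 0.6 * ma + rng.normal(scale=0.8, size=n)   # lexical inference
voc = 0.5 * lex + rng.normal(scale=0.9, size=n)  # vocabulary knowledge

def beta(x, y):
    """OLS slope of y on x (single predictor; variables are mean-centred)."""
    xc = x - x.mean()
    return float(np.dot(xc, y - y.mean()) / np.dot(xc, xc))

a = beta(ma, lex)      # MA -> lexical inference path
b = beta(lex, voc)     # lexical inference -> vocabulary path
indirect = a * b       # mediated effect of MA on vocabulary
```

    A full path analysis estimates all paths simultaneously and tests the indirect effect's significance, but the product-of-paths idea is the same.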

  20. The use of an optical data acquisition system for bladed disk vibration analysis

    NASA Technical Reports Server (NTRS)

    Lawrence, C.; Meyn, E. H.

    1984-01-01

    A new concept in instrumentation was developed by engineers at NASA Lewis Research Center to collect vibration data from multi-bladed rotors. This new concept, known as the optical data acquisition system, uses optical transducers to measure bladed tip delections by reflection light beams off the tips of the blades as they pass in front of the optical transducer. By using an array of transducers around the perimeter of the rotor, detailed vibration signals can be obtained. In this study, resonant frequencies and mode shapes were determined for a 56 bladed rotor using the optical system. Frequency data from the optical system was also compared to data obtained from strain gauge measurements and finite element analysis and was found to be in good agreement.

  1. Radiation acquisition and RBF neural network analysis on BOF end-point control

    NASA Astrophysics Data System (ADS)

    Zhao, Qi; Wen, Hong-yuan; Zhou, Mu-chun; Chen, Yan-ru

    2008-12-01

    Current Basic Oxygen Furnace (BOF) steelmaking end-point control technology has several shortcomings. A new BOF end-point control model was designed, based on the characteristics of the carbon-oxygen reaction in the BOF steelmaking process. The image capture and transformation system was built with the Video for Windows (VFW) library, a video software development package from Microsoft. In this paper, a Radial Basis Function (RBF) neural network model was established using the real-time acquisition information. The input parameters can be acquired easily online, and the output parameter is the end-point time, which can be conveniently compared with the actual value. The experimental results show that the prediction is accurate and that the model works well in the adverse steelmaking environment.
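
    A minimal RBF network of the kind used for such prediction tasks — a Gaussian hidden layer with fixed random centers and a linear output layer solved by least squares — can be sketched in numpy. The toy data and hyperparameters below are assumptions; the paper's actual inputs are features acquired from the furnace in real time.

```python
import numpy as np

rng = np.random.default_rng(7)

def rbf_features(X, centers, width):
    """Gaussian radial basis activations for each input/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Toy regression: predict a scalar "end-point time" from 2 process features
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1]

centers = rng.uniform(-1, 1, size=(25, 2))    # fixed random centers
H = rbf_features(X, centers, width=0.4)
w, *_ = np.linalg.lstsq(H, y, rcond=None)     # linear output weights

pred = H @ w
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))   # small training error
```

    Training the output layer by least squares is what makes RBF networks fast to fit compared with backpropagation through all weights.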

  2. Computer programs for the acquisition and analysis of eddy-current array probe data

    SciTech Connect

    Pate, J.R.; Dodd, C.V.

    1996-07-01

    The objective of the Improved Eddy-Current ISI (in-service inspection) for Steam Generators Tubing program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques for ISI of new, used, and repaired steam generator tubes; to improve defect detection, classification and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheet, tube supports, and copper and sludge deposits, even when defect types and other variables occur in combination; and to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report documents computer programs that were developed for the acquisition of eddy-current data from specially designed 16-coil array probes. Complete code as well as instructions for use are provided.

  3. A strong-motion network in Northern Italy (RAIS): data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Augliera, Paolo; D'Alema, Ezio; Marzorati, Simone; Massa, Marco

    2010-05-01

    The need for a dense strong-motion network in Northern Italy became evident from the lack of available data after the 24 November 2004, Ml 5.2, Salò earthquake. Since 2006 many efforts have been made by the INGV (Italian National Institute for Geophysics and Volcanology), department of Milano-Pavia (hereinafter INGV MI-PV), to improve the strong-motion monitoring of Northern Italy. At the end of 2007, the RAIS (Strong-Motion Network in Northern Italy) included 19 stations equipped with Kinemetrics Episensor FBA ES-T sensors coupled with 5 20-bit Lennartz Mars88/MC and 14 24-bit Reftek 130-01 seismic recorders. In this step, we achieved the goal of reducing the average inter-distance between strong-motion stations installed in the area under study from about 40 km to 15 km. In this period a GSM-modem connection between the INGV MI-PV acquisition center and the remote stations was used. Starting in 2008, in order to assure real-time recording and to integrate RAIS data into the calculation of the Italian ground-shaking maps, the main activity was devoted to updating the data acquisition of the RAIS strong-motion network. Moreover, a phase of replacing the original recorders with 24-bit GAIA2 systems (produced by the INGV-CNT laboratory, Rome) has begun. Today 11 of the 22 stations are already equipped with GAIA2, and their original GSM-modem acquisition systems have been replaced with real-time connections based on TCP/IP or Wi-Fi links. All real-time stations store data in the MiniSEED format. Management and data exchange are assured by the SEED-Link and Earthworm packages. Metadata are disseminated through the website, where the computed strong-motion parameters, together with the amplification functions, are available for each recording station and each recorded event. The waveforms for earthquakes with local magnitude higher than 3.0 are collected in the ITalian ACcelerometric Archive (http://itaca.mi.ingv.it).

  4. Financial analysis of technology acquisition using fractionated lasers as a model.

    PubMed

    Jutkowitz, Eric; Carniol, Paul J; Carniol, Alan R

    2010-08-01

    Ablative fractional lasers are among the most advanced and costly devices on the market. Yet, there is a dearth of published literature on the cost and potential return on investment (ROI) of such devices. The objective of this study was to provide a methodological framework for physicians to evaluate ROI. To facilitate this analysis, we conducted a case study on the potential ROI of eight ablative fractional lasers. In the base case analysis, a 5-year lease and a 3-year lease were assumed as the purchase option with a $0 down payment and 3-month payment deferral. In addition to lease payments, service contracts, labor cost, and disposables were included in the total cost estimate. Revenue was estimated as price per procedure multiplied by total number of procedures in a year. Sensitivity analyses were performed to account for variability in model assumptions. Based on the assumptions of the model, all lasers had higher ROI under the 5-year lease agreement compared with that for the 3-year lease agreement. When comparing results between lasers, those with lower operating and purchase cost delivered a higher ROI. Sensitivity analysis indicates the model is most sensitive to purchase method. If physicians opt to purchase the device rather than lease, they can significantly enhance ROI. ROI analysis is an important tool for physicians who are considering making an expensive device acquisition. However, physicians should not rely solely on ROI and must also consider the clinical benefits of a laser. PMID:20665406
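
    The ROI framework described — revenue as price times procedure volume, set against lease, service, labor, and disposable costs — reduces to simple arithmetic. All figures and parameter names below are hypothetical, not the paper's data; the published model additionally handles down payments, payment deferrals, and sensitivity runs.

```python
def annual_roi(price_per_procedure, procedures_per_year,
               lease_payment_per_year, service_contract, labor,
               disposables_per_procedure):
    """First-year return on investment: (revenue - cost) / cost."""
    revenue = price_per_procedure * procedures_per_year
    cost = (lease_payment_per_year + service_contract + labor
            + disposables_per_procedure * procedures_per_year)
    return (revenue - cost) / cost

# Hypothetical ablative fractional laser under a 5-year lease
roi = annual_roi(price_per_procedure=1000, procedures_per_year=150,
                 lease_payment_per_year=30000, service_contract=8000,
                 labor=20000, disposables_per_procedure=100)
```

    Sweeping the inputs (e.g. procedures per year, or lease vs. outright purchase) reproduces the sensitivity analysis the abstract describes.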

  5. EARLY SYNTACTIC ACQUISITION.

    ERIC Educational Resources Information Center

    KELLEY, K.L.

    THIS PAPER IS A STUDY OF A CHILD'S EARLIEST PRETRANSFORMATIONAL LANGUAGE ACQUISITION PROCESSES. A MODEL IS CONSTRUCTED BASED ON THE ASSUMPTIONS (1) THAT SYNTACTIC ACQUISITION OCCURS THROUGH THE TESTING OF HYPOTHESES REFLECTING THE INITIAL STRUCTURE OF THE ACQUISITION MECHANISM AND THE LANGUAGE DATA TO WHICH THE CHILD IS EXPOSED, AND (2) THAT…

  6. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    NASA Astrophysics Data System (ADS)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

    The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  7. The origins of age of acquisition and typicality effects: Semantic processing in aphasia and the ageing brain.

    PubMed

    Räling, Romy; Schröder, Astrid; Wartenburger, Isabell

    2016-06-01

    Age of acquisition (AOA) has frequently been shown to influence response times and accuracy rates in word processing and constitutes a meaningful variable in aphasic language processing, while its origin in the language processing system is still under debate. To find out where AOA originates and whether and how it is related to another important psycholinguistic variable, namely semantic typicality (TYP), we studied healthy, elderly controls and semantically impaired individuals using semantic priming. For this purpose, we collected reaction times and accuracy rates as well as event-related potential data in an auditory category-member-verification task. The present results confirm a semantic origin of TYP, but question the same for AOA while favouring its origin at the phonology-semantics interface. The data are further interpreted in consideration of recent theories of ageing. PMID:27106392

  8. The effect of instruction by a professional scientist on the acquisition of integrated process skills and the science-related attitudes of eighth-grade students

    NASA Astrophysics Data System (ADS)

    Owens, Katharine Donner

    This study investigated the effect of instruction by a professional scientist on the acquisition of science integrated process skills and the science-related attitudes of eighth-grade students. Eighty-two students from four intact classes in south Mississippi junior high schools participated in this study. Two experimental groups were taught a problem solving curriculum over a six week period by professional chemists; one experimental group had an additional six weeks of instruction by a professional engineer. Two control groups had science instruction by their classroom teachers. Homogeneity of the groups related to basic skills and science attitudes was determined and students drew their perception of a scientist before any instruction began. At the end of the intervention period students in all groups were given the Test of Science-Related Attitudes, the Test of Integrated Process Skills II, and a Draw-A-Scientist Test. The statistical procedures of the Wilks Lambda MANOVA, a univariate post hoc test, a split plot analysis of variance, and a one-way analysis of variance were used to test the hypotheses at the 0.05 significance level. Students' drawings of scientists were analyzed for the presence of stereotypic characteristics. Scores on all tests were analyzed according to gender and to group membership. There was a statistically significant difference in the science-related attitudes and the acquisition of science process skills between treatment groups. The experimental group taught by a professional chemist for six weeks scored higher on the test of process skills and had more positive attitudes toward careers in science and the normality of scientists than the control groups. There was a significant decline in stereotypic characteristics seen in the drawings of scientists by students who had longer instruction by two professional scientists. 
There was no statistically significant difference between male and female students and no interaction effect between gender and group membership.

  9. Behavior analysis for information acquisition of scientists and engineers in industry (1)

    NASA Astrophysics Data System (ADS)

    Onodera, Natsuo; Mizukami, Masayuki; Marumo, Kazuaki; Nishimura, Kunio

    Information acquisition actions for a week were recorded by 660 scientists and engineers in about 60 industrial companies. Approximately 3600 records of the actions were analysed in terms of the kind of information to be acquired, the tools to be accessed for information acquisition, the cause for, and the purpose of, information acquisition, necessity and urgency of information to be acquired, the degree of satisfaction with the information obtained, and the period needed for information acquisition. Additionally, some of the panelists were sent a questionnaire and interviewed about the time and money spent on information acquisition and their evaluation of various tools for information acquisition, particularly of commercial databases. The main accessing tools are various primary materials, personal connections and personal files. Commercial databases are used 0.44 times per week by a scientist/engineer.

  10. Behavior analysis for information acquisition of scientists and engineers in industry (2)

    NASA Astrophysics Data System (ADS)

    Onodera, Natsuo; Mizukami, Masayuki; Marumo, Kazuaki; Nishimura, Kunio

    Information acquisition actions for a week were recorded by 600 scientists and engineers in about 60 industrial companies. Approximately 3600 records of the actions were analysed in terms of the kind of information to be acquired, the tools to be accessed for information acquisition, the cause for, and the purpose of, information acquisition, necessity and urgency of information to be acquired, the degree of satisfaction with the information obtained, and the period needed for information acquisition. Additionally, some of the panelists were sent a questionnaire and interviewed about the time and money spent on information acquisition and their evaluation of various tools for information acquisition, particularly of commercial databases. The main accessing tools are various primary materials, personal connections and personal files. Commercial databases are used 0.44 times per week by a scientist/engineer.

  11. The Role of Unconscious Information Processing in the Acquisition and Learning of Instructional Messages

    ERIC Educational Resources Information Center

    Kuldas, Seffetullah; Bakar, Zainudin Abu; Ismail, Hairul Nizam

    2012-01-01

    This review investigates how the unconscious information processing can create satisfactory learning outcomes, and can be used to ameliorate the challenges of teaching students to regulate their learning processes. The search for the ideal model of human information processing as regards achievement of teaching and learning objectives is a…

  12. Acquisition process of typing skill using hierarchical materials in the Japanese language.

    PubMed

    Ashitaka, Yuki; Shimada, Hiroyuki

    2014-08-01

    In the present study, using a new keyboard layout with only eight keys, we conducted typing training for unskilled typists. In this task, Japanese college students received training in typing words consisting of a pair of hiragana characters with four keystrokes, using the alphabetic input method, while keeping the association between the keys and typists' finger movements; the task was constructed so that chunking was readily available. We manipulated the association between the hiragana characters and alphabet letters (hierarchical materials: overlapped and nonoverlapped mappings). Our alphabet letter materials corresponded to the regular order within each hiragana word (within the four letters, the first and third referred to consonants, and the second and fourth referred to vowels). Only the interkeystroke intervals involved in the initiation of typing vowel letters showed an overlapping effect, which revealed that the effect was markedly large only during the early period of skill development (the effect for the overlapped mapping being larger than that for the nonoverlapped mapping), but that it had diminished by the time of late training. Conversely, the response time and the third interkeystroke interval, which are both involved in the latency of typing a consonant letter, did not reveal an overlapped effect, suggesting that chunking might be useful with hiragana characters rather than hiragana words. These results are discussed in terms of the fan effect and skill acquisition. Furthermore, we discuss whether there is a need for further research on unskilled and skilled Japanese typists. PMID:24874261

  13. Lexical processing and organization in bilingual first language acquisition: Guiding future research.

    PubMed

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-06-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between 2 languages in early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. (PsycINFO Database Record) PMID:26866430

  14. Lexical Processing and Organization in Bilingual First Language Acquisition: Guiding Future Research

    PubMed Central

    DeAnda, Stephanie; Poulin-Dubois, Diane; Zesiger, Pascal; Friend, Margaret

    2016-01-01

    A rich body of work in adult bilinguals documents an interconnected lexical network across languages, such that early word retrieval is language independent. This literature has yielded a number of influential models of bilingual semantic memory. However, extant models provide limited predictions about the emergence of lexical organization in bilingual first language acquisition (BFLA). Empirical evidence from monolingual infants suggests that lexical networks emerge early in development as children integrate phonological and semantic information. These findings tell us little about the interaction between two languages in early bilingual memory. To date, an understanding of when and how languages interact in early bilingual development is lacking. In this literature review, we present research documenting lexical-semantic development across monolingual and bilingual infants. This is followed by a discussion of current models of bilingual language representation and organization and their ability to account for the available empirical evidence. Together, these theoretical and empirical accounts inform and highlight unexplored areas of research and guide future work on early bilingual memory. PMID:26866430

  15. Three-dimensional surface topography acquisition and analysis for firearm identification.

    PubMed

    Senin, Nicola; Groppetti, Roberto; Garofano, Luciano; Fratini, Paolo; Pierni, Michele

    2006-03-01

    In the last decade, computer-based systems for the comparison of microscopic firearms evidence have been the subject of considerable research work because of their expected capability of supporting the firearms examiner through the automated analysis of large amounts of evidence. The Integrated Ballistics Identification System, which is based on a two-dimensional representation of the specimen surface, has been widely adopted in forensic laboratories worldwide. More recently, some attempts to develop systems based on three-dimensional (3D) representations of the specimen surface have been made, both in the literature and as industrial products, such as BulletTRAX-3D, but fundamental limitations in achieving fully automated identification remain. This work analyzes the advantages and disadvantages of a 3D-based approach by proposing an approach and a prototype system for firearms evidence comparison that is based on the acquisition and analysis of the 3D surface topography of specimens, with particular reference to cartridge cases. The concept of a 3D virtual comparison microscope is introduced. Its purpose is not to provide fully automated identification, but to show how the availability of 3D shape information provides a whole new set of verification means, specifically visual enhancement tools and quantitative measurement of shape properties, for supporting, not replacing, the firearm examiner in reaching the final decision. PMID:16566761
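
    As a concrete example of the kind of quantitative shape measurement such a system can offer, two surface height maps can be scored by normalized cross-correlation. This is a generic sketch, not the prototype's actual metric; it assumes pre-registered, equally sampled topographies.

```python
import numpy as np

def topo_similarity(a, b):
    """Normalized cross-correlation between two registered surface height
    maps: a simple quantitative shape-comparison score in [-1, 1], where
    1 means identical relief (up to offset and scale)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())
```

A practical system would first align (register) the two topographies and would likely band-pass filter the relief to isolate toolmark-scale features before scoring.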

  16. A program for the acquisition and analysis of RBS, HFS, ERD, and PIXE spectra

    NASA Astrophysics Data System (ADS)

    Strathman, M. D.

    1997-02-01

    DETECTOR/DETACQ are a suite of acquisition and analysis programs designed to operate on Microsoft 32-Bit platforms such as WinNT™ and Win95™. DETECTOR, the analysis program, is capable of analyzing Rutherford Backscattering data, Hydrogen Forward Scattering data, Elastic Recoil data, and Particle Induced X-ray Emission data. In addition, the program includes substantial capabilities for analyzing Lateral and Angle Resolved Images acquired from any of the above analytical techniques. DETECTOR incorporates the ability to use either five-parameter or eight-parameter stopping power data, screened Coulomb cross sections, and an independent stopping foil description. DETECTOR allows the user to use a single sample description to fit experimental data in multiple data windows. Each data window can hold any of the above experimental data types and the theoretical fit to the experimental data, where each window has an independent calibration and excitation probe. This capability allows the user to apply several different techniques (even if the data were acquired at different times) and then combine them to obtain a better description of the sample being analyzed. DETECTOR/DETACQ are both designed with Windows™-compatible graphical user interfaces. The majority of the programs are written in Microsoft Visual Basic 4.0 and therefore incorporate all of the conventional user interaction tools.

  17. The Influence of Input on Connective Acquisition: A Growth Curve Analysis of English "Because" and German "Weil"

    ERIC Educational Resources Information Center

    van Veen, Rosie; Evers-Vermeul, Jacqueline; Sanders, Ted; van den Bergh, Huub

    2013-01-01

    The current study used growth curve analysis to study the role of input during the acquisition of the English causal connective "because" and its German counterpart "weil." The corpora of five German and five English children and their adult caretakers (age range 0;10-4;3) were analyzed for the amount as well as for the type of…

  18. Learned Attention in Adult Language Acquisition: A Replication and Generalization Study and Meta-Analysis

    ERIC Educational Resources Information Center

    Ellis, Nick C.; Sagarra, Nuria

    2011-01-01

    This study investigates associative learning explanations of the limited attainment of adult compared to child language acquisition in terms of learned attention to cues. It replicates and extends Ellis and Sagarra (2010) in demonstrating short- and long-term learned attention in the acquisition of temporal reference in Latin. In Experiment 1,…

  19. Library Catalog Log Analysis in E-Book Patron-Driven Acquisitions (PDA): A Case Study

    ERIC Educational Resources Information Center

    Urbano, Cristóbal; Zhang, Yin; Downey, Kay; Klingler, Thomas

    2015-01-01

    Patron-Driven Acquisitions (PDA) is a new model used for e-book acquisition by academic libraries. A key component of this model is to make records of ebooks available in a library catalog and let actual patron usage decide whether or not an item is purchased. However, there has been a lack of research examining the role of the library catalog as…

  20. Constraints on Parameter Setting: A Grammatical Analysis of Some Acquisition Stages in German Child Language.

    ERIC Educational Resources Information Center

    Clahsen, Harald

    1991-01-01

    Argues that to improve the parameter model as a theory of language acquisition it has to be constrained in several ways. Results on the acquisition of subject-verb agreement, verb placement, empty subjects, and negation in German child language are presented. (55 references) (JL)

  1. Optimal Diphthongs: An OT Analysis of the Acquisition of Spanish Diphthongs

    ERIC Educational Resources Information Center

    Krause, Alice

    2013-01-01

    This dissertation investigates the acquisition of Spanish diphthongs by adult native speakers of English. The following research questions will be addressed: 1) How do adult native speakers of English pronounce sequences of two vowels in their L2 Spanish at different levels of acquisition? 2) Can OT learnability models, specifically the GLA,…

  2. The Relationship between Previous Training in Computer Science and the Acquisition of Word Processing Skills.

    ERIC Educational Resources Information Center

    Monahan, Brian D.

    1986-01-01

    This study investigated whether computer science educational background makes secondary students more adept at using word processing capabilities, and compared computer science and non-computer science students' writing improvement with word processing use. Computer science students used more sophisticated program features but student writing did…

  3. Developmental Trends in Auditory Processing Can Provide Early Predictions of Language Acquisition in Young Infants

    ERIC Educational Resources Information Center

    Chonchaiya, Weerasak; Tardif, Twila; Mai, Xiaoqin; Xu, Lin; Li, Mingyan; Kaciroti, Niko; Kileny, Paul R.; Shao, Jie; Lozoff, Betsy

    2013-01-01

    Auditory processing capabilities at the subcortical level have been hypothesized to impact an individual's development of both language and reading abilities. The present study examined whether auditory processing capabilities relate to language development in healthy 9-month-old infants. Participants were 71 infants (31 boys and 40 girls) with…

  4. IWTU Process Sample Analysis Report

    SciTech Connect

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June – August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process. None of these samples were radioactive. These samples were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  5. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  6. A versatile and low-cost 3D acquisition and processing pipeline for collecting mass of archaeological findings on the field

    NASA Astrophysics Data System (ADS)

    Gattet, E.; Devogelaere, J.; Raffin, R.; Bergerot, L.; Daniel, M.; Jockey, Ph.; De Luca, L.

    2015-02-01

    In recent years, advances in the fields of photogrammetry and computer vision have produced several solutions for generating 3D reconstructions starting from simple images. Even if the potential of the image-based 3D reconstruction approach is nowadays very well known in terms of reliability, accuracy and flexibility, there is still a lack of low-cost, open-source and automated solutions for the mass collection of archaeological findings, especially if one considers the real (rather than theoretical) contextual aspects of a digitization campaign in the field (number of objects to acquire, available time, lighting conditions, equipment transport, budget, etc.) as well as the accuracy requirements for in-depth shape analysis and classification purposes. In this paper we present a prototype system (integrating hardware and software) for the 3D acquisition, geometric reconstruction, documentation and archiving of large collections of archaeological findings. All the aspects of our approach are based on high-end image-based modeling techniques and designed based on an accurate analysis of the typical field conditions of an archaeological campaign, as well as on the specific requirements of archaeological finding documentation and analysis. This paper presents all the aspects integrated into the prototype: - a hardware development of a transportable photobooth for automated image acquisition, consisting of a turntable and three DSLRs controlled by a microcontroller; - an automatic image processing pipeline (based on Apero/Micmac) including mask generation, tie-point extraction, bundle adjustment, multi-view stereo correlation, point cloud generation, and surface reconstruction; - a versatile (off-line/on-line) portable database for associating descriptive attributes (archaeological description) with the 3D digitizations on site; - a platform for data gathering, archiving and sharing collections of 3D digitizations on the Web. The presentation and assessment of this prototype are discussed.

  7. Hardware acceleration of lucky-region fusion (LRF) algorithm for image acquisition and processing

    NASA Astrophysics Data System (ADS)

    Maignan, William; Koeplinger, David; Carhart, Gary W.; Aubailly, Mathieu; Kiamilev, Fouad; Liu, J. Jiang

    2013-05-01

    "Lucky-region fusion" (LRF) is an image processing technique that has proven successful in enhancing the quality of images distorted by atmospheric turbulence. The LRF algorithm extracts sharp regions of an image obtained from a series of short exposure frames, and "fuses" them into a final image with improved quality. In previous research, the LRF algorithm had been implemented on a PC using a compiled programming language. However, the PC usually does not have sufficient processing power to handle real-time extraction, processing and reduction required when the LRF algorithm is applied not to single picture images but rather to real-time video from fast, high-resolution image sensors. This paper describes a hardware implementation of the LRF algorithm on a Virtex 6 field programmable gate array (FPGA) to achieve real-time video processing. The novelty in our approach is the creation of a "black box" LRF video processing system with a standard camera link input, a user controller interface, and a standard camera link output.
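
    The tile-based selection at the heart of LRF can be sketched in a few lines. The following is an illustrative toy, not the authors' FPGA design: the tile size, the variance-of-Laplacian sharpness metric, and the absence of blending at tile seams are all assumptions.

```python
import numpy as np

def sharpness(block):
    # Simple sharpness metric: variance of a 5-point Laplacian response.
    lap = (-4 * block[1:-1, 1:-1]
           + block[:-2, 1:-1] + block[2:, 1:-1]
           + block[1:-1, :-2] + block[1:-1, 2:])
    return lap.var()

def lucky_region_fusion(frames, tile=8):
    """Fuse a stack of short-exposure frames: for each tile location,
    keep the tile from whichever frame is locally sharpest."""
    h, w = frames[0].shape
    fused = np.zeros((h, w))
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            candidates = [f[y:y + tile, x:x + tile] for f in frames]
            fused[y:y + tile, x:x + tile] = max(candidates, key=sharpness)
    return fused
```

A real-time implementation such as the one described here would blend overlapping regions to avoid seams and would stream frames through the selection logic rather than hold the whole stack in memory.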

  8. Remote sensing data acquisition, analysis and archival. Volume 1. Final report

    SciTech Connect

    Stringer, W.J.; Dean, K.G.; Groves, J.E.

    1993-03-25

    The project specialized in the acquisition and dissemination of satellite imagery and its utilization for case-specific and statistical analyses of offshore environmental conditions, particularly those involving sea ice. During the duration of this contract, 854 Landsat Multispectral Scanner and 2 Landsat Thematic Mapper scenes, 8,576 Advanced Very High Resolution Radiometer images, and 31,000 European, Earth Resources Satellite, Synthetic Aperture Radar images were archived. Direct assistance was provided to eight Minerals Management Service (MMS)-sponsored studies, including analyses of Port Moller circulation, Bowhead whale migration, distribution, population and behavioral studies, Beaufort Sea fisheries, oil spill trajectory model development, and Kasegaluk Lagoon environmental assessments. In addition, under this Cooperative Agreement several complete studies were undertaken based on analysis of satellite imagery. The topics included: Kasegaluk Lagoon transport, the effect of winter storms on arctic ice, the relationship between ice surface temperatures as measured by buoys and passive microwave imagery, unusual cloud forms following lead-openings, and analyses of Chukchi and Bering sea polynyas.

  9. ICAMS: a new system for automated emulsion data acquisition and analysis

    SciTech Connect

    Arthur, A.A.; Brown, W.L. Jr.; Friedlander, E.M.; Heckman, H.H.; Jones, R.W.; Karant, Y.J.; Turney, A.D.

    1983-01-01

    ICAMS (Interactive Computer Assisted Measurement System) is designed to permit the acquisition and analysis of emulsion scan and measurement data at a rate much faster than any existing manual techniques. It accomplishes this by taking the burden of stage motion control and data recording away from the scanner and putting it onto the computer. It is a modern distributed network system, where a central PDP-11 computer running under RSX-11M V4 communicates with two-ported UNIBUS memory; each two-ported memory resides on the local intelligence of each independent ODS (Optical Data Station). The intelligence of each ODS is a 6512 microcomputer running under FORTH and enhanced with a floating point processor card. Each ODS is supported on ICAMS using the virtual memory features of FORTH, permitting full access to the disk storage facilities of the PDP system. To the scanner, each ODS is conversational and menu driven. To the physicist, utilities have been written that permit FORTRAN-77 programs to easily acquire and thus analyze the data base on the PDP-11.

  10. Photogrammetric 3d Acquisition and Analysis of Medicamentous Induced Pilomotor Reflex ("goose Bumps")

    NASA Astrophysics Data System (ADS)

    Schneider, D.; Hecht, A.

    2016-06-01

    In a current study at the University Hospital Dresden, Department of Neurology, the autonomous function of nerve fibres of the human skin is investigated. For this purpose, a specific medicament is applied on a small area of the skin of a test person, which results in a local reaction (goose bumps). Based on the extent of the area where the stimulation of the nerve fibres is visible, it can be concluded how the nerve function of the skin works. The aim of the investigation described in the paper is to generate 3D data of these goose bumps. Therefore, the paper analyses and compares different photogrammetric surface measurement techniques with regard to their suitability for the 3D acquisition of silicone imprints of the human skin. Furthermore, an appropriate processing procedure for analysing the recorded point cloud data is developed and presented. It was experimentally shown that, using (low-cost) photogrammetric techniques, medicamentously induced goose bumps can be acquired in three dimensions and can be analysed almost fully automatically from the perspective of medical research questions. The relative accuracy was determined to be 1% (RMSE) of the area and of the volume of an individual goose bump.

  11. The Acquisition Process as a Vehicle for Enabling Knowledge Management in the Lifecycle of Complex Federal Systems

    NASA Technical Reports Server (NTRS)

    Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)

    2001-01-01

    This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.

  12. Lithography process window analysis with calibrated model

    NASA Astrophysics Data System (ADS)

    Zhou, Wenzhan; Yu, Jin; Lo, James; Liu, Johnson

    2004-05-01

    As critical dimensions shrink below 0.13 μm, SPC (Statistical Process Control) based on CD (Critical Dimension) control in the lithography process becomes more difficult. The increasing requirements of a shrinking process window call for a more accurate determination of the process window center. However, in practical fabrication we found that systematic error introduced by metrology and/or the resist process can significantly impact the process window analysis result. In particular, when simple polynomial functions are used to fit the lithographic data from a focus-exposure matrix (FEM), the model will fit these systematic errors rather than filter them out. This will definitely impact the process window analysis and the determination of the best process condition. In this paper, we propose using a calibrated first-principles model for process window analysis. With this method, the systematic metrology error can be filtered out efficiently, giving a more reasonable process window analysis result.
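
    The simple polynomial FEM fit that the paper argues against can be sketched as follows. The data, the Bossung-style quadratic-in-focus/linear-in-dose form, and all numbers below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical focus-exposure matrix (FEM): CD measured over a grid.
focus = np.linspace(-0.2, 0.2, 9)   # defocus, um
dose = np.linspace(18, 22, 5)       # exposure dose, mJ/cm^2
F, D = np.meshgrid(focus, dose)
# Synthetic response: parabolic in focus (best focus at +0.02 um),
# linear in dose.
cd = 130 - 40.0 * (F - 0.02) ** 2 - 2.0 * (D - 20)   # CD, nm

# Least-squares fit of CD = a + b*F + c*F^2 + d*D.
A = np.column_stack([np.ones(F.size), F.ravel(), F.ravel() ** 2, D.ravel()])
coef, *_ = np.linalg.lstsq(A, cd.ravel(), rcond=None)
a, b, c, d = coef
best_focus = -b / (2 * c)   # vertex of the Bossung parabola
```

On clean synthetic data the fit recovers best focus exactly; the paper's point is that on real FEM data such a polynomial also absorbs systematic metrology and resist errors, which is why a calibrated first-principles model is preferred.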

  13. Data acquisition and processing history for the Explorer 33 (AIMP-D) satellite

    NASA Technical Reports Server (NTRS)

    Karras, T. J.

    1972-01-01

    The quality control monitoring system, using accounting and quality control data bases, made it possible to perform an in-depth analysis. Results show that the percentage of usable data files for experimenter analysis was 97.7%; only 0.4% of the data sequences supplied to the experimenter exhibited missing data. The 50th-percentile probability delay values (referenced to station record data) indicate that the analog tapes arrived within 11 days, the data were digitized within 4.2 weeks, and the experimenter tapes were delivered in 8.95 weeks or less.

  14. Fast multi-dimensional NMR acquisition and processing using the sparse FFT.

    PubMed

    Hassanieh, Haitham; Mayzel, Maxim; Shi, Lixin; Katabi, Dina; Orekhov, Vladislav Yu

    2015-09-01

    Increasing the dimensionality of NMR experiments strongly enhances the spectral resolution and provides invaluable direct information about atomic interactions. However, the price tag is high: long measurement times and heavy requirements on the computation power and data storage. We introduce sparse fast Fourier transform as a new method of NMR signal collection and processing, which is capable of reconstructing high quality spectra of large size and dimensionality with short measurement times, faster computations than the fast Fourier transform, and minimal storage for processing and handling of sparse spectra. The new algorithm is described and demonstrated for a 4D BEST-HNCOCA spectrum. PMID:26123316
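
    The sparsity premise the method exploits can be illustrated with a toy example: a signal whose spectrum contains only a few lines is fully described by its k largest Fourier coefficients. The peak positions below are hypothetical, and this sketch uses an ordinary full FFT; the actual sparse FFT algorithm recovers the large coefficients without ever computing the full transform.

```python
import numpy as np

n, k = 1024, 3
t = np.arange(n)
peaks = [100, 250, 700]   # assumed spectral line positions (bins)
# Synthetic "FID": a sum of k pure complex exponentials.
fid = sum(np.exp(2j * np.pi * p * t / n) for p in peaks)

spectrum = np.fft.fft(fid)
largest = np.argsort(np.abs(spectrum))[-k:]   # keep the k strongest lines
sparse = np.zeros_like(spectrum)
sparse[largest] = spectrum[largest]

# The k-sparse spectrum reconstructs the signal essentially exactly.
err = np.linalg.norm(np.fft.ifft(sparse) - fid) / np.linalg.norm(fid)
```

For a spectrum with S significant peaks out of N points, sparse FFT variants run in roughly O(S log N) time instead of O(N log N), which is what makes high-dimensional NMR spectra of large size tractable.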

  15. The effect of age of acquisition, socioeducational status, and proficiency on the neural processing of second language speech sounds.

    PubMed

    Archila-Suerte, Pilar; Zevin, Jason; Hernandez, Arturo E

    2015-02-01

    This study investigates the role of age of acquisition (AoA), socioeducational status (SES), and second language (L2) proficiency on the neural processing of L2 speech sounds. In a task of pre-attentive listening and passive viewing, Spanish-English bilinguals and a control group of English monolinguals listened to English syllables while watching a film of natural scenery. Eight regions of interest were selected from brain areas involved in speech perception and executive processes. The regions of interest were examined in two separate two-way ANOVAs (AoA×SES; AoA×L2 proficiency). The results showed that AoA was the main variable affecting the neural response in L2 speech processing. Direct comparisons between AoA groups of equivalent SES and proficiency level enhanced the intensity and magnitude of the results. These results suggest that AoA, more than SES and proficiency level, determines which brain regions are recruited for the processing of second language speech sounds. PMID:25528287

  16. Acquisition and Use of Software Products for Automatic Data Processing Systems in the Federal Government.

    ERIC Educational Resources Information Center

    Comptroller General of the U.S., Washington, DC.

    A description and analysis of numerous management problems pertaining to the substantial annual expenditures of the Government for computer software products together with recommendations to executive branch agencies for strengthening management practices are presented in this report. Appendix I provides an overview of the growth in use of (ADP)…

  17. The Effectiveness of Processing Instruction in L2 Grammar Acquisition: A Narrative Review

    ERIC Educational Resources Information Center

    Dekeyser, Robert; Botana, Goretti Prieto

    2015-01-01

    The past two decades have seen ample debate about processing instruction (PI) and its various components. In this article, we first describe what PI consists of and then address three questions: about the role of explicit information (EI) in PI, the difference between PI and teaching that incorporates production-based (PB) practice, and various…

  18. Analyzing Preschoolers' Overgeneralizations of Object Labeling in the Process of Mother-Tongue Acquisition in Turkey

    ERIC Educational Resources Information Center

    Kabadayi, Abdulkadir

    2006-01-01

    Language, as is known, is acquired under certain conditions: rapid and sequential brain maturation and cognitive development, the need to exchange information and to control others' actions, and an exposure to appropriate speech input. This research aims at analyzing preschoolers' overgeneralizations of the object labeling process in different…

  19. Using Processing Instruction for the Acquisition of English Present Perfect of Filipinos

    ERIC Educational Resources Information Center

    Erfe, Jonathan P.; Lintao, Rachelle B.

    2012-01-01

    This is an experimental study on the relative effects of Van Patten's Processing Instruction (PI) (1996, 2002), a "psycholinguistically-motivated" intervention in teaching second-language (L2) grammar, on young-adult Filipino learners of English. A growing body of research on this methodological alternative, which establishes…

  20. How Explicit Knowledge Affects Online L2 Processing: Evidence from Differential Object Marking Acquisition

    ERIC Educational Resources Information Center

    Andringa, Sible; Curcic, Maja

    2015-01-01

    Form-focused instruction studies generally report larger gains for explicit types of instruction over implicit types on measures of controlled production. Studies that used online processing measures--which do not readily allow for the application of explicit knowledge--however, suggest that this advantage occurs primarily when the target…

  1. The RFP Process: Effective Management of the Acquisition of Library Materials.

    ERIC Educational Resources Information Center

    Wilkinson, Frances C.; Thorson, Connie Capers

    Many librarians view procurement, with its myriad forms, procedures, and other organizational requirements, as a tedious or daunting challenge. This book simplifies the process, showing librarians how to successfully prepare a Request for Proposal (RFP) and make informed decisions when determining which vendors to use for purchasing library…

  2. Production and Processing Asymmetries in the Acquisition of Tense Morphology by Sequential Bilingual Children

    ERIC Educational Resources Information Center

    Chondrogianni, Vasiliki; Marinis, Theodoros

    2012-01-01

    This study investigates the production and online processing of English tense morphemes by sequential bilingual (L2) Turkish-speaking children with more than three years of exposure to English. Thirty-nine six- to nine-year-old L2 children and twenty-eight typically developing age-matched monolingual (L1) children were administered the production…

  3. Using Eye-Tracking to Investigate Topics in L2 Acquisition and L2 Processing

    ERIC Educational Resources Information Center

    Roberts, Leah; Siyanova-Chanturia, Anna

    2013-01-01

    Second language (L2) researchers are becoming more interested in both L2 learners' knowledge of the target language and how that knowledge is put to use during real-time language processing. Researchers are therefore beginning to see the importance of combining traditional L2 research methods with those that capture the moment-by-moment…

  4. Skills Acquisition in Plantain Flour Processing Enterprises: A Validation of Training Modules for Senior Secondary Schools

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi; Nlebem, Bernard S.

    2013-01-01

This study set out to validate training modules that can help provide requisite skills in plantain flour processing enterprises for Senior Secondary school students, both for self-employment and to enable them to pass their examinations. The study covered Rivers State. A purposive sampling technique was used to select a sample size of 205. Two sets of structured…

  5. A tailored 200 parameter VME based data acquisition system for IBA at the Lund Ion Beam Analysis Facility - Hardware and software

    NASA Astrophysics Data System (ADS)

    Elfman, Mikael; Ros, Linus; Kristiansson, Per; Nilsson, E. J. Charlotta; Pallon, Jan

    2016-03-01

With the recent advances towards modern Ion Beam Analysis (IBA), going from one- or few-parameter detector systems to multi-parameter systems, it has been necessary to expand and replace the more than twenty-year-old CAMAC-based system. A new VME multi-parameter (presently up to 200 channels) data acquisition and control system has been developed and implemented at the Lund Ion Beam Analysis Facility (LIBAF). The system is based on the VX-511 Single Board Computer (SBC), acting as master with arbiter functionality, and consists of standard VME modules such as Analog-to-Digital Converters (ADCs), Charge-to-Digital Converters (QDCs), Time-to-Digital Converters (TDCs), scalers, I/O cards, and high-voltage and waveform units. The modules have been specially selected to support all of the present detector systems in the laboratory, with the option of future expansion. Typically, the detector systems consist of silicon strip detectors, silicon drift detectors and scintillator detectors, for detection of charged particles, X-rays and γ-rays. The flow of raw data buffers from the VME bus to their final storage place on a 16-terabyte network-attached storage disc (NAS disc) is described. The acquisition process, remotely controlled over one of the SBC's Ethernet channels, is also discussed. The user interface, written in the Kmax software package, is used to control the acquisition process as well as for advanced online and offline data analysis through a user-friendly graphical user interface (GUI). In this work the system implementation, layout and performance are presented. The user interface and possibilities for advanced offline analysis are also discussed and illustrated.
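The raw-buffer data flow this record describes (module readout, buffered events, bulk storage) can be sketched in simplified form. Everything below is illustrative: the 8-byte event layout, the buffer size, and the temporary file standing in for the NAS disc are assumptions, not the actual LIBAF format.

```python
import os
import struct
import tempfile

BUFFER_EVENTS = 4  # flush to storage after this many events (illustrative)

def pack_event(channel, adc_value, timestamp):
    """Pack one event as <channel:uint16, adc:uint16, time:uint32> (8 bytes)."""
    return struct.pack("<HHI", channel, adc_value, timestamp)

def acquire(events, storage_path):
    """Batch events into fixed-size buffers and append them to storage."""
    buf = b""
    with open(storage_path, "ab") as f:
        for ch, adc, ts in events:
            buf += pack_event(ch, adc, ts)
            if len(buf) >= BUFFER_EVENTS * 8:
                f.write(buf)      # full buffer: flush to the storage disc
                buf = b""
        if buf:
            f.write(buf)          # flush the partial buffer at end of run

path = os.path.join(tempfile.mkdtemp(), "run001.dat")
acquire([(c, 100 + c, 42) for c in range(10)], path)
print(os.path.getsize(path))  # 80 (10 events x 8 bytes)
```

The point of the batching is that bus readout and disk writes happen at different granularities; the real system moves much larger buffers, but the flush-on-full logic is the same idea.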

  6. Microevolution Analysis of Bacillus coahuilensis Unveils Differences in Phosphorus Acquisition Strategies and Their Regulation

    PubMed Central

    Gómez-Lunar, Zulema; Hernández-González, Ismael; Rodríguez-Torres, María-Dolores; Souza, Valeria; Olmedo-Álvarez, Gabriela

    2016-01-01

Bacterial genomes undergo numerous events of gene losses and gains that generate genome variability among strains of the same species (microevolution). Our aim was to compare the genomes and relevant phenotypes of three Bacillus coahuilensis strains from two oligotrophic hydrological systems in the Cuatro Ciénegas Basin (México), to unveil the environmental challenges that this species copes with, and the microevolutionary differences in these genotypes. Since the strains were isolated from a low-P environment, we placed emphasis on the search for different phosphorus acquisition strategies. The three B. coahuilensis strains exhibited similar numbers of coding DNA sequences, of which 82% (2,893) constituted the core genome, and 18% corresponded to accessory genes. Most of the genes in this last group were associated with mobile genetic elements (MGEs) or were annotated as hypothetical proteins. Ten percent of the pangenome consisted of strain-specific genes. Alignment of the three B. coahuilensis genomes indicated a high level of synteny and revealed the presence of several genomic islands. Unexpectedly, one of these islands contained genes that encode the 2-keto-3-deoxymannooctulosonic acid (Kdo) biosynthesis enzymes, a feature associated with the cell walls of Gram-negative bacteria. Some microevolutionary changes were clearly associated with MGEs. Our analysis revealed inconsistencies between phenotype and genotype, which we suggest result from the impossibility of mapping regulatory features onto genome analysis. Experimental results revealed variability in the types and numbers of auxotrophies between the strains that could not consistently be explained by in silico metabolic models. Several intraspecific differences in preferences for carbohydrate and phosphorus utilization were observed. Regarding phosphorus recycling, scavenging, and storage, variations were found between the three genomes. The three strains exhibited differences regarding alkaline phosphatase that

  8. REPEATED ACQUISITION OF RESPONSE SEQUENCES: THE ANALYSIS OF BEHAVIOR IN TRANSITION

    EPA Science Inventory

Repeated acquisition (RA) procedures are behavioral preparations in which subjects are required to learn new response sequences with each experimental session. Such procedures avoid problems inherent in non-RA learning procedures. For example, with traditional paradigms, as the sub...

  9. A review of breast tomosynthesis. Part II. Image reconstruction, processing and analysis, and advanced applications.

    PubMed

    Sechopoulos, Ioannis

    2013-01-01

Many important post-acquisition aspects of breast tomosynthesis imaging can impact its clinical performance. Chief among them is the reconstruction algorithm that generates the representation of the three-dimensional breast volume from the acquired projections. But even after reconstruction, additional processes, such as artifact reduction algorithms and computer-aided detection and diagnosis, among others, can also impact the performance of breast tomosynthesis in the clinical realm. In this two-part paper, a review of breast tomosynthesis research is performed, with an emphasis on its medical physics aspects. The companion paper, the first part of this review, examines the research relevant to the image acquisition process. This second part reviews the research on the post-acquisition aspects, including reconstruction, image processing, and analysis, as well as the advanced applications being investigated for breast tomosynthesis. PMID:23298127

  11. Development of a data acquisition and analysis system for nociceptive withdrawal reflex and reflex receptive fields in humans.

    PubMed

    Biurrun Manresa, Jose A; Hansen, John; Andersen, Ole K

    2010-01-01

A system for data acquisition and analysis of the nociceptive withdrawal reflex (NWR) and reflex receptive fields (RRF) is introduced. The system consists of hardware and software components. The hardware consists of devices commonly used for electrical stimulation and for electromyographic and kinematic data recording. The software comprises two different programs: Wirex, a stand-alone program developed in LabVIEW for data acquisition, and Reflex Lab, a Matlab-based toolbox for data analysis. These programs were developed to maximize the potential of the hardware, turning it into a complete stimulation system capable of automatic quantification of NWR and RRF. In this article, a brief review of NWR and RRF analysis is presented, the system features are described in detail, and present and future applications are discussed. PMID:21096727

  12. Acquisition and Analysis of NASA Ames Sunphotometer Measurements during SAGE III Validation Campaigns and other Tropospheric and Stratospheric Research Missions

    NASA Technical Reports Server (NTRS)

    Livingston, John M.

    2004-01-01

    NASA Cooperative Agreement NCC2-1251 provided funding from April 2001 through December 2003 for Mr. John Livingston of SRI International to collaborate with NASA Ames Research Center scientists and engineers in the acquisition and analysis of airborne sunphotometer measurements during various atmospheric field studies. Mr. Livingston participated in instrument calibrations at Mauna Loa Observatory, pre-mission hardware and software preparations, acquisition and analysis of sunphotometer measurements during the missions, and post-mission analysis of data and reporting of scientific findings. The atmospheric field missions included the spring 2001 Intensive of the Asian Pacific Regional Aerosol Characterization Experiment (ACE-Asia), the Asian Dust Above Monterey-2003 (ADAM-2003) experiment, and the winter 2003 Second SAGE III Ozone Loss and Validation Experiment (SOLVE II).

  13. Human resource processes and the role of the human resources function during mergers and acquisitions in the electricity industry

    NASA Astrophysics Data System (ADS)

    Dass, Ted K.

Mergers and acquisitions (M&A) have been a popular strategy for organizations to consolidate and grow for more than a century. However, research in this field indicates that M&A are more likely to fail than succeed, with failure rates estimated to be as high as 75%. People-related issues have been identified as important causes of the high failure rate, but these issues are largely neglected until after the deal is closed. One explanation for this neglect is the low involvement of human resource (HR) professionals and the HR function during the M&A process. The strategic HR management literature suggests that a larger role for HR professionals in the M&A process would enable organizations to identify potential problems early and devise appropriate solutions. However, empirical research from an HR perspective has been scarce in this area. This dissertation examines the role of the HR function and the HR processes followed in organizations during M&A. Employing a case-study research design, this study examines M&A undertaken by two large organizations in the electricity industry through the lens of a "process" perspective. Based on converging evidence, the case studies address three sets of related issues: (1) how do organizations undertake and manage M&A; (2) what is the extent of HR involvement in M&A and what role does it play in the M&A process; and (3) what factors explain HR involvement in the M&A process and, more generally, in the formulation of corporate goals and strategies. Results reveal the complexity of issues faced by organizations in undertaking M&A, the variety of roles played by HR professionals, and the importance of several key contextual factors, internal and external to the organization, that influence HR involvement in the M&A process. Further, several implications for practice and future research are explored.

  14. Forth system for coherent-scatter radar data acquisition and processing

    NASA Technical Reports Server (NTRS)

    Rennier, A. D.; Bowhill, S. A.

    1985-01-01

A real-time collection system was developed for the Urbana coherent-scatter radar system. The new system, designed for use with a microcomputer, has several advantages over the old system implemented with a minicomputer. The software used to collect the data is described, as well as the processing software used to analyze the data. In addition, a magnetic tape format for coherent-scatter data exchange is given.

  15. MDSplus data acquisition system

    SciTech Connect

    Stillerman, J.A.; Fredian, T.W.; Klare, K.; Manduchi, G.

    1997-01-01

MDSplus, a tree-based, distributed data acquisition system, was developed in collaboration with the ZTH Group at Los Alamos National Lab and the RFX Group at CNR in Padua, Italy. It is currently in use at MIT, RFX in Padua, TCV at EPFL in Lausanne, and KBSI in South Korea. MDSplus is made up of a set of X/Motif-based tools for data acquisition and display, as well as diagnostic configuration and management. It is based on a hierarchical experiment description which completely describes the data acquisition and analysis tasks and contains the results from these operations. These tools were designed to operate in a distributed, client/server environment with multiple concurrent readers and writers to the data store. While usually used over a Local Area Network, these tools can be used over the Internet to provide access for remote diagnosticians and even machine operators. An interface to a relational database is provided for storage and management of processed data. IDL is used as the primary data analysis and visualization tool. IDL is a registered trademark of Research Systems Inc. © 1996 American Institute of Physics.
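The central idea of the record above, a single hierarchical experiment description holding both the acquisition setup and its results, can be illustrated with a toy path-addressed store. This is not the MDSplus API; the node paths, the case-insensitive lookup, and the `Tree` class are invented for illustration.

```python
class Tree:
    """Toy stand-in for a hierarchical experiment tree: nodes addressed
    by path, holding configuration and results side by side."""

    def __init__(self):
        self.nodes = {}

    def put(self, path, value):
        # Store under an upper-cased path so lookups are case-insensitive.
        self.nodes[path.upper()] = value

    def get(self, path):
        return self.nodes[path.upper()]

shot = Tree()
# The same tree carries both the setup and the measured result:
shot.put("\\TOP.DIAGNOSTICS.THOMSON:SETUP", {"rate_hz": 50})
shot.put("\\TOP.DIAGNOSTICS.THOMSON:TE", [1.2, 1.5, 1.4])
print(shot.get("\\top.diagnostics.thomson:te")[1])  # 1.5
```

Keeping configuration and data in one tree is what lets tools replay or audit an acquisition from the stored description alone, which is the design choice the abstract highlights.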

  16. Broadband network on-line data acquisition system with web based interface for control and basic analysis

    NASA Astrophysics Data System (ADS)

    Polkowski, Marcin; Grad, Marek

    2016-04-01

The passive seismic experiment "13BB Star" has been in operation since mid-2013 in northern Poland and consists of 13 broadband seismic stations. One element of this experiment is a dedicated on-line data acquisition system comprising both client-side (station) and server-side modules, with a web-based interface that allows monitoring of network status and provides tools for preliminary data analysis. The station side is controlled by an ARM Linux board programmed to maintain a 3G/EDGE internet connection, receive data from the digitizer, and send the data to the central server along with auxiliary parameters such as temperature, voltage and electric current measurements. The station side is managed by a set of easy-to-install PHP scripts. Data is transmitted securely over the SSH protocol to the central server, a dedicated Linux-based machine that receives and processes all data from all stations, including the auxiliary parameters. The server-side software is written in PHP and Python. It also allows remote station configuration and provides a web-based interface for user-friendly interaction. All collected data can be displayed for each day and station. The system also allows manual creation of event-oriented plots with different filtering options and provides numerous status and statistics displays. Our solution is very flexible and easy to modify. In this presentation we would like to share our solution and experience. The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
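Server-side aggregation of the auxiliary parameters each station uploads (temperatures, voltages, currents) can be sketched as a small parser. The `station;parameter;value` line format here is invented for illustration; it is not the actual "13BB Star" protocol.

```python
def parse_status(lines):
    """Aggregate 'station;parameter;value' lines into a per-station dict."""
    status = {}
    for line in lines:
        station, param, value = line.strip().split(";")
        status.setdefault(station, {})[param] = float(value)
    return status

# Example upload from two hypothetical stations:
report = parse_status([
    "ST01;temperature;21.5",
    "ST01;voltage;12.1",
    "ST02;temperature;19.8",
])
print(report["ST01"]["voltage"])  # 12.1
```

A dashboard like the one described would then render `report` per station and per day, flagging values outside expected ranges.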

  17. Streamlined acquisition handbook

    NASA Technical Reports Server (NTRS)

    1990-01-01

NASA has always placed great emphasis on the acquisition process, recognizing it as among its most important activities. This handbook is intended to facilitate the application of streamlined acquisition procedures. The development of these procedures reflects the efforts of an action group composed of NASA Headquarters and center acquisition professionals. The intent is that this effort accomplish real change in the acquisition process. An important part of streamlining the acquisition process is a commitment by the people involved to accomplishing acquisition activities quickly and with high quality. Too often we continue to accomplish work in 'the same old way' without considering available alternatives which would require no changes to regulations, approvals from Headquarters, or waivers of required practice. Similarly, we must be sensitive to schedule opportunities throughout the acquisition cycle, not just once the purchase request arrives at the procurement office. Techniques that have been identified as ways of reducing acquisition lead time while maintaining high quality in our acquisition process are presented.

  18. Processing Temporal Constraints and Some Implications for the Investigation of Second Language Sentence Processing and Acquisition. Commentary on Baggio

    ERIC Educational Resources Information Center

    Roberts, Leah

    2008-01-01

    Baggio presents the results of an event-related potential (ERP) study in which he examines the processing consequences of reading tense violations such as *"Afgelopen zondag lakt Vincent de kozijnen van zijn landhuis" (*"Last Sunday Vincent paints the window-frames of his country house"). The violation is arguably caused by a mismatch between the…

  19. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    NASA Technical Reports Server (NTRS)

    Li, Y. T.; Wittenberg, L. J.

    1992-01-01

In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the use of multiple miners rather than an increase in their size is recommended. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  20. Age of second language acquisition affects nonverbal conflict processing in children: an fMRI study

    PubMed Central

    Mohades, Seyede Ghazal; Struys, Esli; Van Schuerbeek, Peter; Baeken, Chris; Van De Craen, Piet; Luypaert, Robert

    2014-01-01

Background In their daily communication, bilinguals switch between two languages, a process that involves the selection of a target language and minimization of interference from a nontarget language. Previous studies have uncovered the neural structure in bilinguals and the activation patterns associated with performing verbal conflict tasks. One question that remains, however, is whether this extra verbal switching affects brain function during nonverbal conflict tasks. Methods In this study, we used fMRI to investigate the impact of bilingualism on children performing two nonverbal tasks involving stimulus–stimulus and stimulus–response conflicts. Three groups of 8–11-year-old children – bilinguals from birth (2L1), second language learners (L2L), and a control group of monolinguals (1L1) – were scanned while performing a color Simon and a numerical Stroop task. Reaction times and accuracy were logged. Results Compared to monolingual controls, bilingual children showed a higher behavioral congruency effect on these tasks, matched by the recruitment of brain regions generally used in cognitive control, language processing, or resolving language-conflict situations in bilinguals (caudate nucleus, posterior cingulate gyrus, STG, precuneus). Further, the activation of these areas was found to be higher in 2L1 than in L2L. Conclusion The coupling of longer reaction times with the recruitment of extra language-related brain areas supports the hypothesis that, when dealing with language conflicts, the specialization of bilinguals hampers the way they process nonverbal conflicts, at least at early stages in life. PMID:25328840

  2. An overview of AmeriFlux data products and methods for data acquisition, processing, and publication

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Poindexter, C.; Agarwal, D.; Papale, D.; van Ingen, C.; Torn, M. S.

    2014-12-01

The AmeriFlux network encompasses independently managed field sites measuring ecosystem carbon, water, and energy fluxes across the Americas. In close coordination with ICOS in Europe, a new set of flux data and metadata products is being produced and released at the FLUXNET level, including all AmeriFlux sites. This will enable continued releases of a globally standardized set of flux data products. In this release, new formats, structures, and ancillary information are being proposed and adopted. This presentation discusses these aspects, detailing current and future solutions. One of the major revisions was to the BADM (Biological, Ancillary, and Disturbance Metadata) protocols. The updates include structure and variable changes to address new developments in data collection related to flux towers and to facilitate two-way data sharing. In particular, a new organization of templates is now in place, including changes in templates for biomass, disturbances, instrumentation, soils, and others. New variables and an extensive addition to the vocabularies used to describe BADM templates allow for more flexible and comprehensible coverage of field sites and the data collection methods and results. Another extensive revision is in the data formats, levels, and versions for flux and micrometeorological data. A new selection and revision of data variables and an integrated new definition of data processing levels allow for a more intuitive and flexible notation for the variety of data products. For instance, all variables now include positional information that is tied to BADM instrumentation descriptions. This allows for a better characterization of the spatial representativeness of data points, e.g., individual sensors or the tower footprint. Additionally, a new definition of data levels better characterizes the types of processing and transformations applied to the data across different dimensions (e.g., spatial representativeness of a data point, data quality checks

  3. Digital image processing: a primer for JVIR authors and readers: part 2: digital image acquisition.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-11-01

    This is the second installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first article of the series, we reviewed the fundamentals of digital image architecture. In this article, we describe the ways that an author can import digital images to the computer desktop. We explore the modern imaging network and explain how to import picture archiving and communications systems (PACS) images to the desktop. Options and techniques for producing digital hard copy film are also presented. PMID:14605101

  4. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon.

    PubMed

    Lieberman, Amy M; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I

    2015-07-01

Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1), and in deaf adults who were late-learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sublexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received. PMID:25528091

  5. Data acquisition and processing using noncontact/contact digitizing systems for reverse engineering

    NASA Astrophysics Data System (ADS)

    Motavalli, Saeid; Suharitdamrong, V.

    1994-03-01

    Reverse engineering is the process of creating an engineering design model for existing parts or prototypes. We have developed a reverse engineering system where data is acquired with a scanning system that combines noncontact and contact digitizing methods. The noncontact sensor is a PC-based vision system that views the part from orthogonal orientations and captures the boundary points of the object. The images are then vectorized and a 2D CAD drawing of the part is created. The contact probe is mounted on a CNC machine, which is then guided by the NC code based on the 2D drawings of the part and captures the 3D coordinates of the points inside the boundaries of the object. The 3D coordinates are then used by the surface-modeling module of the system to create a 3D CAD drawing of the part, which is presented in a commercial CAD system. By combining vision sensing with contact probing we achieved speed and accuracy in the data extraction process. This paper describes the elements of the system and the CAD modeling procedure.
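The two-stage digitizing idea in the record above, a vision-derived 2D boundary guiding a contact probe that fills in depth, can be sketched as follows. The rectangular boundary test, the grid step, and the probe function are all stand-ins for the real vectorized boundary and NC-code machinery.

```python
def inside_boundary(x, y, xmin, xmax, ymin, ymax):
    """Stand-in for a point-in-polygon test on the vectorized 2D boundary."""
    return xmin <= x <= xmax and ymin <= y <= ymax

def probe_grid(bounds, step, measure_z):
    """Visit grid points inside the boundary, probing z at each, as the
    NC code generated from the 2D drawing would guide the contact probe."""
    xmin, xmax, ymin, ymax = bounds
    points = []
    x = xmin
    while x <= xmax:
        y = ymin
        while y <= ymax:
            if inside_boundary(x, y, *bounds):
                points.append((x, y, measure_z(x, y)))  # one 3D sample
            y += step
        x += step
    return points

# Fake probe: a sloped surface z = 0.1*x + 0.2*y over a 2x2 boundary.
cloud = probe_grid((0, 2, 0, 2), 1.0, lambda x, y: 0.1 * x + 0.2 * y)
print(len(cloud))  # 9 points on the 3x3 grid
```

The resulting `cloud` is the kind of point set the surface-modeling module would fit to produce the 3D CAD model; the vision step's contribution is confining the slow contact probing to the part's actual footprint.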

  6. Digital-image processing and image analysis of glacier ice

    USGS Publications Warehouse

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document constitute a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended, but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
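
    The grain-statistics extraction the handbook describes ultimately reduces a segmented thin-section image to per-grain measurements. As a minimal illustration of the idea (not the USGS/FoveaPro procedure), a 4-connected flood fill over a toy binary mask yields grain areas:

```python
from collections import deque

def grain_areas(mask):
    """Label 4-connected regions of 1s in a binary grid; return their areas."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                area, q = 0, deque([(r, c)])
                seen[r][c] = True
                while q:  # breadth-first flood fill of one grain
                    y, x = q.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                areas.append(area)
    return sorted(areas)

# Two toy "grains" separated by a background column
mask = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 1],
]
print(grain_areas(mask))  # → [3, 3]
```

    Real grain analysis would add shape and spatial statistics on top of the labeling step, but the labeling itself is the core data reduction.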

  7. Data Acquisition and Prompt Analysis System for High Altitude Balloon Experiments

    NASA Technical Reports Server (NTRS)

    Sarkady, A. A.; Chupp, E. L.; Dickey, J. W.

    1968-01-01

    An inexpensive and simple data acquisition system has been developed for balloon-borne experiments and has been tested with a gamma ray detector in a balloon flight launched from Palestine, Texas. The detector used for the test consisted of an NaI(Tl) scintillation crystal encased in a 1/8 in. plastic scintillator-charged particle shield. The combination was viewed by a single photomultiplier and charged particle gating was accomplished by a conventional phoswich discriminator. The pulse height analysis of the NaI events, not associated with prompt charged particle interactions, is accomplished by converting to a time spectrum using an airborne height-to-time converter. A range of pulse widths from 5 microseconds to 250 microseconds corresponds to energy losses in NaI from 100 to 1000 keV. The time spectrum information, along with charged particle events and barometric pressure, is fed to a mixer which modulates a 252.4 Mc FM transmitter. The original scintillator spectrum is recovered on the ground utilizing conversion circuitry at the receiver video output and a 128 channel commercial pulse height analyzer. The charged particle events of standard time width are stored with the spectrum at a fixed channel position and are therefore represented by a sharp line riding on the lower part of the NaI energy loss spectrum. An energy loss greater than 1000 keV is represented by the maximum pulse width of the converter and stored in the last analyzer channel. Barometric pressure data is transmitted by low frequency modulation of the same FM carrier. In flight operation, the receiver video output can be recorded on a wide band tape recorder and simultaneously analyzed by the 128 channel analyzer, or the telemetered data can be analyzed later. The flight system features high pulse resolution, essentially instantaneous time response, high data rate, and flexibility; and is of modest cost. A detailed description of the system and operating performance is discussed.
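
    The height-to-time conversion maps energy losses of 100–1000 keV onto pulse widths of 5–250 microseconds. A sketch of the inverse mapping applied on the ground, assuming a linear transfer function (the abstract does not state the converter's actual characteristic) and clamping saturated events into the last channel:

```python
def width_to_energy_keV(width_us,
                        w_min=5.0, w_max=250.0,
                        e_min=100.0, e_max=1000.0):
    """Invert an assumed linear height-to-time transfer function.

    Widths at or above w_max are clamped: the converter saturates and
    such events land in the last analyzer channel (> 1000 keV).
    """
    w = min(max(width_us, w_min), w_max)
    return e_min + (w - w_min) * (e_max - e_min) / (w_max - w_min)

print(width_to_energy_keV(5.0))    # → 100.0
print(width_to_energy_keV(250.0))  # → 1000.0
print(width_to_energy_keV(300.0))  # saturated, clamped to 1000.0
```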

  8. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment that identifies strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the basis throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the plan's efficiency. This is mainly attributed to the significant effort involved and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  9. Programme evaluation training for health professionals in francophone Africa: process, competence acquisition and use

    PubMed Central

    Ridde, Valéry; Fournier, Pierre; Banza, Baya; Tourigny, Caroline; Ouédraogo, Dieudonné

    2009-01-01

    Background While evaluation is, in theory, a component of training programmes in health planning, training needs in this area remain significant. Improving health systems necessarily calls for having more professionals who are skilled in evaluation. Thus, the Université de Ouagadougou (Burkina Faso) and the Université de Montréal (Canada) have partnered to establish, in Burkina Faso, a master's-degree programme in population and health with a course in programme evaluation. This article describes the four-week (150-hour) course taken by two cohorts (2005–2006/2006–2007) of health professionals from 11 francophone African countries. We discuss how the course came to be, its content, its teaching processes and the master's programme results for students. Methods The conceptual framework was adapted from Kirkpatrick's (1996) four-level evaluation model: reaction, learning, behaviour, results. Reaction was evaluated based on a standardized questionnaire for all the master's courses and lessons. Learning and behaviour competences were assessed by means of a questionnaire (pretest/post-test, one year after) adapted from the work of Stevahn L, King JA, Ghere G, Minnema J: Establishing Essential Competencies for Program Evaluators. Am J Eval 2005, 26(1):43–59. Master's programme effects were tested by comparing the difference in mean scores between times (before, after, one year after) using pretest/post-test designs. Paired sample tests were used to compare mean scores. Results The teaching is skills-based, interactive and participative. Students of the first cohort gave the evaluation course the highest score (4.4/5) for overall satisfaction among the 16 courses (3.4–4.4) in the master's programme. What they most appreciated was that the forms of evaluation were well adapted to the content and format of the learning activities. By the end of the master's programme, both cohorts of students considered that they had greatly improved their mastery of the 60

  10. Requirements of data acquisition and analysis for condensed matter studies at the weapons neutron research/proton storage ring facility

    SciTech Connect

    Johnson, M.W.; Goldstone, J.A.; Taylor, A.D.

    1982-11-01

    With the completion of the proton storage ring (PSR) in 1985, the subsequent increase in neutron flux, and the continuing improvement in neutron scattering instruments, a significant improvement in data acquisition and data analysis capabilities will be required. A brief account of the neutron source is given together with the associated neutron scattering instruments. Based on current technology and operating instruments, a projection for 1985 to 1990 of the neutron scattering instruments and their main parameters is given. From the expected data rates and the projected instruments, the size of data storage is estimated and the user requirements are developed. General requirements are outlined with specific requirements in user hardware and software stated. A project time scale to complete the data acquisition and analysis system by 1985 is given.

  11. High-throughput analysis of type I-E CRISPR/Cas spacer acquisition in E. coli

    PubMed Central

    Savitskaya, Ekaterina; Semenova, Ekaterina; Dedkov, Vladimir; Metlitskaya, Anastasia; Severinov, Konstantin

    2013-01-01

    In Escherichia coli, the acquisition of new CRISPR spacers is strongly stimulated by a priming interaction between a spacer in CRISPR RNA and a protospacer in foreign DNA. Priming also leads to a pronounced bias in the DNA strand from which new spacers are selected. Here, ca. 200,000 spacers acquired during E. coli type I-E CRISPR/Cas-driven plasmid elimination were analyzed. Analysis of positions of plasmid protospacers from which newly acquired spacers have been derived is inconsistent with spacer acquisition machinery sliding along the target DNA as the primary mechanism responsible for strand bias during primed spacer acquisition. Most protospacers that served as donors of newly acquired spacers during primed spacer acquisition had an AAG protospacer-adjacent motif (PAM). Yet, the introduction of multiple AAG sequences in the target DNA had no effect on the choice of protospacers used for adaptation, which again is inconsistent with the sliding mechanism. Despite a strong preference for an AAG PAM during CRISPR adaptation, the AAG (and CTT) triplets do not appear to be avoided in known E. coli phages. Likewise, PAM sequences are not avoided in Streptococcus thermophilus phages, indicating that CRISPR/Cas systems may not have been a strong factor in shaping host-virus interactions. PMID:23619643

  12. Acquisition and processing of data for isotope-ratio-monitoring mass spectrometry

    NASA Technical Reports Server (NTRS)

    Ricci, M. P.; Merritt, D. A.; Freeman, K. H.; Hayes, J. M.

    1994-01-01

    Methods are described for continuous monitoring of signals required for precise analyses of 13C, 18O, and 15N in gas streams containing varying quantities of CO2 and N2. The quantitative resolution (i.e. maximum performance in the absence of random errors) of these methods is adequate for determination of isotope ratios with an uncertainty of one part in 10^5; the precision actually obtained is often better than one part in 10^4. This report describes data-processing operations, including definition of beginning and ending points of chromatographic peaks and quantitation of background levels, allowance for effects of chromatographic separation of isotopically substituted species, integration of signals related to specific masses, correction for effects of mass discrimination, recognition of drifts in mass spectrometer performance, and calculation of isotopic delta values. Characteristics of a system allowing off-line revision of parameters used in data reduction are described and an algorithm for identification of background levels in complex chromatograms is outlined. Effects of imperfect chromatographic resolution are demonstrated and discussed, and an approach to deconvolution of signals from coeluting substances is described.
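
    The final step, calculation of isotopic delta values, follows the conventional per-mil definition. A minimal sketch (the VPDB 13C/12C ratio below is a commonly quoted constant, used here only for illustration):

```python
def delta_permil(r_sample, r_standard):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# VPDB 13C/12C reference ratio (commonly quoted value; illustrative here)
R_VPDB = 0.0112372

# A sample ratio a few percent below the standard gives a negative delta
print(round(delta_permil(0.0109560, R_VPDB), 1))  # → -25.0
```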

  13. Model-based estimation of breast percent density in raw and processed full-field digital mammography images from image-acquisition physics and patient-image characteristics

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Nathan, Diane L.; Conant, Emily F.; Kontos, Despina

    2012-03-01

    Breast percent density (PD%), as measured mammographically, is one of the strongest known risk factors for breast cancer. While the majority of studies to date have focused on PD% assessment from digitized film mammograms, digital mammography (DM) is becoming increasingly common, and allows for direct PD% assessment at the time of imaging. This work investigates the accuracy of a generalized linear model-based (GLM) estimation of PD% from raw and post-processed digital mammograms, utilizing image acquisition physics, patient characteristics and gray-level intensity features of the specific image. The model is trained in a leave-one-woman-out fashion on a series of 81 cases for which bilateral, mediolateral-oblique DM images were available in both raw and post-processed format. Baseline continuous and categorical density estimates were provided by a trained breast-imaging radiologist. Regression analysis is performed and Pearson's correlation, r, and Cohen's kappa, κ, are computed. The GLM PD% estimation model performed well on both processed (r=0.89, p<0.001) and raw (r=0.75, p<0.001) images. Model agreement with radiologist assigned density categories was also high for processed (κ=0.79, p<0.001) and raw (κ=0.76, p<0.001) images. Model-based prediction of breast PD% could allow for a reproducible estimation of breast density, providing a rapid risk assessment tool for clinical practice.
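
    Agreement statistics of the kind reported here are standard; as an illustration (with invented toy data, not the study's), Cohen's kappa for categorical density agreement can be computed as:

```python
def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two raters beyond chance."""
    assert len(a) == len(b)
    n = len(a)
    cats = set(a) | set(b)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_exp = sum((a.count(c) / n) * (b.count(c) / n)        # chance agreement
                for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# BI-RADS-style density categories from model vs. radiologist (toy data)
model = [1, 2, 2, 3, 4, 3, 2, 1]
rads  = [1, 2, 3, 3, 4, 3, 2, 2]
print(round(cohens_kappa(model, rads), 3))  # → 0.652
```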

  14. The Blanco Cosmology Survey: Data Acquisition, Processing, Calibration, Quality Diagnostics and Data Release

    SciTech Connect

    Desai, S.; Armstrong, R.; Mohr, J.J.; Semler, D.R.; Liu, J.; Bertin, E.; Allam, S.S.; Barkhouse, W.A.; Bazin, G.; Buckley-Geer, E.J.; Cooper, M.C.

    2012-04-01

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, -55°) and (23 hr, -55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out PSF-corrected model fitting photometry for all detected objects. The median 10σ galaxy (point source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6) and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 milliarcsec. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from 2MASS, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematics floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7% and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1+z) = 0.054 with an outlier fraction η < 5% to z ~ 1. We highlight some selected science results to date and provide a full description of the released data products.

  15. THE BLANCO COSMOLOGY SURVEY: DATA ACQUISITION, PROCESSING, CALIBRATION, QUALITY DIAGNOSTICS, AND DATA RELEASE

    SciTech Connect

    Desai, S.; Mohr, J. J.; Semler, D. R.; Liu, J.; Bazin, G.; Zenteno, A.; Armstrong, R.; Bertin, E.; Allam, S. S.; Buckley-Geer, E. J.; Lin, H.; Tucker, D.; Barkhouse, W. A.; Cooper, M. C.; Hansen, S. M.; High, F. W.; Lin, Y.-T.; Ngeow, C.-C.; Rest, A.; Song, J.

    2012-09-20

    The Blanco Cosmology Survey (BCS) is a 60 night imaging survey of ~80 deg² of the southern sky located in two fields: (α, δ) = (5 hr, -55°) and (23 hr, -55°). The survey was carried out between 2005 and 2008 in griz bands with the Mosaic2 imager on the Blanco 4 m telescope. The primary aim of the BCS survey is to provide the data required to optically confirm and measure photometric redshifts for Sunyaev-Zel'dovich effect selected galaxy clusters from the South Pole Telescope and the Atacama Cosmology Telescope. We process and calibrate the BCS data, carrying out point-spread function-corrected model-fitting photometry for all detected objects. The median 10σ galaxy (point-source) depths over the survey in griz are approximately 23.3 (23.9), 23.4 (24.0), 23.0 (23.6), and 21.3 (22.1), respectively. The astrometric accuracy relative to the USNO-B survey is ~45 mas. We calibrate our absolute photometry using the stellar locus in grizJ bands, and thus our absolute photometric scale derives from the Two Micron All Sky Survey, which has ~2% accuracy. The scatter of stars about the stellar locus indicates a systematic floor in the relative stellar photometric scatter in griz that is ~1.9%, ~2.2%, ~2.7%, and ~2.7%, respectively. A simple cut in the AstrOmatic star-galaxy classifier spread_model produces a star sample with good spatial uniformity. We use the resulting photometric catalogs to calibrate photometric redshifts for the survey and demonstrate scatter δz/(1 + z) = 0.054 with an outlier fraction η < 5% to z ~ 1. We highlight some selected science results to date and provide a full description of the released data products.
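
    The photometric-redshift quality metrics quoted in both BCS records (scatter in δz/(1+z) and outlier fraction η) can be sketched as follows; the toy catalog values and the 0.15 outlier threshold are illustrative assumptions, not taken from the survey:

```python
def photoz_metrics(z_phot, z_spec, outlier_cut=0.15):
    """RMS of dz/(1+z) and the fraction of |dz|/(1+z) outliers."""
    resid = [(zp - zs) / (1 + zs) for zp, zs in zip(z_phot, z_spec)]
    rms = (sum(r * r for r in resid) / len(resid)) ** 0.5
    eta = sum(abs(r) > outlier_cut for r in resid) / len(resid)
    return rms, eta

# Toy catalog: one catastrophic outlier at z_spec = 1.0
zs = [0.2, 0.4, 0.6, 0.8, 1.0]
zp = [0.23, 0.38, 0.65, 0.79, 1.40]
rms, eta = photoz_metrics(zp, zs)
print(round(rms, 3), eta)  # → 0.091 0.2
```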

  16. The Independent Technical Analysis Process

    SciTech Connect

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  17. An architecture for real time data acquisition and online signal processing for high throughput tandem mass spectrometry

    SciTech Connect

    Shah, Anuj R.; Jaitly, Navdeep; Zuljevic, Nino; Monroe, Matthew E.; Liyu, Andrei V.; Polpitiya, Ashoka D.; Adkins, Joshua N.; Belov, Mikhail E.; Anderson, Gordon A.; Smith, Richard D.; Gorton, Ian

    2010-12-09

    Independent, greedy collection of data events using simple heuristics results in massive over-sampling of the prominent data features in large-scale studies over what should be achievable through “intelligent,” online acquisition of such data. As a result, data generated are more aptly described as a collection of a large number of small experiments rather than a true large-scale experiment. Nevertheless, achieving “intelligent,” online control requires tight interplay between state-of-the-art, data-intensive computing infrastructure developments and analytical algorithms. In this paper, we propose a Software Architecture for Mass spectrometry-based Proteomics coupled with Liquid chromatography Experiments (SAMPLE) to develop an “intelligent” online control and analysis system to significantly enhance the information content from each sensor (in this case, a mass spectrometer). Using online analysis of data events as they are collected and decision theory to optimize the collection of events during an experiment, we aim to maximize the information content generated during an experiment by the use of pre-existing knowledge to optimize the dynamic collection of events.

  18. Performance analysis for the expanding search PN acquisition algorithm. [pseudonoise in spread spectrum transmission

    NASA Technical Reports Server (NTRS)

    Braun, W. R.

    1982-01-01

    An approach is described for approximating the cumulative probability distribution of the acquisition time of the serial pseudonoise (PN) search algorithm. The results are applicable to both variable and fixed dwell time systems. The theory is developed for the case where some a priori information is available on the PN code epoch (reacquisition problem or acquisition of very long codes). Also considered is the special case of a search over the whole code. The accuracy of the approximation is demonstrated by comparisons with published exact results for the fixed dwell time algorithm.
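
    The paper's analytic approximation can be cross-checked numerically; a minimal Monte Carlo of single-dwell serial PN search (all parameters are illustrative, and the false-alarm penalty model is an assumption) estimates the acquisition-time distribution:

```python
import random

def acquisition_time(q, correct_cell, pd, pfa, tau=1.0, penalty=10.0,
                     rng=random):
    """Dwell time until the serial search first detects the correct cell.

    q cells are tested cyclically for tau each; a false alarm on a wrong
    cell costs an extra verification penalty; a miss on the correct cell
    lets the sweep continue.
    """
    t, cell = 0.0, 0
    while True:
        t += tau
        if cell == correct_cell:
            if rng.random() < pd:
                return t
        elif rng.random() < pfa:
            t += penalty * tau
        cell = (cell + 1) % q

# Empirical CDF from repeated trials with a fixed seed
rng = random.Random(42)
times = [acquisition_time(64, 20, pd=0.8, pfa=0.01, rng=rng)
         for _ in range(2000)]
times.sort()
median = times[len(times) // 2]
print(median >= 21.0)  # must at least sweep to the correct cell once → True
```

    Sorting the sampled times gives the empirical cumulative distribution that the paper's approximation can be compared against.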

  19. Facilitating the Acquisition of Sensorimotor Behavior with a Microcomputer-Mediated Teaching System: An Experimental Analysis.

    ERIC Educational Resources Information Center

    Horn, Eva M.; Warren, Steven F.

    1987-01-01

    The effects of a combined neuromotor/behavioral approach using a microcomputer-mediated teaching system on the acquisition of basic motor skills by two young children (17 and 24 months old) with severe and multiple disabilities were examined. Increased frequency and duration of target behaviors in both training and generalization settings were…

  20. Border Crossings? Exploring the Intersection of Second Language Acquisition, Conversation Analysis, and Foreign Language Pedagogy

    ERIC Educational Resources Information Center

    Mori, Junko

    2007-01-01

    This article explores recent changes in the landscape of second language acquisition (SLA) and foreign language pedagogical (FLP) research. Firth and Wagner's (1997) proposal for the reconceptualization of SLA has been supported by SLA and FLP researchers who share the sentiment concerning the need for increased attention to social and contextual…

  1. Morphological Awareness in Literacy Acquisition of Chinese Second Graders: A Path Analysis

    ERIC Educational Resources Information Center

    Zhang, Haomin

    2016-01-01

    The present study tested a path diagram regarding the contribution of morphological awareness (MA) to early literacy acquisition among Chinese-speaking second graders (N = 123). Three facets of MA were addressed, namely derivational awareness, compound awareness and compound structure awareness. The model aimed to test a theory of causal order…

  2. LIFTING BEAM DESIGN/ANALYSIS FOR THE DATA ACQUISITION AND CONTROL SYSTEM TRAILER

    SciTech Connect

    MACKEY TC; BENEGAS TR

    1993-03-15

    This supporting document details calculations completed to properly design an adjustable lifting beam. The main use of the lifting beam is to hoist the Data Acquisition and Controls Systems (DACS) trailer over a steam line. All design work was completed using the American Institute of Steel Construction, Manual of Steel Construction (AISC, 1989) and Hanford Hoisting and Rigging Manual (WHC, 1992).

  3. The Effects of Hypertext Glosses on L2 Vocabulary Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Yun, Jee Hwan

    2010-01-01

    In the field of second language acquisition (SLA), "comprehensible input" (Krashen, 1985) has been considered a critical factor to help learners acquire foreign and second languages (L2). From this perspective, the notion of extensive or free voluntary reading (Day & Bamford, 1998; Krashen, 1993) has emerged that L2 learners should be given more…

  4. Some Thoughts on the Contrastive Analysis of Features in Second Language Acquisition

    ERIC Educational Resources Information Center

    Lardiere, Donna

    2009-01-01

    In this article I discuss the selection and assembly of formal features in second language acquisition. Assembling the particular lexical items of a second language (L2) requires that the learner reconfigure features from the way these are represented in the first language (L1) into new formal configurations on possibly quite different types of…

  5. Analysis of the Effect a Student-Centred Mobile Learning Instructional Method Has on Language Acquisition

    ERIC Educational Resources Information Center

    Oberg, Andrew; Daniels, Paul

    2013-01-01

    In this study a self-paced instructional method based on the use of Apple's iPod Touch personal mobile devices to deliver content was compared with a group-oriented instructional method of content delivery in terms of learner acquisition of course material. One hundred and twenty-two first-year Japanese university students in four classes were…

  6. Improving Data Analysis in Second Language Acquisition by Utilizing Modern Developments in Applied Statistics

    ERIC Educational Resources Information Center

    Larson-Hall, Jenifer; Herrington, Richard

    2010-01-01

    In this article we introduce language acquisition researchers to two broad areas of applied statistics that can improve the way data are analyzed. First we argue that visual summaries of information are as vital as numerical ones, and suggest ways to improve them. Specifically, we recommend choosing boxplots over barplots and adding locally…

  7. Coring Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock cores. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample in the transfer process.

  8. Since When or How Often? Dissociating the Roles of Age of Acquisition (AoA) and Lexical Frequency in Early Visual Word Processing

    ERIC Educational Resources Information Center

    Adorni, Roberta; Manfredi, Mirella; Proverbio, Alice Mado

    2013-01-01

    The aim of the study was to investigate the effect of both word age of acquisition (AoA) and frequency of occurrence on the timing and topographical distribution of ERP components. The processing of early- versus late-acquired words was compared with that of high-frequency versus low-frequency words. Participants were asked to perform an…

  9. NMDA Receptor-Dependent Processes in the Medial Prefrontal Cortex Are Important for Acquisition and the Early Stage of Consolidation during Trace, but Not Delay Eyeblink Conditioning

    ERIC Educational Resources Information Center

    Takehara-Nishiuchi, Kaori; Kawahara, Shigenori; Kirino, Yutaka

    2005-01-01

    Permanent lesions in the medial prefrontal cortex (mPFC) affect acquisition of conditioned responses (CRs) during trace eyeblink conditioning and retention of remotely acquired CRs. To clarify further roles of the mPFC in this type of learning, we investigated the participation of the mPFC in mnemonic processes both during and after daily…

  10. Acquisition and integration of low vision assistive devices: understanding the decision-making process of older adults with low vision.

    PubMed

    Copolillo, Al; Teitelman, Jodi L

    2005-01-01

    The purpose of this study was to describe how older adults with low vision make decisions to use low vision assistive devices (LVADs). Analysis of participants' narratives, from both group and individual interviews, revealed three topic areas affecting device use. Two are discussed in this paper: Experiences and Characteristics Leading to Successful LVAD Use Decision Making and Challenges to Successful LVAD Use Decision Making. The third, Adjustment to Low Vision Disability, is briefly discussed. Of particular importance to occupational therapy practitioners in the growing field of low vision rehabilitation was the value placed on low vision rehabilitation services to assist with acquiring devices and integrating them into daily routines. Occupational therapy services were highly regarded. Participants demonstrated the importance of becoming a part of a supportive network of people with low vision to gain access to information about resources. They emphasized the need for systems and policy changes to reduce barriers to making informed decisions about LVAD use. Results indicate that occupational therapists working in low vision can support clients by facilitating development of a support network, acting as liaisons between clients and other health practitioners, especially ophthalmologists, and encouraging policy development that supports barrier-free LVAD acquisition and use. These topics should be incorporated into continuing and entry-level education to prepare practitioners for leadership in the field of low vision rehabilitation. PMID:15969278

  11. Floodnet: a telenetwork for acquisition, processing and dissemination of earth observation data for monitoring and emergency management of floods

    NASA Astrophysics Data System (ADS)

    Blyth, Ken

    1997-08-01

    The aim of FLOODNET is to provide a communications and data distribution facility specifically designed to meet the demanding temporal requirements of flood monitoring within the European Union (EU). Currently, remotely sensed data are not fully utilized for flood applications because potential users are not familiar with the procedure for acquiring the data and do not have a defined route for obtaining help in processing and interpreting the data. FLOODNET will identify the potential user groups within the EU and will, by demonstration, education and the use of telematics, increase the awareness of users to the capabilities of earth observation (EO) and the means by which they can acquire EO data. FLOODNET will act as a filter between users and satellite operation planners to help assign priorities for data acquisition against previously agreed criteria. The network will encourage a user community and will facilitate cross-sector information transfer, particularly between flood experts and administrative decision makers. The requirement for two levels of flood mapping is identified: (1) a rapid, broad-brush approach to assess the general flood situation and identify areas at greatest risk and in need of immediate assistance; (2) a detailed mapping approach, less critical in time, suitable for input to hydrological models or for flood risk evaluation. A likely networking technology is outlined, the basic functionality of a FLOODNET demonstrator is described and some of the economic benefits of the network are identified.

  12. Integrated Processing of High Resolution Topographic Data for Soil Erosion Assessment Considering Data Acquisition Schemes and Surface Properties

    NASA Astrophysics Data System (ADS)

    Eltner, A.; Schneider, D.; Maas, H.-G.

    2016-06-01

    Soil erosion is a decisive earth surface process strongly influencing the fertility of arable land. Several options exist to detect soil erosion at the scale of large field plots (here 600 m²), which comprise different advantages and disadvantages depending on the applied method. In this study, the benefits of unmanned aerial vehicle (UAV) photogrammetry and terrestrial laser scanning (TLS) are exploited to quantify soil surface changes. Before data combination, TLS data is co-registered to the DEMs generated with UAV photogrammetry. TLS data is used to detect global as well as local errors in the DEMs calculated from UAV images. Additionally, TLS data is considered for vegetation filtering. Complementarily, DEMs from UAV photogrammetry are utilised to detect systematic TLS errors and to further filter TLS point clouds in regard to unfavourable scan geometry (i.e. incidence angle and footprint) on gentle hillslopes. In addition, surface roughness is integrated as an important parameter to evaluate TLS point reliability because of the increasing footprints and thus area of signal reflection with increasing distance to the scanning device. The developed fusion tool allows for the estimation of reliable data points from each data source, considering the data acquisition geometry and surface properties, to finally merge both data sets into a single soil surface model. Data fusion is performed for three different field campaigns at a Mediterranean field plot. Successive DEM evaluation reveals a continuous decrease of soil surface roughness, the reappearance of former wheel tracks, and local soil particle relocation patterns.
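
    The footprint-based reliability screening described above depends on range, beam divergence, and incidence angle. A hedged sketch of the standard elliptical-footprint approximation (the divergence, range, and threshold values are illustrative, not from the study):

```python
import math

def footprint_major_axis(range_m, beam_divergence_rad, incidence_deg):
    """Approximate major axis of the elliptical laser footprint on a
    tilted surface: diameter at range, stretched by 1/cos(incidence)."""
    d = range_m * beam_divergence_rad  # footprint on a surface hit head-on
    return d / math.cos(math.radians(incidence_deg))

# 0.3 mrad divergence at 50 m range: ~15 mm footprint head-on,
# doubling at a 60 degree incidence angle
print(round(footprint_major_axis(50.0, 0.0003, 0.0) * 1000, 1))   # → 15.0
print(round(footprint_major_axis(50.0, 0.0003, 60.0) * 1000, 1))  # → 30.0
```

    Points whose footprint exceeds a chosen threshold on gentle hillslopes would then be down-weighted or filtered before fusion.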

  13. Variation in the application of natural processes: language-dependent constraints in the phonological acquisition of bilingual children.

    PubMed

    Faingold, E D

    1996-09-01

    This paper studies phonological processes and constraints on early phonological and lexical development, as well as the strategies employed by a young Spanish-, Portuguese-, and Hebrew-speaking child, Nurit (the author's niece), in the construction of her early lexicon. Nurit's linguistic development is compared to that of another Spanish-, Portuguese-, and Hebrew-speaking child, Noam (the author's son). Noam's and Nurit's linguistic development is contrasted with that of Berman's (1977) English- and Hebrew-speaking daughter (Shelli). The simultaneous acquisition of similar (closely related) languages such as Spanish and Portuguese versus that of nonrelated languages such as English and Hebrew yields different results: children acquiring similar languages seem to prefer maintenance as a strategy for the construction of their early lexicon, while children exposed to nonrelated languages appear to prefer reduction to a large extent (Faingold, 1990). The Spanish- and Portuguese-speaking children's high accuracy stems from a wider choice of target words, where the diachronic development of two closely related languages provides a simplified model lexicon to the child. PMID:8865623

  14. Real-time acquisition and data analysis of skeletal muscle contraction in a multi-user environment.

    PubMed

    Lieber, R L; Smith, D E; Campbell, R C; Hargens, A R

    1986-06-01

    A data acquisition system is described which acquires data from contracting skeletal muscle. The system is designed to run in a multi-user environment while acquiring contractile data in real-time. Time dedicated solely to laboratory experiments is thus eliminated. A menu-driver is included to allow users to enter experimental commands with or without command arguments. Error monitoring functions prevent operator errors from causing data loss. Data storage in both ASCII and binary formats maximizes file flexibility, readability and accessibility. Finally, an on-line tutorial and help facility is provided for user training. The system developed is applicable to any experimental environment involving data acquisition, storage and analysis. PMID:3637122

  15. Analysis of factors determining enterprise value of company merger and acquisition: A case study of coal in Kalimantan, Indonesia

    NASA Astrophysics Data System (ADS)

    Candra, Ade; Pasasa, Linus A.; Simatupang, Parhimpunan

    2015-09-01

    The main purpose of this paper is to examine the relationship between technical, financial and legal factors and enterprise value in mergers and acquisitions of coal companies in Kalimantan, Indonesia over the last 10 years. Data were obtained from secondary sources within the companies involved and from data published on the internet. In total, 46 secondary data records were obtained, with the parameters resources, reserves, stripping ratio, calorific value, distance from pit to port, distance from port to vessel, production per annum, cost from pit to port, cost from port to vessel, royalties, coal price and permit status. The data were analysed using structural equation modeling (SEM) to determine the factors that most significantly influence the enterprise value of a coal company in Kalimantan. The results show that technical matters are the factor that most affects enterprise value in a coal merger and acquisition; the financial aspect is the second most influential factor.

  16. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions

    PubMed Central

    Shenoy, Shailesh M.

    2016-01-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software’s support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity. PMID:27516727

  17. The effect of signal acquisition and processing choices on ApEn values: towards a "gold standard" for distinguishing effort levels from isometric force records.

    PubMed

    Forrest, Sarah M; Challis, John H; Winter, Samantha L

    2014-06-01

    Approximate entropy (ApEn) is frequently used to identify changes in the complexity of isometric force records with ageing and disease. Different signal acquisition and processing parameters have been used, making comparison or confirmation of results difficult. This study determined the effect of sampling and parameter choices by examining changes in ApEn values across a range of submaximal isometric contractions of the first dorsal interosseus. Reducing the sample rate by decimation changed both the value and pattern of ApEn values dramatically. The pattern of ApEn values across the range of effort levels was not sensitive to the filter cut-off frequency, or to the criterion used to extract the section of data for analysis. The complexity increased with increasing effort level using a fixed 'r' value (which accounts for measurement noise) but decreased with increasing effort level when 'r' was set to 0.1 of the standard deviation of the force. It is recommended that isometric force records be sampled at frequencies >200 Hz, that the template length ('m') be set to 2, and that 'r' be set to the measurement system noise or to 0.1 SD, depending on the physiological process to be distinguished. It is demonstrated that changes in ApEn across effort levels are related to changes in force gradation strategy. PMID:24725708
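
    The parameters discussed above (template length 'm' = 2, tolerance 'r' as a fraction of the SD) can be made concrete with a minimal ApEn implementation; this is an illustrative sketch of the standard Pincus definition, not the authors' code.

```python
import numpy as np

def apen(x, m=2, r=None):
    """Approximate entropy of a 1-D record.

    m is the template length; r is the tolerance, defaulting to
    0.1 * SD of the signal (one of the two conventions compared
    in the study)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.1 * np.std(x)

    def phi(mm):
        # Overlapping templates of length mm.
        tpl = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(tpl[:, None, :] - tpl[None, :, :]), axis=2)
        # Fraction of templates within tolerance (self-matches included,
        # so the argument of the log is never zero).
        return np.mean(np.log(np.mean(dist <= r, axis=1)))

    return phi(m) - phi(m + 1)
```

    A regular signal keeps its template matches when 'm' grows, so its ApEn is low; shuffling the same samples destroys that persistence and raises ApEn, which is why the choice of 'r' (fixed vs. 0.1 SD) changes the direction of the trend across effort levels.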

  18. Data acquisition and analysis in the DOE/NASA Wind Energy Program

    NASA Technical Reports Server (NTRS)

    Neustadter, H. E.

    1980-01-01

    Four categories of data systems, each responding to a distinct information need are presented. The categories are: control, technology, engineering and performance. The focus is on the technology data system which consists of the following elements: sensors which measure critical parameters such as wind speed and direction, output power, blade loads and strains, and tower vibrations; remote multiplexing units (RMU) mounted on each wind turbine which frequency modulate, multiplex and transmit sensor outputs; the instrumentation available to record, process and display these signals; and centralized computer analysis of data. The RMU characteristics and multiplexing techniques are presented. Data processing is illustrated by following a typical signal through instruments such as the analog tape recorder, analog to digital converter, data compressor, digital tape recorder, video (CRT) display, and strip chart recorder.

  19. Meta-analysis using Dirichlet process.

    PubMed

    Muthukumarana, Saman; Tiwari, Ram C

    2016-02-01

    This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity, or variation in the underlying effects across studies, while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study effect parameters have support on a discrete space and enable borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset from the Program for International Student Assessment covering 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. PMID:22802045
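
    The discreteness-and-clustering property described above can be illustrated with a truncated stick-breaking draw from a Dirichlet process. This is a generic textbook construction (Sethuraman's representation), offered as a sketch of the model's behaviour rather than the authors' actual sampler; the function name and arguments are invented for illustration.

```python
import numpy as np

def stick_breaking_sample(alpha, base_draw, n_studies, n_atoms=200, rng=None):
    """Draw study effects from G ~ DP(alpha, G0) via a truncated
    stick-breaking construction. base_draw(k) must return k samples
    from the base measure G0."""
    if rng is None:
        rng = np.random.default_rng()
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Stick-breaking weights: w_k = beta_k * prod_{j<k} (1 - beta_j).
    weights = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights /= weights.sum()                 # renormalise the truncation
    atoms = np.asarray(base_draw(n_atoms))   # atom locations from G0
    idx = rng.choice(n_atoms, size=n_studies, p=weights)
    return atoms[idx]                        # discrete support -> ties = clusters
```

    Because G is almost surely discrete, several studies draw the same atom: the ties among the returned effects are precisely the clustering across studies that the abstract describes.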

  20. Data Processing and Analysis Systems for JT-60U

    SciTech Connect

    Matsuda, T.; Totsuka, T.; Tsugita, T.; Oshima, T.; Sakata, S.; Sato, M.; Iwasaki, K.

    2002-09-15

    The JT-60U data processing system is a large computer complex gradually modernized by utilizing progressive computer and network technology. A main computer using state-of-the-art CMOS technology can handle ~550 MB of data per discharge. A gigabit ethernet switch with FDDI ports has been introduced to cope with the increase in the amount of data handled. Workstation systems with VMEbus serial highway drivers for CAMAC have been developed and used to replace many minicomputer systems. VMEbus-based fast data acquisition systems have also been developed to enlarge and replace a minicomputer system for mass data. The JT-60U data analysis system is composed of a JT-60U database server and a JT-60U analysis server, which are distributed UNIX servers. The experimental database is stored on the 1 TB RAID disk of the JT-60U database server and is composed of ZENKEI and diagnostic databases. Various data analysis tools are available on the JT-60U analysis server. For remote collaboration, technical features of the data analysis system have been applied to the computer system to access JT-60U data via the Internet. Remote participation in JT-60U experiments has been successfully conducted since 1996.

  1. CAPTAN: A hardware architecture for integrated data acquisition, control, and analysis for detector development

    SciTech Connect

    Turqueti, Marcos; Rivera, Ryan A.; Prosser, Alan; Andresen, Jeffry; Chramowicz, John; /Fermilab

    2008-11-01

    The Electronic Systems Engineering Department of the Computing Division at the Fermi National Accelerator Laboratory has developed a data acquisition system flexible and powerful enough to meet the needs of a variety of high energy physics applications. The system described in this paper is called CAPTAN (Compact And Programmable daTa Acquisition Node) and its architecture and capabilities are presented in detail here. The three most important characteristics of this system are flexibility, versatility and scalability. These three main features are supported by key architectural features; a vertical bus that permits the user to stack multiple boards, a gigabit Ethernet link that permits high speed communications to the system and the core group of boards that provide specific capabilities for the system. In this paper, we describe the system architecture, give an overview of its capabilities and point out possible applications.

  2. Enhanced Data-Acquisition System

    NASA Technical Reports Server (NTRS)

    Mustain, Roy W.

    1990-01-01

    Time-consuming, costly digitization of analog signals on magnetic tape eliminated. Proposed data-acquisition system provides nearly immediate access to data in incoming signals by digitizing and recording them both on magnetic tape and on optical disk. Tape and/or disk later played back to reconstruct signals in analog or digital form for analysis. Of interest in industrial and scientific applications in which necessary to digitize, store, and/or process large quantities of experimental data.

  3. Further Thoughts on Parameters and Features in Second Language Acquisition: A Reply to Peer Comments on Lardiere's "Some Thoughts on the Contrastive Analysis of Features in Second Language Acquisition" in SLR 25(2)

    ERIC Educational Resources Information Center

    Lardiere, Donna

    2009-01-01

    In this article, Lardiere responds to peer comments regarding her earlier article "Some Thoughts on the Contrastive Analysis of Features in Second Language Acquisition" (EJ831786). Lardiere acknowledges the reviewers' thoughtful contributions and expert expansion on various facets of the original article. While she states that it is clear from the…

  4. Artificial intelligence applied to process signal analysis

    NASA Technical Reports Server (NTRS)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.
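
    The knowledge-base alarm filtering described above can be sketched with a toy rule base. The alarm names and the cause map are invented for illustration; a real knowledge-base approach would encode far richer process relationships.

```python
def filter_alarms(active, causes):
    """Suppress 'consequence' alarms whose known cause is also active,
    leaving the operator with root-cause alarms only.

    active: set of currently active alarm names.
    causes: maps each alarm to the set of alarms that can explain it."""
    return {a for a in active
            if not (causes.get(a, set()) & active)}
```

    For example, with `causes = {'low_flow': {'pump_trip'}, 'low_pressure': {'low_flow'}}`, an alarm flood of all three events is filtered down to just `pump_trip`, giving the operator a better description of the process state.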

  5. Empirical Analysis of Effects of Bank Mergers and Acquisitions on Small Business Lending in Nigeria

    NASA Astrophysics Data System (ADS)

    Ita, Asuquo Akabom

    2012-11-01

    Mergers and acquisitions are the major instruments of the recent banking reforms in Nigeria. The effects and implications of the reforms on the lending practices of merged banks to small businesses were considered in this study. These effects were divided into static and dynamic effects (restructuring, direct and external). Data were collected by a cross-sectional research design and subsequently analyzed by the ordinary least squares (OLS) method. The analyses show that bank size, financial characteristics and deposits of non-merged banks are positively related to small business lending, while for the merged banks the reverse is the case. From the above result, it is evident that mergers and acquisitions have not only a static effect on small business lending but also a dynamic effect. Therefore, given the central position of small businesses in the current government policy on industrialization in Nigeria, policy makers should consider both the static and dynamic effects of mergers and acquisitions on small business lending in their policy thrust.

  6. Analysis of time series from stochastic processes

    PubMed

    Gradisek; Siegert; Friedrich; Grabec

    2000-09-01

    Analysis of time series from stochastic processes governed by a Langevin equation is discussed. Several applications for the analysis are proposed based on estimates of drift and diffusion coefficients of the Fokker-Planck equation. The coefficients are estimated directly from a time series. The applications are illustrated by examples employing various synthetic time series and experimental time series from metal cutting. PMID:11088809
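
    The core idea above, estimating drift and diffusion coefficients of the Fokker-Planck equation directly from a time series, can be sketched via conditional moments of the increments. This is a minimal generic implementation of the technique, not the authors' code; bin counts and names are assumptions.

```python
import numpy as np

def estimate_drift_diffusion(x, dt, bins=20, min_count=50):
    """Estimate the drift D1(x) and diffusion D2(x) of a Langevin
    process from a sampled trajectory, via the conditional moments
    D1(x) ~ <dx | x>/dt and D2(x) ~ <dx^2 | x>/(2 dt)."""
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)
    origin = x[:-1]                       # state each increment starts from
    edges = np.linspace(origin.min(), origin.max(), bins + 1)
    idx = np.clip(np.digitize(origin, edges) - 1, 0, bins - 1)
    d1 = np.full(bins, np.nan)
    d2 = np.full(bins, np.nan)
    for b in range(bins):
        sel = idx == b
        if sel.sum() >= min_count:        # skip sparsely visited bins
            d1[b] = dx[sel].mean() / dt
            d2[b] = (dx[sel] ** 2).mean() / (2.0 * dt)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, d1, d2
```

    Applied to a simulated Ornstein-Uhlenbeck process dx = -x dt + sqrt(2) dW, the estimated drift recovers the linear restoring force and the diffusion estimate is approximately constant at 1.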

  7. Encapsulation Processing and Manufacturing Yield Analysis

    NASA Technical Reports Server (NTRS)

    Willis, P. B.

    1984-01-01

    The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.

  8. Summary of process research analysis efforts

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1985-01-01

    A summary of solar-cell process research analysis efforts is presented. Process design and cell design are interactive efforts in which technology from integrated circuit processes and other processes is blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cell's performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used to improve product uniformity and reduce costs. Two factors that can lead to longer-life modules are the use of solar cell diffusion barriers and improved encapsulation.

  9. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    NASA Astrophysics Data System (ADS)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  10. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally acquisitions initiatives of the NASA STI program were presented.

  11. Analysis of physical processes via imaging vectors

    NASA Astrophysics Data System (ADS)

    Volovodenko, V.; Efremova, N.; Efremov, V.

    2016-06-01

    Practically all modeled processes are in one way or another random. The foremost theoretical foundation embraces Markov processes, which can be represented in different forms. A Markov process is a random process that undergoes transitions from one state to another on a state space, where the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. In a Markov process, the model of the future therefore does not change as additional information about preceding times accumulates. Modeling physical fields generally involves processes that change in time, i.e. non-stationary processes. In this case, applying the Laplace transformation introduces unjustified complications into the description, whereas a transition to other representations yields an explicit simplification. The method of imaging vectors provides constructive mathematical models and the necessary transitions in the modeling process and the analysis itself. The flexibility of a model built on a polynomial basis allows rapid transformation of the mathematical model and accelerates further analysis. It should be noted that the mathematical description permits an operator representation; conversely, operator representation of the structures, algorithms and data processing procedures significantly improves the flexibility of the modeling process.
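
    The Markov property stated above, that the next-state distribution depends only on the current state, can be demonstrated with a minimal finite-state simulation (a generic illustration, unrelated to the imaging-vector method itself).

```python
import numpy as np

def simulate_markov_chain(P, start, steps, rng=None):
    """Simulate a finite-state Markov chain: each transition is drawn
    from the row of P for the *current* state only, so the history
    before the current state never enters the simulation."""
    if rng is None:
        rng = np.random.default_rng()
    P = np.asarray(P, dtype=float)
    states = [start]
    for _ in range(steps):
        states.append(int(rng.choice(P.shape[0], p=P[states[-1]])))
    return states
```

    For the two-state transition matrix P = [[0.9, 0.1], [0.5, 0.5]], long-run occupancies converge to the stationary distribution (5/6, 1/6), regardless of the starting state.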

  12. CPAS Preflight Drop Test Analysis Process

    NASA Technical Reports Server (NTRS)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  13. Planar Inlet Design and Analysis Process (PINDAP)

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  14. A real-time digital control, data acquisition and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    NASA Astrophysics Data System (ADS)

    Greenfield, C. M.; Campbell, G. L.; Carlstrom, T. N.; Deboo, J. C.; Hsieh, C.-L.; Snider, R. T.; Trost, P. K.

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to 8 Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a 'burst mode', where all available (fully charged) lasers can be fired at 50 to 100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (Fast Encoding and Readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 msec following each laser pulse. The VME-based computer system consists of 2 or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet.

  15. A real-time digital control, data acquisition and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    SciTech Connect

    Greenfield, C.M.; Campbell, G.L.; Carlstrom, T.N.; DeBoo, J.C.; Hsieh, C.-L.; Snider, R.T.; Trost, P.K.

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to 8 Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a "burst mode", where all available (fully charged) lasers can be fired at 50-100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (Fast Encoding and Readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 msec following each laser pulse. The VME-based computer system consists of 2 or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet. 17 refs., 1 fig.

  16. Real-time digital control, data acquisition, and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    NASA Astrophysics Data System (ADS)

    Greenfield, C. M.; Campbell, G. L.; Carlstrom, T. N.; DeBoo, J. C.; Hsieh, C.-L.; Snider, R. T.; Trost, P. K.

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition, and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to eight Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a "burst mode," where all available (fully charged) lasers can be fired at 50-100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (fast encoding and readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 ms following each laser pulse. The VME-based computer system consists of two or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet.

  17. Real-time digital control, data acquisition, and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    SciTech Connect

    Greenfield, C.M.; Campbell, G.L.; Carlstrom, T.N.; DeBoo, J.C.; Hsieh, C.; Snider, R.T.; Trost, P.K. )

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition, and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to eight Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a "burst mode," where all available (fully charged) lasers can be fired at 50-100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (fast encoding and readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 ms following each laser pulse. The VME-based computer system consists of two or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet.
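
    The evenly staggered firing scheme described in these records can be sketched numerically: eight lasers at 20 Hz each, offset by one eighth of a period, give a combined 160 Hz measurement rate. The function below is an illustrative timing calculation only, not the diagnostic's control software.

```python
def stagger_schedule(n_lasers=8, rate_hz=20.0, duration_s=0.25):
    """Firing times for n_lasers, each at rate_hz, evenly staggered so
    the combined pulse rate is n_lasers * rate_hz (160 Hz here).
    Returns (time_s, laser_index) pairs sorted by time."""
    period = 1.0 / rate_hz
    offset = period / n_lasers            # spacing between adjacent lasers
    events = []
    t = 0.0
    while t < duration_s:
        for i in range(n_lasers):
            ft = round(t + i * offset, 9)
            if ft < duration_s:
                events.append((ft, i))
        t += period
    return sorted(events)
```

    Every consecutive pair of pulses is then 6.25 ms apart, which is the interval at which partial temperature and density profiles become available in the staggered (non-burst) mode.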

  18. Transition of NOAA's GPS-Met Data Acquisition and Processing System to the Commercial Sector: Initial Results

    NASA Astrophysics Data System (ADS)

    Jackson, Michael; Blatt, Stephan; Holub, Kirk

    2015-04-01

    In April of 2014, NOAA/OAR/ESRL Global Systems Division (GSD) and Trimble, in collaboration with Earth Networks, Inc. (ENI), signed a Cooperative Research and Development Agreement (CRADA) to transfer the existing NOAA GPS-Met Data Acquisition and Processing System (GPS-Met DAPS) technology to a commercial Trimble/ENI partnership. NOAA's GPS-Met DAPS is currently operated in a pseudo-operational mode but has proven highly reliable, running at over 95% uptime. The DAPS uses the GAMIT software to ingest dual-frequency carrier phase GPS/GNSS observations and ancillary information such as real-time satellite orbits to estimate the zenith-scaled tropospheric (ZTD) signal delays and, where surface MET data are available, retrieve integrated precipitable water vapor (PWV). The NOAA data and products are made available to end users in near real-time. The Trimble/ENI partnership will use the Trimble Pivot™ software with the Atmosphere App to calculate zenith tropospheric delay (ZTD), tropospheric slant delay, and integrated precipitable water vapor (PWV). Evaluation of the Trimble software is underway, starting with a comparison of ZTD and PWV values determined from four subnetworks of GPS stations located: (1) near NOAA Radiosonde Observation (Upper-Air Observation) launch sites; (2) stations with low terrain/high moisture variability (Gulf Coast); (3) stations with high terrain/low moisture variability (Southern California); and (4) stations with high terrain/high moisture variability (elev. > 1000 m). For each network, GSD and T/ENI run the same stations for 30 days, compare results, and perform an evaluation of the long-term solution accuracy, precision and reliability. Metrics for success include T/ENI PWV estimates within 1.5 mm of ESRL/GSD's estimates 95% of the time (ZTD uncertainty of less than 10 mm 95% of the time). The threshold for allowable variations in ZTD between NOAA GPS-Met and T/ENI processing is 10 mm. The CRADA 1&2 Trimble processing
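
    The acceptance metric stated above (agreement within a threshold at least 95% of the time) reduces to a one-line check. The sketch below is an illustrative formalisation of that criterion, with invented names; it is not the CRADA evaluation code.

```python
import numpy as np

def meets_agreement(ref, test, threshold, fraction=0.95):
    """True when |test - ref| <= threshold for at least `fraction`
    of the samples (e.g. 1.5 mm for PWV, 10 mm for ZTD, 95%)."""
    diff = np.abs(np.asarray(test, dtype=float) - np.asarray(ref, dtype=float))
    return bool(np.mean(diff <= threshold) >= fraction)
```

    Running this on matched 30-day ZTD or PWV series from the GSD and T/ENI processing streams would give a pass/fail answer per subnetwork against the stated thresholds.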

  19. Complete data acquisition and analysis system for low-energy electron-molecule collision studies

    NASA Astrophysics Data System (ADS)

    Nag, Pamir; Nandi, Dhananjay

    2015-09-01

    A complete data acquisition system has been developed that can work with any personal computer irrespective of the operating system installed on it. The software can be used in low- and intermediate-energy electron collision studies with ground-state molecules in the gas phase, using a combination of RS-232, GPIB, and USB-interfaced devices. Various tabletop instruments and nuclear instrumentation module (NIM)-based electronics have been interfaced with and communicate with the software, which is based on LabVIEW. The system was tested with dissociative electron attachment (DEA) and polar dissociation studies of the oxygen molecule and successfully used in a DEA study of carbon monoxide and carbon dioxide.

  20. First Language Acquisition and Teaching

    ERIC Educational Resources Information Center

    Cruz-Ferreira, Madalena

    2011-01-01

    "First language acquisition" commonly means the acquisition of a single language in childhood, regardless of the number of languages in a child's natural environment. Language acquisition is variously viewed as predetermined, wondrous, a source of concern, and as developing through formal processes. "First language teaching" concerns schooling in…

  1. SCDU testbed automated in-situ alignment, data acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-07-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  2. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  3. Applying 'Sequential Windowed Acquisition of All Theoretical Fragment Ion Mass Spectra' (SWATH) for systematic toxicological analysis with liquid chromatography-high-resolution tandem mass spectrometry.

    PubMed

    Arnhard, Kathrin; Gottschall, Anna; Pitterl, Florian; Oberacher, Herbert

    2015-01-01

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) has become an indispensable analytical technique in clinical and forensic toxicology for the detection and identification of potentially toxic or harmful compounds. In particular, non-target LC-MS/MS assays enable the extensive and universal screening requested in systematic toxicological analysis. An integral part of the identification process is the generation of information-rich product ion spectra which can be searched against libraries of reference mass spectra. Usually, 'data-dependent acquisition' (DDA) strategies are applied for automated data acquisition. In this study, the 'data-independent acquisition' (DIA) method 'Sequential Windowed Acquisition of All Theoretical Fragment Ion Mass Spectra' (SWATH) was combined with LC-MS/MS on a quadrupole-quadrupole-time-of-flight (QqTOF) instrument for acquiring informative high-resolution tandem mass spectra. SWATH performs data-independent fragmentation of all precursor ions entering the mass spectrometer in 21 m/z isolation windows. The whole m/z range of interest is covered by continuous stepping of the isolation window. This allows numerous repeat analyses of each window during the elution of a single chromatographic peak and results in a complete fragment ion map of the sample. Compounds and samples typically encountered in forensic casework were used to assess the performance characteristics of LC-MS/MS with SWATH. Our experiments clearly revealed that SWATH is a sensitive and specific identification technique. SWATH is capable of identifying more compounds at lower concentration levels than DDA. The dynamic range of SWATH was estimated to be three orders of magnitude. Furthermore, matching of the >600,000 acquired SWATH spectra led to only 408 incorrect calls (false positive rate = 0.06%). Deconvolution of the generated ion maps was found to be essential for unravelling the full identification power of LC-MS/MS with SWATH. With the available software, however, only semi
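
    The contiguous stepping of a fixed-width isolation window across the precursor range can be sketched generically. The window width matches the 21 m/z figure in the abstract, but the m/z range and the 1 m/z overlap below are illustrative assumptions, not the instrument's actual acquisition table.

```python
def swath_windows(mz_start, mz_end, width=21.0, overlap=1.0):
    """Tile the precursor m/z range with fixed-width isolation windows.

    A small overlap between consecutive windows guards precursors that fall
    on window edges. Illustrative sketch, not vendor acquisition software.
    """
    windows, low = [], mz_start
    while low < mz_end:
        high = min(low + width, mz_end)
        windows.append((low, high))
        if high >= mz_end:
            break
        low = high - overlap  # step forward, keeping the overlap
    return windows

cycle = swath_windows(400.0, 1200.0)
print(len(cycle), cycle[0], cycle[-1])
```

    Because every precursor in the range falls inside some window, each cycle through the window list yields one complete fragment ion map of the sample, as described in the abstract.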

  4. Steam Generator Group Project. Progress report on data acquisition/statistical analysis

    SciTech Connect

    Doctor, P.G.; Buchanan, J.A.; McIntyre, J.M.; Hof, P.J.; Ercanbrack, S.S.

    1984-01-01

    A major task of the Steam Generator Group Project (SGGP) is to establish the reliability of the eddy current inservice inspections of PWR steam generator tubing, by comparing the eddy current data to the actual physical condition of the tubes via destructive analyses. This report describes the plans for the computer systems needed to acquire, store and analyze the diverse data to be collected during the project. The real-time acquisition of the baseline eddy current inspection data will be handled using a specially designed data acquisition computer system based on a Digital Equipment Corporation (DEC) PDP-11/44. The data will be archived in digital form for use after the project is completed. Data base management and statistical analyses will be done on a DEC VAX-11/780. Color graphics will be heavily used to summarize the data and the results of the analyses. The report describes the data that will be taken during the project and the statistical methods that will be used to analyze the data. 7 figures, 2 tables.

  5. Sneak analysis applied to process systems

    NASA Astrophysics Data System (ADS)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  6. The Effect of Age of Second Language Acquisition on the Representation and Processing of Second Language Words

    ERIC Educational Resources Information Center

    Silverberg, Stu; Samuel, Arthur G.

    2004-01-01

    In this study, the effects of second language (i.e., L2) proficiency and age of second language acquisition are assessed. Three types of bilinguals are compared: Early L2 learners, Late highly proficient L2 learners, and Late less proficient L2 learners. A lexical decision priming paradigm is used in which the critical trials consist of first…

  7. The Symbolic World of the Bilingual Child: Digressions on Language Acquisition, Culture and the Process of Thinking

    ERIC Educational Resources Information Center

    Nowak-Fabrykowski, Krystyna; Shkandrij, Miroslav

    2004-01-01

    In this paper we explore the relationship between language acquisition, and the construction of a symbolic world. According to Bowers (1989) language is a collection of patterns regulating social life. This conception is close to that of Symbolic Interactionists (Charon, 1989) who see society as made up of interacting individuals who are symbol…

  8. The Influence of Type and Token Frequency on the Acquisition of Affixation Patterns: Implications for Language Processing

    ERIC Educational Resources Information Center

    Endress, Ansgar D.; Hauser, Marc D.

    2011-01-01

    Rules, and exceptions to such rules, are ubiquitous in many domains, including language. Here we used simple artificial grammars to investigate the influence of 2 factors on the acquisition of rules and their exceptions, namely type frequency (the relative numbers of different exceptions to different regular items) and token frequency (the number…

  9. Acquisition of the Linearization Process in Text Composition in Third to Ninth Graders: Effects of Textual Superstructure and Macrostructural Organization

    ERIC Educational Resources Information Center

    Favart, Monik; Coirier, Pierre

    2006-01-01

    Two complementary experiments analyzed the acquisition of text content linearization in writing, in French-speaking participants from third to ninth grades. In both experiments, a scrambled text paradigm was used: eleven ideas presented in random order had to be rearranged coherently so as to compose a text. Linearization was analyzed on the basis…

  10. Selecting the optimal anti-aliasing filter for multichannel biosignal acquisition intended for inter-signal phase shift analysis.

    PubMed

    Keresnyei, Róbert; Megyeri, Péter; Zidarics, Zoltán; Hejjel, László

    2015-01-01

    The availability of microcomputer-based portable devices facilitates high-volume multichannel biosignal acquisition and the analysis of instantaneous oscillations and inter-signal temporal correlations. These new, non-invasively obtained parameters can have considerable prognostic or diagnostic roles. The present study investigates the inherent signal delay of the obligatory anti-aliasing filters. One cycle of each of the 8 electrocardiogram (ECG) and 4 photoplethysmogram signals from healthy volunteers, or artificially synthesised series, was passed through 100-, 80-, 60-, 40-, and 20-Hz, 2nd-, 4th-, 6th-, and 8th-order Bessel and Butterworth filters digitally synthesized by bilinear transformation, which resulted in a negligible error in signal delay compared to the mathematical model of the impulse and step responses of the filters. The investigated filters have signal delays as diverse as 2-46 ms depending on the filter parameters and the signal slew rate, which is difficult to predict in biological systems and thus difficult to compensate for. The delay's magnitude can be comparable to the examined phase shifts, deteriorating the accuracy of the measurement. In conclusion, identical or very similar anti-aliasing filters with lower orders and higher corner frequencies, oversampling, and digital low-pass filtering are recommended for biosignal acquisition intended for inter-signal phase shift analysis. PMID:25514627
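
    The 2-46 ms delay span quoted above can be sanity-checked analytically for the simplest case. The sketch below evaluates only the low-frequency group delay of a 2nd-order analog Butterworth low-pass; this is a deliberate simplification (the study's filters are digital bilinear-transform designs of orders 2 through 8), so treat the numbers as order-of-magnitude checks, not the paper's results.

```python
import math

def butter2_dc_group_delay(fc_hz):
    """Low-frequency group delay (s) of a 2nd-order analog Butterworth low-pass.

    For H(s) = 1 / (s^2 + sqrt(2)*s + 1), normalized to the corner frequency,
    the phase is phi(w) = -atan(sqrt(2)*w / (1 - w^2)), so the group delay
    -dphi/dw at DC is sqrt(2) / omega_c.
    """
    return math.sqrt(2) / (2 * math.pi * fc_hz)

# Delays at the study's corner frequencies, in milliseconds
for fc in (100, 80, 60, 40, 20):
    print(fc, "Hz:", round(butter2_dc_group_delay(fc) * 1000, 2), "ms")
```

    Even this lowest-order case spans roughly 2-11 ms across the 20-100 Hz corner frequencies, consistent with the paper's point that filter choice alone can shift signals by amounts comparable to the phase differences under study.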

  11. Observation and analysis of lunar occultations of stars with an emphasis on improvements to data acquisition instrumentation and reduction techniques

    SciTech Connect

    Schneider, G.H.

    1985-01-01

    A program of observation and analysis of lunar occultations was conceived, developed, and carried out using the facilities of the University of Florida's Rosemary Hill Observatory (RHO). The successful implementation of the program required investigation into several related areas. First, after an upgrade to the RHO 76-cm reflecting telescope, a microprocessor-controlled fast photoelectric data acquisition system was designed and built for the occultation data acquisition task. Second, the currently available model-fitting techniques used in the analysis of occultation observations were evaluated. A number of numerical experiments on synthesized and observational data were carried out to improve the performance of the numerical techniques. Among the numerical methods investigated were solution schemes employing partial parametric adjustment, parametric grouping into computational subsets (randomly and on the basis of the correlation coefficients), and preprocessing of the observational data by a number of smoothing techniques for a variety of noise conditions. Third, a turn-key computational software system, incorporating data transfer, reduction, graphics, and display, was developed to carry out all the necessary and related computational tasks in an interactive environment. Twenty-four occultation observations were obtained during the period March 1983 to March 1984.

  12. SU-C-18C-06: Radiation Dose Reduction in Body Interventional Radiology: Clinical Results Utilizing a New Imaging Acquisition and Processing Platform

    SciTech Connect

    Kohlbrenner, R; Kolli, KP; Taylor, A; Kohi, M; Fidelman, N; LaBerge, J; Kerlan, R; Gould, R

    2014-06-01

    Purpose: To quantify the patient radiation dose reduction achieved during transarterial chemoembolization (TACE) procedures performed in a body interventional radiology suite equipped with the Philips Allura Clarity imaging acquisition and processing platform, compared to TACE procedures performed in the same suite equipped with the Philips Allura Xper platform. Methods: Total fluoroscopy time, cumulative dose area product, and cumulative air kerma were recorded for the first 25 TACE procedures performed to treat hepatocellular carcinoma (HCC) in a Philips body interventional radiology suite equipped with Philips Allura Clarity. The same data were collected for the prior 85 TACE procedures performed to treat HCC in the same suite equipped with Philips Allura Xper. Mean values from these cohorts were compared using two-tailed t tests. Results: Following installation of the Philips Allura Clarity platform, a 42.8% reduction in mean cumulative dose area product (3033.2 versus 1733.6 mGy·cm^2, p < 0.0001) and a 31.2% reduction in mean cumulative air kerma (1445.4 versus 994.2 mGy, p < 0.001) were achieved compared to similar procedures performed in the same suite equipped with the Philips Allura Xper platform. Mean total fluoroscopy time was not significantly different between the two cohorts (1679.3 versus 1791.3 seconds, p = 0.41). Conclusion: This study demonstrates a significant patient radiation dose reduction during TACE procedures performed to treat HCC after a body interventional radiology suite was converted to the Philips Allura Clarity platform from the Philips Allura Xper platform. Future work will focus on evaluation of patient dose reduction in a larger cohort of patients across a broader range of procedures and in specific populations, including obese patients and pediatric patients, and comparison of image quality between the two platforms. Funding for this study was provided by Philips Healthcare, with 5% salary support provided to authors K. Pallav
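
    The quoted reductions follow directly from the reported cohort means; a minimal check using only the numbers in the abstract:

```python
def percent_reduction(before, after):
    """Percent reduction from a baseline mean to a follow-up mean."""
    return 100.0 * (before - after) / before

dap = percent_reduction(3033.2, 1733.6)   # dose area product means, mGy*cm^2
kerma = percent_reduction(1445.4, 994.2)  # air kerma means, mGy
print(round(dap, 1), round(kerma, 1))     # reproduces the reported 42.8 and 31.2
```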

  13. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years, multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review concludes by pinpointing some major challenges being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
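
    The ~17 GB/hour figure follows directly from the acquisition parameters (it is ~17.2 in binary GiB units); a quick check:

```python
def raw_hourly_volume_gib(channels=128, bits=16, rate_hz=20_000):
    """Uncompressed acquisition volume in GiB per hour for a multichannel system."""
    bytes_per_second = channels * (bits // 8) * rate_hz  # 128 * 2 B * 20 kHz
    return bytes_per_second * 3600 / 2**30

print(round(raw_hourly_volume_gib(), 1))  # ~17.2 GiB/hour, matching the review
```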

  14. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges.

    PubMed

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years, multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review concludes by pinpointing some major challenges being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507

  15. Values Acquisition: Some Critical Distinctions and Implications.

    ERIC Educational Resources Information Center

    Herman, William E.

    1997-01-01

    Examines critical issues in values education. Outlines the bipolar distinctions between the developmental and transmission theories of values acquisition and describes an integrative model that incorporates both approaches. Offers suggestions for parents, counselors, schools, and society based upon a process-and-product analysis of a…

  16. Sustainability Analysis for Products and Processes

    EPA Science Inventory

    Sustainability Analysis for Products and Processes Subhas K. Sikdar National Risk Management Research Laboratory United States Environmental protection Agency 26 W. M.L. King Dr. Cincinnati, OH 45237 Sikdar.subhas@epa.gov ABSTRACT Claims of both sustainable and unsu...

  17. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  18. Process analysis of fluidized bed granulation.

    PubMed

    Rantanen, J; Jørgensen, A; Räsänen, E; Luukkonen, P; Airaksinen, S; Raiman, J; Hänninen, K; Antikainen, O; Yliruusi, J

    2001-01-01

    This study assesses the fluidized bed granulation process for the optimization of a model formulation using in-line near-infrared (NIR) spectroscopy for moisture determination. The granulation process was analyzed using an automated granulator and optimization of the verapamil hydrochloride formulation was performed using a mixture design. The NIR setup with a fixed wavelength detector was applied for moisture measurement. Information from other process measurements, temperature difference between process inlet air and granules (T(diff)), and water content of process air (AH), was also analyzed. The application of in-line NIR provided information related to the amount of water throughout the whole granulation process. This information combined with trend charts of T(diff) and AH enabled the analysis of the different process phases. By this means, we can obtain in-line documentation from all the steps of the processing. The choice of the excipient affected the nature of the solid-water interactions; this resulted in varying process times. NIR moisture measurement combined with temperature and humidity measurements provides a tool for the control of water during fluid bed granulation. PMID:14727858

  19. Recovering the dynamics of root growth and development using novel image acquisition and analysis methods

    PubMed Central

    Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.

    2012-01-01

    Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394

  20. Qualitative Analysis for Maintenance Process Assessment

    NASA Technical Reports Server (NTRS)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  1. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
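
    The core computation that such GPU implementations accelerate, in differential dynamic microscopy and related dynamic NFS methods, is the power spectrum of image differences at a given lag. The NumPy sketch below is a generic CPU illustration of that quantity; the array shapes and function name are assumptions, not the authors' GPU code.

```python
import numpy as np

def ddm_image_structure(frames, tau):
    """Mean squared FFT magnitude of frame differences at lag tau.

    frames: array of shape (T, H, W). Returns an (H, W) map, the quantity
    dynamic NFS / DDM analyses subsequently average azimuthally over wave
    vectors. Generic sketch, not the GPU implementation in the note.
    """
    diffs = frames[tau:] - frames[:-tau]                    # I(t+tau) - I(t)
    spectra = np.abs(np.fft.fft2(diffs, axes=(1, 2))) ** 2  # |FFT|^2 per pair
    return spectra.mean(axis=0)                             # average over t

rng = np.random.default_rng(0)
out = ddm_image_structure(rng.normal(size=(12, 32, 32)), tau=2)
print(out.shape)
```

    Repeating this over many lags and large image stacks is what makes the analysis hours-long on a CPU and a natural fit for a GPU, where each frame-difference FFT is independent.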

  2. Uncertainty-Based Approach for Dynamic Aerodynamic Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Heim, Eugene H. D.; Bandon, Jay M.

    2004-01-01

    Development of improved modeling methods to provide increased fidelity of flight predictions for aircraft motions during flight in flow regimes with large nonlinearities requires improvements in test techniques for measuring and characterizing wind tunnel data. This paper presents a method for providing a measure of data integrity for static and forced oscillation test techniques. Data integrity is particularly important when attempting to accurately model and predict flight of today's high-performance aircraft, which are operating in expanded flight envelopes, often maneuvering at high angular rates at high angles-of-attack, even above maximum lift. Current aerodynamic models are inadequate in predicting flight characteristics in the expanded envelope, such as rapid aircraft departures and other unusual motions. Present wind tunnel test methods do not factor changes of flow physics into data acquisition schemes, so in many cases data are obtained over more iterations than required, or insufficient data may be obtained to determine a valid estimate with statistical significance. Additionally, forced oscillation test techniques, one of the primary tools used to develop dynamic models, do not currently provide estimates of the uncertainty of the results during an oscillation cycle. A method to optimize the required number of forced oscillation cycles based on decay of uncertainty gradients and balance tolerances is also presented.
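
    A generic stopping rule in the spirit of the approach, halting acquisition once the running mean's uncertainty falls below a tolerance, might look like the sketch below. It is a simplified stand-in using the standard error of the mean, not the paper's balance-tolerance and uncertainty-gradient criterion.

```python
import math

def cycles_until_converged(samples, tol):
    """Number of samples consumed before the standard error of the running
    mean drops below tol, using Welford's online variance update.

    Simplified illustration of uncertainty-driven stopping, not the
    forced-oscillation criterion described in the paper.
    """
    n, mean, m2 = 0, 0.0, 0.0
    for x in samples:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
        if n >= 2 and math.sqrt(m2 / (n - 1) / n) < tol:
            break
    return n
```

    The practical payoff is the one the abstract identifies: quiet flow regimes stop early instead of running a fixed iteration count, while nonlinear regimes keep acquiring until the estimate is statistically defensible.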

  3. Impact of Personal Relevance on Acquisition and Generalization of Script Training for Aphasia: A Preliminary Analysis

    PubMed Central

    Kaye, Rosalind C.; Lee, Jaime B.; van Vuuren, Sarel

    2015-01-01

    Purpose: The importance of personalization in script training in aphasia has been assumed but never tested. This study compared acquisition and generalization of personally relevant versus generic words or phrases appearing in the same scripts. Method: Eight individuals (6 men, 2 women) with chronic aphasia received 3 weeks of intensive computer-based script training. For each participant, 2 scripts, a trained and an untrained generalization script, were embedded with 4 personally relevant word choices and 2–4 generic items that were similar across participants. Scripts were probed for accuracy at baseline and posttreatment. Significance testing was conducted on baseline and posttreatment scores, and for gains in personally relevant versus generic items. Effect sizes were computed. Results: Both personally relevant and generic items improved significantly on trained scripts. Improvements on untrained scripts were smaller, with only personally relevant items reaching significance. There was no significant difference in gains made on personalized versus generic items for trained scripts (p = .059), but the effect size was large (d = 0.90). For generalization scripts, this effect was small (d = 0.25) and nonsignificant. Conclusions: Personally relevant words and phrases were acquired, although not generalized, more successfully than generic words and phrases. The data support the importance of personalization in script training, but the degree of that importance requires further investigation. PMID:26340806

  4. A comparative analysis of multichannel Data Acquisition Systems for quality assurance in external beam radiation therapy

    NASA Astrophysics Data System (ADS)

    Fuduli, I.; Porumb, C.; Espinoza, A. A.; Aldosari, A. H.; Carolan, M.; Lerch, M. L. F.; Metcalfe, P.; Rosenfeld, A. B.; Petasecca, M.

    2014-06-01

    The paper presents a comparative study performed by the Centre of Medical Radiation Physics (CMRP) on three multichannel Data Acquisition Systems (DAQ) based on different analogue front-ends to suit a wide range of radiotherapy applications. The three front-ends are: a charge-to-frequency converter developed by INFN Torino, an electrometer, and a charge-to-digital converter (the latter two being commercial devices from Texas Instruments). For the first two (named DAQ A and B), the CMRP has designed the read-out systems, whilst the third (DAQ C) comes with its own evaluation board. For the purpose of the characterization, DAQ A and DAQ B were equipped with 128 channels, while DAQ C has 256 channels. In terms of performance, the DAQs show good linearity over the whole dynamic range. Each one has a different sensitivity range, from less than 1 pC up to 13 nC, which makes the three front-ends complementary and suitable for use with different radiation detectors for different radiotherapy applications, or in a mixed solution which can house different front-ends.

  5. Parametric analysis of the liquid hydrogen and nitrogen bubble point pressure for cryogenic liquid acquisition devices

    NASA Astrophysics Data System (ADS)

    Hartwig, Jason; Adin Mann, Jay; Darr, Samuel R.

    2014-09-01

    This paper presents a parametric investigation of the factors which govern screen channel liquid acquisition device bubble point pressure in a low pressure propellant tank. The five test parameters varied were the screen mesh, liquid cryogen, liquid temperature, liquid pressure, and type of pressurant gas. Bubble point data were collected using three fine-mesh 304 stainless steel screens in two different liquids (hydrogen and nitrogen), over a broad range of liquid temperatures and pressures in subcooled and saturated liquid states, using both a noncondensable (helium) and an autogenous (hydrogen or nitrogen) gas pressurization scheme. Bubble point pressure scales linearly with surface tension, but does not scale inversely with the fineness of the mesh. Bubble point pressure increases in proportion to the degree of subcooling. Higher bubble points are obtained using noncondensable pressurant gases than with the condensable vapor. The bubble point model is refined using a temperature-dependent pore diameter of the screen to account for screen shrinkage at reduced liquid temperatures and for relative differences in performance between the two pressurization schemes. The updated bubble point model can be used to accurately predict the performance of LADs operating in future cryogenic propellant engines and cryogenic fuel depots.
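
    The governing relation is the classic bubble point equation, ΔP = 4σ cosθ / D_p. The sketch below applies it with illustrative liquid-hydrogen property values; the surface tension and effective pore diameter are assumptions for demonstration, not the paper's fitted values, and the paper's refinement makes D_p temperature-dependent rather than fixed.

```python
def bubble_point_pa(sigma, pore_diameter, cos_theta=1.0):
    """Classic bubble point pressure (Pa): dP = 4*sigma*cos(theta)/D_p.

    sigma in N/m, pore_diameter in m. The paper refines this relation with a
    temperature-dependent effective pore diameter; here D_p is held fixed.
    """
    return 4.0 * sigma * cos_theta / pore_diameter

# Illustrative: liquid hydrogen near 20 K (~1.9 mN/m), ~10 micron effective pore
print(round(bubble_point_pa(1.9e-3, 10e-6), 1), "Pa")
```

    The linear dependence on σ explains the observed subcooling trend: cooling the liquid raises its surface tension, and the bubble point rises with it.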

  6. Robot Acquisition of Active Maps Through Teleoperation and Vector Space Analysis

    NASA Technical Reports Server (NTRS)

    Peters, Richard Alan, II

    2003-01-01

    The work performed under this contract was in the area of intelligent robotics. The problem being studied was the acquisition of intelligent behaviors by a robot. The method was to acquire action maps that describe tasks as sequences of reflexive behaviors. Action maps (a.k.a. topological maps) are graphs whose nodes represent sensorimotor states and whose edges represent the motor actions that cause the robot to proceed from one state to the next. The maps were acquired by the robot after being teleoperated or otherwise guided by a person through a task several times. During a guided task, the robot records all its sensorimotor signals. The signals from several task trials are partitioned into episodes of static behavior. The corresponding episodes from each trial are averaged to produce a task description as a sequence of characteristic episodes. The sensorimotor states that indicate episode boundaries become the nodes, and the static behaviors, the edges. It was demonstrated that if compound maps are constructed from a set of tasks then the robot can perform new tasks in which it was never explicitly trained.

  7. Analysis of rocket engine injection combustion processes

    NASA Technical Reports Server (NTRS)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models, together with applications of the models to the correlation of well-documented hot-fire engine data bases. The programs are the distributed energy release (DER) model for conventional liquid-propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies, while the computer analyses provide quantitative data on predictive accuracy. The program comprises three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  8. Infrared image processing and data analysis

    NASA Astrophysics Data System (ADS)

    Ibarra-Castanedo, C.; González, D.; Klein, M.; Pilla, M.; Vallerand, S.; Maldague, X.

    2004-12-01

    Infrared thermography in nondestructive testing provides images (thermograms) in which zones of interest (defects) sometimes appear only as subtle signatures. In this context, raw images are often not adequate, since most defects would be missed. In other cases, what is needed is a quantitative analysis, as for defect detection and characterization. In this paper, various methods of data analysis required for preprocessing and/or processing of images are presented. References to the literature are provided for the known methods, which are discussed briefly, while the novelties are elaborated in more detail in the text, which also includes experimental results.
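    As one concrete example of the kind of thermogram preprocessing surveyed above, cold-image subtraction removes the fixed per-pixel offset so that subtle defect signatures stand out. This is a generic technique sketched under assumed array shapes, not necessarily one of the specific methods the paper discusses.

```python
import numpy as np

# Cold-image subtraction: subtract the frame acquired before heating from the
# rest of the thermogram stack, leaving per-pixel thermal-contrast images in
# which subtle defect signatures are easier to see. (Generic sketch.)

def cold_image_subtraction(sequence):
    """sequence: (n_frames, H, W) array; frame 0 is the 'cold' image.
    Returns the (n_frames - 1, H, W) stack of contrast images."""
    cold = sequence[0]
    return sequence[1:] - cold
```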

  9. Graphics Processing Unit (GPU) implementation of image processing algorithms to improve system performance of the Control, Acquisition, Processing, and Image Display System (CAPIDS) of the Micro-Angiographic Fluoroscope (MAF).

    PubMed

    Vasan, S N Swetadri; Ionita, Ciprian N; Titus, A H; Cartwright, A N; Bednarek, D R; Rudin, S

    2012-02-23

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same, and the operation on one pixel does not depend upon the result of the operation on another, allowing the entire image to be processed in parallel. GPU hardware was developed for exactly this kind of massively parallel processing. Thus, for an algorithm with a high degree of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded versions of CAPIDS is presented to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure, and automatic image windowing and leveling during each frame. PMID:24027619
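    The pixel-independent steps listed above parallelize trivially because each output pixel depends only on the same pixel of the inputs. A vectorized NumPy sketch of three of them follows; function and parameter names are illustrative, not the CAPIDS API.

```python
import numpy as np

# Sketch of a pixel-independent processing chain: flat-field correction,
# recursive temporal filtering, and image subtraction. Every output pixel
# depends only on the same pixel of the inputs, so each step maps onto one
# GPU thread per pixel; NumPy's vectorized form is the CPU analogue.

def flat_field_correct(raw, dark, flat):
    """Gain/offset correction: (raw - dark) normalized by the flat response."""
    gain = np.mean(flat - dark) / (flat - dark)
    return (raw - dark) * gain

def temporal_filter(prev_filtered, frame, alpha=0.25):
    """Recursive (exponential) temporal filter: one multiply-add per pixel."""
    return alpha * frame + (1.0 - alpha) * prev_filtered

def subtract_mask(frame, mask):
    """Image subtraction for DSA/roadmapping: per-pixel difference."""
    return frame - mask
```

    Per-pixel independence is the property the GPU upgrade exploits: no step needs neighbouring pixels or inter-thread communication.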

  10. Graphics processing unit (GPU) implementation of image processing algorithms to improve system performance of the control acquisition, processing, and image display system (CAPIDS) of the micro-angiographic fluoroscope (MAF)

    NASA Astrophysics Data System (ADS)

    Swetadri Vasan, S. N.; Ionita, Ciprian N.; Titus, A. H.; Cartwright, A. N.; Bednarek, D. R.; Rudin, S.

    2012-03-01

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same, and the operation on one pixel does not depend upon the result of the operation on another, allowing the entire image to be processed in parallel. GPU hardware was developed for exactly this kind of massively parallel processing. Thus, for an algorithm with a high degree of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded versions of CAPIDS is presented to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure, and automatic image windowing and leveling during each frame.

  11. Graphics Processing Unit (GPU) implementation of image processing algorithms to improve system performance of the Control, Acquisition, Processing, and Image Display System (CAPIDS) of the Micro-Angiographic Fluoroscope (MAF)

    PubMed Central

    Vasan, S.N. Swetadri; Ionita, Ciprian N.; Titus, A.H.; Cartwright, A.N.; Bednarek, D.R.; Rudin, S.

    2012-01-01

    We present the image processing upgrades implemented on a Graphics Processing Unit (GPU) in the Control, Acquisition, Processing, and Image Display System (CAPIDS) for the custom Micro-Angiographic Fluoroscope (MAF) detector. Most of the image processing currently implemented in the CAPIDS system is pixel independent; that is, the operation on each pixel is the same, and the operation on one pixel does not depend upon the result of the operation on another, allowing the entire image to be processed in parallel. GPU hardware was developed for exactly this kind of massively parallel processing. Thus, for an algorithm with a high degree of parallelism, a GPU implementation is much faster than a CPU implementation. The image processing algorithm upgrades implemented on the CAPIDS system include flat-field correction, temporal filtering, image subtraction, roadmap mask generation, and display windowing and leveling. A comparison between the previous and the upgraded versions of CAPIDS is presented to demonstrate how the improvement is achieved. By performing the image processing on a GPU, significant improvements in timing and frame rate have been achieved, including stable operation of the system at 30 fps during a fluoroscopy run, a DSA run, a roadmap procedure, and automatic image windowing and leveling during each frame. PMID:24027619

  12. Digital interactive image analysis by array processing

    NASA Technical Reports Server (NTRS)

    Sabels, B. E.; Jennings, J. D.

    1973-01-01

    An attempt is made to draw a parallel between the existing geophysical data processing service industries and the emerging earth resources data support requirements. The relationship of seismic data analysis to ERTS data analysis is natural because, in either case, data are digitally recorded in the same format, resulting from remotely sensed energy which has been reflected, attenuated, shifted, and degraded on its path from the source to the receiver. In the seismic case the energy is acoustic, ranging in frequency from 10 to 75 cps, for which the lithosphere appears semi-transparent. In earth-survey remote sensing through the atmosphere, visible and infrared frequency bands are used. Yet the hardware and software required to process the magnetically recorded data from the two realms of inquiry are, respectively, identical and similar. The resulting data products are similar.

  13. Spike-train acquisition, analysis and real-time experimental control using a graphical programming language (LabView).

    PubMed

    Nordstrom, M A; Mapletoft, E A; Miles, T S

    1995-11-01

    A solution is described for acquiring, on a personal computer, standard pulses derived from neuronal discharge; measuring neuronal discharge times; controlling stimulus delivery in real time based on specified inter-pulse interval conditions in the neuronal spike train; and displaying and analyzing the experimental data on-line. The hardware consisted of an Apple Macintosh IIci computer and a plug-in card (National Instruments NB-MIO16) that supports A/D, D/A, digital I/O, and timer functions. The software was written in the object-oriented graphical programming language LabView. Essential elements of the source code of the LabView program are presented and explained. The use of the system is demonstrated in an experiment in which the reflex responses to muscle stretch are assessed for a single motor unit in the human masseter muscle. PMID:8750090
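    The control rule described above (deliver a stimulus when the latest inter-pulse interval satisfies a specified condition) can be sketched in pure Python. The original implementation runs in LabView with NI hardware timers, so this is only an offline illustration of the decision rule, with illustrative names.

```python
# Offline sketch of interval-conditioned stimulus delivery: trigger a stimulus
# whenever the latest inter-pulse interval (IPI) in the spike train falls
# inside a specified window. Names and the window are illustrative.

def stimulus_times(spike_times, ipi_min, ipi_max):
    """Return the times at which a stimulus would be triggered: whenever the
    interval between consecutive spikes lies in [ipi_min, ipi_max]."""
    triggers = []
    for prev, cur in zip(spike_times, spike_times[1:]):
        ipi = cur - prev
        if ipi_min <= ipi <= ipi_max:
            triggers.append(cur)  # stimulate on the spike closing the interval
    return triggers
```

    For spikes at 0.0, 0.05, 0.2, and 0.25 s with a 40-60 ms window, stimuli would fire at 0.05 s and 0.25 s; the real system makes the same decision on-line, between spikes.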

  14. Automatic processing, analysis, and recognition of images

    NASA Astrophysics Data System (ADS)

    Abrukov, Victor S.; Smirnov, Evgeniy V.; Ivanov, Dmitriy G.

    2004-11-01

    New approaches and computer codes (A&CC) for automatic processing, analysis, and recognition of images are offered. The A&CC are based on representing an object image as a collection of pixels of various colours and on consecutive automatic painting of distinct parts of the image. The A&CC address technical objectives in directions including: 1) image processing, 2) image feature extraction, and 3) image analysis, in any sequence and combination. The A&CC allow various geometrical and statistical parameters of an object image and its parts to be obtained. Further possibilities arise from combining the A&CC with artificial neural network technologies. We believe that the A&CC can be used in creating systems of testing and control in various industrial and military applications (airborne imaging systems, tracking of moving objects), in medical diagnostics, in new software for CCDs, and in industrial vision and decision-making systems. The capabilities of the A&CC have been tested on image analysis of model fires, plumes of sprayed fluid, and ensembles of particles; on decoding of interferometric images; on digitization of paper diagrams of electrical signals; on text recognition; on image denoising and filtering; on analysis of astronomical images and aerial photography; and on object detection.

  15. Phonological processing deficits and the acquisition of the alphabetic principle in a severely delayed reader: a case study.

    PubMed

    Penney, Catherine G; Drover, James; Dyck, Carrie

    2009-11-01

    At the end of first grade, TM did not know the alphabet and could read no words. He could not tap syllables in words, had difficulty producing rhyming words and retrieving the phonological representations of words, and he could not discriminate many phoneme contrasts. He learned letter-sound correspondences first for single-consonant onsets and then later for the final consonant in a word but had difficulty with letter-sound associations for vowels. TM's ability to select a printed word to match a spoken word on the basis of the initial or final letter and sound was interpreted as evidence of Ehri's phonetic-cue reading. Using the Glass Analysis method, the authors taught TM to read and he became an independent reader. We discuss how his phonological processing deficits contributed to his reading difficulties. PMID:18729066

  16. Data independent acquisition-digital archiving mass spectrometry: application to single kernel mycotoxin analysis of Fusarium graminearum infected maize.

    PubMed

    Renaud, Justin B; Sumarah, Mark W

    2016-05-01

    New and conjugated mycotoxins of concern to regulators are frequently being identified, necessitating costly new method development and sample reanalysis. In response, we developed an LC-data independent acquisition (LC-DIA) method on a Q-Exactive Orbitrap mass spectrometer tailored for mycotoxin analysis. This method combines absolute quantification of targeted fungal metabolites with non-targeted digital archiving (DA) of data on all ionizable compounds for retrospective analysis. The quantitative power of this approach was assessed by spiking 23 mycotoxins at a range of concentrations into clean maize extracts. The linearity and limits of detection achieved were comparable to conventional LC-MS/MS and significantly better than 'all-ion-fragmentation' scanning mode. The method was applied to single-kernel analysis of Fusarium-infected maize, where we quantified nine Fusarium metabolites and three metabolites from unexpected contaminations by Alternaria and Penicillium species. Retrospective analysis of this data set allowed us to detect the recently reported 15-acetyldeoxynivalenol-3-O-β-D-glucoside without requiring re-analysis of the samples. To our knowledge, this is the first reported occurrence of this conjugated mycotoxin in naturally contaminated maize; the finding led us to further study maize artificially inoculated with the 3-acetyldeoxynivalenol and 15-acetyldeoxynivalenol chemotypes of Fusarium graminearum. Analysis of these samples showed that the maize genotype tested glycosylates 15-acetyldeoxynivalenol but not 3-acetyldeoxynivalenol, likely because the glycosylation site was blocked. In addition to confirming that these two F. graminearum chemotypes behave differently when infecting the host plant, this demonstrates the utility of using a single screening method to quantify known mycotoxins and archive a completely non-targeted dataset for future analysis. PMID:26886743

  17. An Experimental Analysis of Memory Processing

    ERIC Educational Resources Information Center

    Wright, Anthony A.

    2007-01-01

    Rhesus monkeys were trained and tested in visual and auditory list-memory tasks with sequences of four travel pictures or four natural/environmental sounds followed by single test items. Acquisitions of the visual list-memory task are presented. Visual recency (last item) memory diminished with retention delay, and primacy (first item) memory…

  18. Physical Data Acquisition and Analysis of Possible Flying Extraterrestrial Probes by using Opto-Electronic Devices

    NASA Astrophysics Data System (ADS)

    Teodorani, M.

    2000-02-01

    A technical research project regarding the search for evidence of an extraterrestrial origin of UFO phenomena is proposed. After presenting the main results from the analysis of an earlier Norwegian instrumental project, specific monitoring techniques and strategies based on magnetometers, radio spectrum analyzers, and radar-assisted sensors for the detection and analysis of UFO optical and infrared light are presented, together with calculations of exposure times for optical observations. Physical parameters that are expected to be determinable from subsequent data analysis are described in detail. Finally, crucial tests intended to prove or refute a non-natural origin of the UFO phenomenon are proposed and discussed.

  19. Automated analysis for lifecycle assembly processes

    SciTech Connect

    Calton, T.L.; Brown, R.G.; Peters, R.R.

    1998-05-01

    Many manufacturing companies today expend more effort on upgrade and disposal projects than on clean-slate design, and this trend is expected to become more prevalent in coming years. However, commercial CAD tools are better suited to initial product design than to the product's full life cycle. Computer-aided analysis, optimization, and visualization of life-cycle assembly processes based on the product CAD data can help ensure accuracy and reduce the effort expended in planning these processes for existing products, as well as provide design-for-lifecycle analysis for new designs. To be effective, computer-aided assembly planning systems must allow users to express the plan selection criteria that apply to their companies and products, as well as to the life cycles of their products. Designing products for easy assembly and disassembly during their entire life cycles, for purposes including service, field repair, upgrade, and disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and considering its intended life cycle. Different goals and constraints (compared to initial assembly) require one to revisit the significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited either to academic studies of issues in assembly planning or to applied studies of life-cycle assembly processes that give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life-cycle assembly processes.

  20. A pathway analysis of global aerosol processes

    NASA Astrophysics Data System (ADS)

    Schutgens, Nick; Stier, Philip

    2014-05-01

    smaller modes. Our analysis also suggests that coagulation serves mainly as a loss process for number densities and that it is a relatively unimportant contributor to composition changes of aerosol. Our results provide an objective way of complexity analysis in a global aerosol model and will be used in future work where we will reduce this complexity in ECHAM-HAM.

  1. Iodine-filter-based mobile Doppler lidar to make continuous and full-azimuth-scanned wind measurements: data acquisition and analysis system, data retrieval methods, and error analysis.

    PubMed

    Wang, Zhangjun; Liu, Zhishen; Liu, Liping; Wu, Songhua; Liu, Bingyi; Li, Zhigang; Chu, Xinzhao

    2010-12-20

    An incoherent Doppler wind lidar based on iodine edge filters has been developed at the Ocean University of China for remote measurements of atmospheric wind fields. The lidar is compact enough to fit in a minivan for mobile deployment. With its sophisticated and user-friendly data acquisition and analysis system (DAAS), this lidar has made a variety of line-of-sight (LOS) wind measurements in different operational modes. Through carefully developed data retrieval procedures, various wind products are provided by the lidar, including wind profile, LOS wind velocities in plan position indicator (PPI) and range height indicator (RHI) modes, and sea surface wind. Data are processed and displayed in real time, and continuous wind measurements have been demonstrated for as many as 16 days. Full-azimuth-scanned wind measurements in PPI mode and full-elevation-scanned wind measurements in RHI mode have been achieved with this lidar. The detection range of LOS wind velocity PPI and RHI reaches 8-10 km at night and 6-8 km during daytime with range resolution of 10 m and temporal resolution of 3 min. In this paper, we introduce the DAAS architecture and describe the data retrieval methods for various operation modes. We present the measurement procedures and results of LOS wind velocities in PPI and RHI scans along with wind profiles obtained by Doppler beam swing. The sea surface wind measured for the sailing competition during the 2008 Beijing Olympics is also presented. The precision and accuracy of wind measurements are estimated through analysis of the random errors associated with photon noise and the systematic errors introduced by the assumptions made in data retrieval. The three assumptions of horizontal homogeneity of atmosphere, close-to-zero vertical wind, and uniform sensitivity are made in order to experimentally determine the zero wind ratio and the measurement sensitivity, which are important factors in LOS wind retrieval. 
Deviations may occur under certain
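    Under the assumptions listed above (horizontally homogeneous flow, close-to-zero vertical wind), the horizontal wind can be recovered from LOS velocities at several azimuths by a least-squares fit, as in the Doppler-beam-swing profiles mentioned earlier. The sketch below uses illustrative names and beam geometry, not the DAAS implementation.

```python
import math

# Least-squares retrieval of the horizontal wind (u, v) from line-of-sight
# velocities v_los = u*sin(az)*cos(el) + v*cos(az)*cos(el), assuming
# horizontal homogeneity and negligible vertical wind. Illustrative sketch.

def fit_horizontal_wind(azimuths_deg, elevation_deg, v_los):
    ce = math.cos(math.radians(elevation_deg))
    # Accumulate the 2x2 normal equations for the (u, v) fit.
    Suu = Svv = Suv = bu = bv = 0.0
    for az, vr in zip(azimuths_deg, v_los):
        a = math.sin(math.radians(az)) * ce  # coefficient of u
        b = math.cos(math.radians(az)) * ce  # coefficient of v
        Suu += a * a
        Svv += b * b
        Suv += a * b
        bu += a * vr
        bv += b * vr
    det = Suu * Svv - Suv * Suv
    u = (bu * Svv - bv * Suv) / det
    v = (bv * Suu - bu * Suv) / det
    return u, v
```

    With four orthogonal azimuths (0, 90, 180, 270 degrees), the normal equations decouple and the fit reduces to the familiar opposed-beam differencing of a Doppler beam swing.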

  2. Preliminary Hazards Analysis Plasma Hearth Process

    SciTech Connect

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on, or be important to, safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on its radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts, either onsite or offsite, to personnel and the environment.

  3. A high speed data acquisition system for the analysis of velocity, density, and total temperature fluctuations at transonic speeds

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.; Jones, Gregory S.; Stainback, P. Calvin

    1988-01-01

    The use of a high-speed Dynamic Data Acquisition System (DDAS) to measure velocity, density, and total temperature fluctuations simultaneously is described. The DDAS is used to automate the acquisition of hot-wire calibration data. The data acquisition, data handling, and data reporting techniques used by the DDAS are described. Sample data are used to compare results obtained with the DDAS with those obtained from the FM-tape and post-test digitization method.

  4. Human movement analysis with image processing in real time

    NASA Astrophysics Data System (ADS)

    Fauvet, Eric; Paindavoine, Michel; Cannard, F.

    1991-04-01

    In the field of the human sciences, many applications need to know the kinematic characteristics of human movements. Psychology associates these characteristics with control mechanisms; sport science and biomechanics associate them with the performance of the athlete or of the patient, so that trainers or doctors can correct the subject's gesture to obtain a better performance once the motion properties are known. Roherton's studies, for example, describe the evolution of children's motion. Several investigation methods are able to measure human movement, but most current studies are based on image processing. Often the systems work at the TV standard (50 frames per second), which permits the study of only very slow gestures. A human operator analyzing the digitized film sequence manually makes the operation very expensive, especially long, and imprecise. On these grounds, many human-movement analysis systems have been implemented. They consist of: markers fixed to anatomically interesting points on the subject in motion, and image compression, the art of coding picture data. Generally the compression is limited to calculating the centroid coordinates of each marker. These systems differ from one another in image acquisition and marker detection.

  5. Automated knowledge acquisition for second generation knowledge base systems: A conceptual analysis and taxonomy

    SciTech Connect

    Williams, K.E.; Kotnour, T.

    1991-12-31

    In this paper, we present a conceptual analysis of knowledge-base development methodologies. The purpose of this research is to help overcome the high cost and lack of efficiency in developing knowledge base representations for artificial intelligence applications. To accomplish this purpose, we analyzed the available methodologies and developed a knowledge-base development methodology taxonomy. We review manual, machine-aided, and machine-learning methodologies. A set of developed characteristics allows description and comparison among the methodologies. We present the results of this conceptual analysis of methodologies and recommendations for development of more efficient and effective tools.

  6. Automated knowledge acquisition for second generation knowledge base systems: A conceptual analysis and taxonomy

    SciTech Connect

    Williams, K.E.; Kotnour, T.

    1991-01-01

    In this paper, we present a conceptual analysis of knowledge-base development methodologies. The purpose of this research is to help overcome the high cost and lack of efficiency in developing knowledge base representations for artificial intelligence applications. To accomplish this purpose, we analyzed the available methodologies and developed a knowledge-base development methodology taxonomy. We review manual, machine-aided, and machine-learning methodologies. A set of developed characteristics allows description and comparison among the methodologies. We present the results of this conceptual analysis of methodologies and recommendations for development of more efficient and effective tools.

  7. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    SciTech Connect

    Y. SUN

    2004-09-29

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P&CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P&CE (BSC 2004 [DIRS 169860

  8. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    NASA Technical Reports Server (NTRS)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.

  9. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  10. Interference Analysis Process in Military Aircraft Industry

    NASA Astrophysics Data System (ADS)

    Rothenhaeusler, M.; Poisel, W.

    2012-05-01

    As flying platforms have limited space for integration and increasing demands for antennas, interference and EMC analysis becomes ever more relevant for optimised antenna concepts. Of course, aerodynamic and operational aspects are still important and cannot be neglected, but interference can also be a performance killer if it is not analysed properly. This paper describes an interference analysis process which is based on the electrical data of all transmitters and receivers, on in-band and out-of-band numerical simulation of the decoupling values of all involved antennas, and on EMC-relevant data for conducted and radiated emissions, based on EMC standards such as MIL-STD-461. Additionally, hardware-based interference cancellation is also taken into account as the last opportunity for the antenna engineer to reach the required decoupling for undisturbed communication.

  11. Quantitative analysis of geomorphic processes using satellite image data at different scales

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered, or even suspected, in the analysis of orbital images. If the geomorphic process, or the landform change it causes, is less than 200 m in its x or y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  12. Mathematical Analysis and Optimization of Infiltration Processes

    NASA Technical Reports Server (NTRS)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  13. Methodological Reflections on Gesture Analysis in Second Language Acquisition and Bilingualism Research

    ERIC Educational Resources Information Center

    Gullberg, Marianne

    2010-01-01

    Gestures, i.e. the symbolic movements that speakers perform while they speak, form a closely interconnected system with speech, where gestures serve both addressee-directed ("communicative") and speaker-directed ("internal") functions. This article aims (1) to show that a combined analysis of gesture and speech offers new ways to address…

  14. Acquisition strategies

    SciTech Connect

Zimmer, M.J.; Lynch, P.W.

    1993-11-01

    Acquiring projects takes careful planning, research and consideration. Picking the right opportunities and avoiding the pitfalls will lead to a more valuable portfolio. This article describes the steps to take in evaluating an acquisition and what items need to be considered in an evaluation.

  15. Auxetic polyurethane foam: Manufacturing and processing analysis

    NASA Astrophysics Data System (ADS)

    Jahan, Md Deloyer

This work takes an experimental design approach to identify significant processing parameters, followed by optimization of those parameters, in the fabrication of auxetic PU foam. A split-plot factorial design was selected for screening. Response Surface Methodology (RSM) was then utilized to optimize the processing parameters; two designs, Box-Behnken and I-optimal, were employed for this analysis. The results obtained with those designs show that the I-optimal design provides more accurate and realistic results than the Box-Behnken design when experiments are performed in a split-plot manner. Finally, a near-stationary ridge system is obtained from the optimization analysis: a set of operating conditions that produce similarly minimal Poisson's ratios in the auxetic PU foam.
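RSM optimization of the kind described fits a second-order polynomial to the measured response and then solves for the stationary point of the fitted surface; a near-stationary ridge corresponds to a nearly singular curvature matrix, where many factor settings give essentially the same response. A minimal two-factor sketch (numpy only; the factor names and data are hypothetical, not taken from the thesis):

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of a full second-order model in two factors:
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def stationary_point(beta):
    """Factor settings where the fitted surface's gradient vanishes
    (a minimum, maximum, or saddle/ridge depending on the curvature)."""
    _, b1, b2, b11, b22, b12 = beta
    B = np.array([[2.0 * b11, b12], [b12, 2.0 * b22]])
    return np.linalg.solve(B, -np.array([b1, b2]))
```

With noisy experimental data the same two steps apply; canonical analysis of the eigenvalues of `B` then distinguishes a true minimum from the ridge system reported in the thesis.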

  16. New field programmable gate array-based image-oriented acquisition and real-time processing applied to plasma facing component thermal monitoring

    SciTech Connect

    Martin, V.; Dunand, G.; Moncada, V.; Jouve, M.; Travere, J.-M.

    2010-10-15

During operation of present fusion devices, the plasma facing components (PFCs) are exposed to high heat fluxes. Understanding and preventing overheating of these components during long pulse discharges is a crucial safety issue for future devices like ITER. Infrared digital cameras interfaced with complex optical systems have become a routine diagnostic for measuring surface temperatures in many magnetic fusion devices. Due to the complexity of the observed scenes and the large amount of data produced, the use of high-performance computing hardware for real-time image processing is mandatory to avoid PFC damage. At Tore Supra, we have recently made a major upgrade of our real-time infrared image acquisition and processing board through the use of a new field programmable gate array (FPGA) optimized for image processing. This paper describes the new possibilities offered by this board in terms of image calibration and image interpretation (detection of abnormal thermal events) compared to the previous system.

  17. Practical analysis of welding processes using finite element analysis.

    SciTech Connect

    Cowles, J. H.; Dave, V. R.; Hartman, D. A.

    2001-01-01

With advances in commercially available finite element software and computational capability, engineers can now model large-scale problems in mechanics, heat transfer, fluid flow, and electromagnetics as never before. With these enhancements in capability, it is increasingly tempting to include the fundamental process physics to help achieve greater accuracy (Refs. 1-7). While this goal is laudable, it adds complication and drives up cost and computational requirements. Practical analysis of welding relies on simplified user inputs to derive important relative trends in desired outputs, such as residual stress or distortion, due to changes in inputs like voltage, current, and travel speed. Welding is a complex three-dimensional phenomenon. The question becomes: how much modeling detail is needed to accurately predict relative trends in distortion, residual stress, or weld cracking? In this work, a HAZ (heat-affected zone) weld-cracking problem was analyzed to rank two different welding cycles (weld speed varied) in terms of crack susceptibility. Figure 1 shows an aerospace casting GTA welded to a wrought skirt. The essentials of part geometry, welding process, and tooling were suitably captured to model the strain excursion in the HAZ over a crack-susceptible temperature range, and the weld cycles were suitably ranked. The main contribution of this work is the demonstration of a practical methodology by which engineering solutions to engineering problems may be obtained through weld modeling when time and resources are extremely limited. Typically, welding analysis suffers from the following unknowns: material properties over the entire temperature range, the heat-input source term, and environmental effects. Material properties of interest are conductivity, specific heat, latent heat, modulus, Poisson's ratio, yield strength, ultimate strength, and possible rate dependencies. Boundary conditions are conduction into fixturing and radiation and convection to the environment.

  18. The design and instrumentation of the Purdue annular cascade facility with initial data acquisition and analysis

    NASA Technical Reports Server (NTRS)

    Stauter, R. C.; Fleeter, S.

    1982-01-01

    Three dimensional aerodynamic data, required to validate and/or indicate necessary refinements to inviscid and viscous analyses of the flow through turbomachine blade rows, are discussed. Instrumentation and capabilities for pressure measurement, probe insertion and traversing, and flow visualization are reviewed. Advanced measurement techniques including Laser Doppler Anemometers, are considered. Data processing is reviewed. Predictions were correlated with the experimental data. A flow visualization technique using helium filled soap bubbles was demonstrated.

  19. Comparative analysis of two systems for unobtrusive heart signal acquisition and characterization.

    PubMed

    Postolache, Octavian A; Girão, Pedro S; Postolache, Gabriela

    2013-01-01

In this paper we describe and compare methods of heart rate estimation from cardiac signals acquired with EMFIT, FMCW Doppler radar, and Finapres-based technology in the same context, and briefly investigate their similarities and differences. The processing of the acquired cardiac signals for accurate peak detection using the Wavelet Transform is also described. The results suggest good reliability of the two implemented unobtrusive systems for heart rate estimation. PMID:24111361
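The paper's wavelet-based peak detection is not specified in detail; as an illustration of the idea, scipy ships a continuous-wavelet-transform peak finder (`scipy.signal.find_peaks_cwt`, using a Ricker wavelet) that can drive a simple inter-beat-interval heart rate estimate. The function name and width choices below are assumptions, not the authors' implementation:

```python
import numpy as np
from scipy import signal

def heart_rate_bpm(cardiac, fs, widths=None):
    """Estimate heart rate (beats per minute) from a cardiac waveform
    sampled at fs Hz.

    Peaks are located with scipy's CWT-based peak finder; the rate is
    60 over the mean inter-beat interval between detected peaks.
    """
    if widths is None:
        # wavelet widths spanning plausible pulse durations (50-500 ms)
        widths = np.arange(max(int(0.05 * fs), 1), int(0.5 * fs))
    peaks = signal.find_peaks_cwt(cardiac, widths)
    if len(peaks) < 2:
        return 0.0
    ibi = np.diff(peaks) / fs  # inter-beat intervals in seconds
    return 60.0 / np.mean(ibi)
```

Matching the wavelet widths to the expected pulse duration is what makes the CWT approach robust to the baseline wander and noise typical of unobtrusive sensors such as EMFIT mats.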

  20. CT acquisition technique and quantitative analysis of the lung parenchyma: variability and corrections

    NASA Astrophysics Data System (ADS)

    Zheng, Bin; Leader, J. K.; Coxson, Harvey O.; Scuirba, Frank C.; Fuhrman, Carl R.; Balkan, Arzu; Weissfeld, Joel L.; Maitz, Glenn S.; Gur, David

    2006-03-01

The fraction of lung voxels below a pixel-value "cut-off" has been correlated with pathologic estimates of emphysema. We performed a "standard" quantitative CT (QCT) lung analysis using a -950 HU cut-off to determine the volume fraction of emphysema (below the cut-off) and a "corrected" QCT analysis after removing small groups (5 and 10 pixels) of connected pixels ("blobs") below the cut-off. CT examinations of two datasets of 15 subjects each, spanning a range of visible emphysema and pulmonary obstruction, were acquired at low dose and conventional dose for the same subjects and reconstructed using a high-spatial-frequency kernel at 2.5 mm section thickness. The "blob" size (i.e., number of connected pixels) removed was inversely related to the computed fraction of emphysema. The slopes of emphysema fraction versus blob size were 0.013, 0.009, and 0.005 for subjects with no emphysema and no pulmonary obstruction, moderate emphysema and pulmonary obstruction, and severe emphysema and severe pulmonary obstruction, respectively. The slopes of emphysema fraction versus blob size were 0.008 and 0.006 for low-dose and conventional CT examinations, respectively. The small blobs of pixels removed are most likely CT image artifacts and do not represent actual emphysema. The magnitude of the blob correction was appropriately associated with COPD severity. The blob correction appears to be applicable to QCT analysis in low-dose and conventional CT exams.
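The "blob" correction described above amounts to connected-component labelling of the low-attenuation mask and discarding components below a size threshold before computing the emphysema fraction. A sketch of that step with `scipy.ndimage` (an illustrative reconstruction, not the authors' code):

```python
import numpy as np
from scipy import ndimage

def emphysema_fraction(hu_image, lung_mask, cutoff=-950, min_blob=5):
    """Fraction of lung voxels below `cutoff` HU, after removing
    connected groups ("blobs") of fewer than `min_blob` pixels."""
    low = (hu_image < cutoff) & lung_mask
    labels, n_blobs = ndimage.label(low)  # 4-connected components
    if n_blobs == 0:
        return 0.0
    sizes = ndimage.sum(low, labels, index=np.arange(1, n_blobs + 1))
    kept_labels = np.flatnonzero(sizes >= min_blob) + 1  # labels start at 1
    kept = np.isin(labels, kept_labels)
    return kept.sum() / lung_mask.sum()
```

Raising `min_blob` discards more isolated low-attenuation pixels (likely image artifacts), which is consistent with the inverse relationship between blob size removed and computed emphysema fraction reported above.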