Science.gov

Sample records for acquisition computer triggered

  1. D0 experiment: its trigger, data acquisition, and computers

    SciTech Connect

    Cutts, D.; Zeller, R.; Schamberger, D.; Van Berg, R.

    1984-05-01

    The new collider facility to be built at Fermilab's Tevatron-I D0 region is described. The data acquisition requirements are discussed, as well as the hardware and software triggers designed to meet these needs. An array of MicroVAX computers running VAXELN will filter in parallel (a complete event in each microcomputer) and transmit accepted events via Ethernet to a host. This system, together with its subsequent offline needs, is briefly presented.

  2. Triggering and data acquisition aspects of SSC tracking

    SciTech Connect

    Hanson, G.G.; Niczyporuk, B.B.; Palounek, A.P.T.

    1989-04-01

    Possible conceptual designs for wire chamber tracking systems which meet the requirements for radiation damage and rates in the SSC environment are discussed. Computer simulation studies of tracking in such systems are presented. Results of some preliminary pattern recognition studies are given. Implications for data acquisition and triggering are examined. 15 refs., 14 figs., 3 tabs.

  3. Session summary: Electronics, triggering and data acquisition

    SciTech Connect

    Rescia, S.

    1991-12-01

    The session focused on the requirements for calorimetry at the SSC/LHC. Results on new readout techniques, calibration, radiation-hard electronics and semiconductor devices, analog and digital front-end electronics, and trigger strategies are presented.

  4. Actively triggered 4d cone-beam CT acquisition

    SciTech Connect

    Fast, Martin F.; Wisotzky, Eric; Oelfke, Uwe; Nill, Simeon

    2013-09-15

    Purpose: 4d cone-beam computed tomography (CBCT) scans are usually reconstructed by extracting the motion information from the 2d projections or an external surrogate signal, and binning the individual projections into multiple respiratory phases. In this “after-the-fact” binning approach, however, projections are unevenly distributed over respiratory phases resulting in inefficient utilization of imaging dose. To avoid excess dose in certain respiratory phases, and poor image quality due to a lack of projections in others, the authors have developed a novel 4d CBCT acquisition framework which actively triggers 2d projections based on the forward-predicted position of the tumor. Methods: The forward-prediction of the tumor position was independently established using either (i) an electromagnetic (EM) tracking system based on implanted EM-transponders which act as a surrogate for the tumor position, or (ii) an external motion sensor measuring the chest-wall displacement and correlating this external motion to the phase-shifted diaphragm motion derived from the acquired images. In order to avoid EM-induced artifacts in the imaging detector, the authors devised a simple but effective “Faraday” shielding cage. The authors demonstrated the feasibility of their acquisition strategy by scanning an anthropomorphic lung phantom moving on 1d or 2d sinusoidal trajectories. Results: With both tumor position devices, the authors were able to acquire 4d CBCTs free of motion blurring. For scans based on the EM tracking system, reconstruction artifacts stemming from the presence of the EM-array and the EM-transponders were greatly reduced using newly developed correction algorithms. By tuning the imaging frequency independently for each respiratory phase prior to acquisition, it was possible to harmonize the number of projections over respiratory phases. Depending on the breathing period (3.5 or 5 s) and the gantry rotation time (4 or 5 min), between ∼90 and 145
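
    The per-phase harmonization described above can be pictured as a simple quota check on the forward-predicted respiratory phase: a projection is triggered only while that phase still needs projections. The following Python sketch is illustrative only; the fixed quota, the phase prediction input, and all names are assumptions, not the authors' implementation (which tunes per-phase imaging frequency before acquisition).

    ```python
    def should_trigger(predicted_phase, counts, target_per_phase):
        """Trigger a 2d projection only while the forward-predicted respiratory
        phase is still short of its projection quota (illustrative sketch)."""
        if counts[predicted_phase] < target_per_phase:
            counts[predicted_phase] += 1
            return True
        return False

    # Example: 10 respiratory phase bins, aiming for 60 projections per phase.
    counts = [0] * 10
    # fire = should_trigger(current_phase_index, counts, target_per_phase=60)
    ```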

  5. Low-power triggered data acquisition system and method

    NASA Technical Reports Server (NTRS)

    Champaigne, Kevin (Inventor); Sumners, Jonathan (Inventor)

    2012-01-01

    A low-power triggered data acquisition system and method utilizes low-power circuitry, comparators, and digital logic incorporated into a miniaturized device interfaced with self-generating transducer sensor inputs to detect, identify, and assess impact and damage to surfaces and structures. Upon the occurrence of a triggering event that produces a signal greater than a set threshold, the comparator output changes and causes the system to acquire and store digital data representative of the incoming waveform on at least one triggered channel. The sensors may be disposed in an array to provide triangulation and location of the impact.
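
    A software analogue of the comparator-threshold capture described in this record, assuming a generic `read_sample()` callback and arbitrary buffer sizes; the patented device performs this with low-power analog comparators and digital logic rather than in code.

    ```python
    from collections import deque

    def triggered_capture(read_sample, threshold, pre_samples=64, post_samples=256):
        """Continuously buffer samples; when one exceeds the threshold, return a
        window of pre- and post-trigger samples (names and sizes are illustrative)."""
        pre = deque(maxlen=pre_samples)           # rolling pre-trigger history
        while True:
            s = read_sample()
            if abs(s) > threshold:                # comparator output changes state
                post = [read_sample() for _ in range(post_samples)]
                return list(pre) + [s] + post     # waveform around the event
            pre.append(s)
    ```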

  6. The ATLAS Data Acquisition and High Level Trigger system

    NASA Astrophysics Data System (ADS)

    The ATLAS TDAQ Collaboration

    2016-06-01

    This paper describes the data acquisition and high level trigger system of the ATLAS experiment at the Large Hadron Collider at CERN, as deployed during Run 1. Data flow as well as control, configuration and monitoring aspects are addressed. An overview of the functionality of the system and of its performance is presented and design choices are discussed.

  7. The LUX experiment - trigger and data acquisition systems

    NASA Astrophysics Data System (ADS)

    Druszkiewicz, Eryk

    2013-04-01

    The Large Underground Xenon (LUX) detector is a two-phase xenon time projection chamber designed to detect interactions of dark matter particles with the xenon nuclei. Signals from the detector PMTs are processed by custom-built analog electronics which provide properly shaped signals for the trigger and data acquisition (DAQ) systems. During calibrations, both systems must be able to handle high rates and have large dynamic ranges; during dark matter searches, maximum sensitivity requires low thresholds. The trigger system uses eight-channel 64-MHz digitizers (DDC-8) connected to a Trigger Builder (TB). The FPGA cores on the digitizers perform real-time pulse identification (discriminating between S1- and S2-like signals) and event localization. The TB uses hit patterns, hit maps, and maximum response detection to make trigger decisions, which are reached within a few microseconds after the occurrence of an event of interest. The DAQ system comprises commercial digitizers with customized firmware. Its real-time baseline suppression allows for a maximum event acquisition rate in excess of 1.5 kHz, which results in virtually no deadtime. The performance of the trigger and DAQ systems during the commissioning runs of LUX will be discussed.
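
    The pulse-identification step distinguishes prompt, narrow S1-like pulses from longer S2-like pulses. A rough software illustration follows; classification by pulse width alone is an assumption made for this sketch, and the function and parameter names are hypothetical, not the actual FPGA firmware criteria.

    ```python
    def classify_pulse(samples, baseline, threshold, s1_max_width=10):
        """Label one digitized waveform as 'S1', 'S2', or None (no pulse).
        Width-based discrimination is an assumption made for illustration."""
        above = [i for i, s in enumerate(samples) if s - baseline > threshold]
        if not above:
            return None
        width = above[-1] - above[0] + 1          # samples above threshold
        return "S1" if width <= s1_max_width else "S2"
    ```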

  8. Integration of the Trigger and Data Acquisition Systems in ATLAS

    SciTech Connect

    Abolins, M.; Adragna, P.; Aleksandrov, E.; Aleksandrov, I.; Amorim, A.; Anderson, K.; Anduaga, X.; Aracena, I.; Asquith, L.; Avolio, G.; Backlund, S.; Badescu, E.; Baines, J.; Barria, P.; Bartoldus, R.; Batreanu, S.; Beck, H.P.; Bee, C.; Bell, P.; Bell, W.H.; Bellomo, M.; and others

    2011-11-09

    During 2006 and the first half of 2007, the installation, integration and commissioning of trigger and data acquisition (TDAQ) equipment in the ATLAS experimental area have progressed. There have been a series of technical runs using the final components of the system already installed in the experimental area. Various tests have been run including ones where level 1 preselected simulated proton-proton events have been processed in a loop mode through the trigger and dataflow chains. The system included the readout buffers containing the events, event building, level 2 and event filter trigger algorithms. The scalability of the system with respect to the number of event building nodes used has been studied and quantities critical for the final system, such as trigger rates and event processing times, have been measured using different trigger algorithms as well as different TDAQ components. This paper presents the TDAQ architecture, the current status of the installation and commissioning and highlights the main test results that validate the system.

  9. Evolution of the ATLAS Trigger and Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Pozo Astigarraga, M. E.; ATLAS Collaboration

    2015-05-01

    ATLAS is a Physics experiment that explores high-energy particle collisions at the Large Hadron Collider at CERN. It uses tens of millions of electronics channels to capture the outcome of the particle bunches crossing each other every 25 ns. Since reading out and storing the complete information is not feasible (˜100 TB/s), ATLAS makes use of a complex and highly distributed Trigger and Data Acquisition (TDAQ) system, in charge of selecting only interesting data and transporting those to permanent mass storage (˜1 GB/s) for later analysis. The data reduction is carried out in two stages: first, custom electronics performs an initial level of data rejection for each bunch crossing based on partial and localized information. Only data corresponding to collisions passing this stage of selection will be actually read-out from the on-detector electronics. Then, a large computer farm (˜17 k cores) analyses these data in real-time and decides which ones are worth being stored for Physics analysis. A large network allows moving the data from ˜2000 front-end buffers to the location where they are processed and from there to mass storage. The overall TDAQ system is embedded in a common software framework that allows controlling, configuring and monitoring the data taking process. The experience gained during the first period of data taking of the ATLAS experiment (Run I, 2010-2012) has inspired a number of ideas for improvement of the TDAQ system that are being put in place during the so-called Long Shutdown 1 of the Large Hadron Collider (LHC), in 2013/14. This paper summarizes the main changes that have been applied to the ATLAS TDAQ system and highlights the expected performance and functional improvements that will be available for the LHC Run II. Particular emphasis will be put on the evolution of the software-based data selection and of the flow of data in the system. The reasons for the modified architectural and technical choices will be explained, and details

  10. Proceedings of the workshop on triggering and data acquisition for experiments at the Supercollider

    SciTech Connect

    Donaldson, R.

    1989-04-01

    This meeting covered the following subjects: triggering requirements for SSC physics; CDF level 3 trigger; D0 trigger design; AMY trigger systems; Zeus calorimeter first level trigger; data acquisition for the Zeus Central Tracking Detector; trigger and data acquisition aspects for SSC tracking; data acquisition systems for the SSC; validating triggers in CDF level 3; optical data transmission at SSC; time measurement system at SSC; SSC/BCD data acquisition system; microprocessors and other processors for triggering and filtering at the SSC; data acquisition, event building, and on-line processing; LAA real-time benchmarks; object-oriented system building at SSC; and software and project management. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  11. Checkpoint triggering in a computer system

    DOEpatents

    Cher, Chen-Yong

    2016-09-06

    According to an aspect, a method for triggering creation of a checkpoint in a computer system includes executing a task in a processing node of the computer system and determining whether it is time to read a monitor associated with a metric of the task. The monitor is read to determine a value of the metric based on determining that it is time to read the monitor. A threshold for triggering creation of the checkpoint is determined based on the value of the metric. Based on determining that the value of the metric has crossed the threshold, the checkpoint including state data of the task is created to enable restarting execution of the task upon a restart operation.
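
    A minimal sketch of the metric-threshold checkpoint logic described in this patent record, in Python. It assumes a hypothetical `read_monitor()` callback and a fixed threshold; the patented method derives the threshold from the metric itself and leaves the monitor, task, and storage unspecified, so every name here is illustrative.

    ```python
    import pickle

    def run_with_checkpoints(task_state, step, read_monitor, threshold,
                             poll_interval=100, path="checkpoint.pkl"):
        """Run `step(task_state)` repeatedly; write a checkpoint whenever the
        monitored metric crosses `threshold`. All names are illustrative."""
        iteration = 0
        while not task_state.get("done"):
            step(task_state)
            iteration += 1
            if iteration % poll_interval == 0:    # "is it time to read the monitor?"
                metric = read_monitor()
                if metric >= threshold:
                    with open(path, "wb") as f:   # persist state for later restart
                        pickle.dump(task_state, f)
    ```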

  12. The trigger and data acquisition for the NEMO-Phase 2 tower

    SciTech Connect

    Pellegrino, C.; Biagi, S.; Fusco, L. A.; Margiotta, A.; Spurio, M.; Chiarusi, T.; and others

    2014-11-18

    In the framework of the Phase 2 of the NEMO neutrino telescope project, a tower with 32 optical modules has been in operation since March 2013. A new scalable Trigger and Data Acquisition System (TriDAS) has been developed and extensively tested with the data from this tower. Adopting the all-data-to-shore concept, the NEMO TriDAS is optimized to deal with a continuous data-stream from off-shore to on-shore with a large bandwidth. The TriDAS consists of four computing layers: (i) data aggregation of isochronal hits from all optical modules; (ii) data filtering by means of concurrent trigger algorithms; (iii) composition of the filtered events into post-trigger files; (iv) persistent data storage. The TriDAS implementation is reported together with a review of dedicated on-line monitoring tools.

  13. The upgraded HADES trigger and data acquisition system

    NASA Astrophysics Data System (ADS)

    Michel, J.; Böhmer, M.; Kajetanowicz, M.; Korcyl, G.; Maier, L.; Palka, M.; Stroth, J.; Tarantola, A.; Traxler, M.; Ugur, C.; Yurevich, S.

    2011-12-01

    The HADES experiment is a High Acceptance Di-Electron Spectrometer located at GSI in Darmstadt, Germany. Recently, its trigger and data acquisition system was upgraded. The main goal was to substantially increase the event rate capability by a factor of up to 20, to reach 100 kHz in light and 20 kHz in heavy ion reaction systems. The total data rate written to storage is about 400 MByte/s at peak. In this context, the complete read-out system was replaced by FPGA-based platforms using optical communication. For data transport a general-purpose real-time network protocol was developed to meet the strong requirements of the system. In particular, trigger information has to reach all front-end modules with latencies of less than 5 μs through up to 10 intermediate hubs in a star-like network setup. Monitoring and slow control features as well as readout and trigger distribution were joined in a single network protocol made up of three virtual channels with inherent arbitration by priority and a typical switching time of 100 ns. The full DAQ system includes about 550 FPGAs distributed over the complete detector system. For control and monitoring a virtual address space spanning the whole network is provided. Data are merged by the network hubs into data streams and passed on to a server farm using an Ethernet infrastructure. Due to the electromagnetic noise environment, several transmission error detection and correction features were included. In collaboration with groups from experiments of the FAIR accelerator complex, further developments based on the versatile hardware and communication protocol are being pursued.

  14. The LHCb Data Acquisition and High Level Trigger Processing Architecture

    NASA Astrophysics Data System (ADS)

    Frank, M.; Gaspar, C.; Jost, B.; Neufeld, N.

    2015-12-01

    The LHCb experiment at the LHC accelerator at CERN collects collisions of particle bunches at 40 MHz. After a first level of hardware trigger with an output rate of 1 MHz, the physically interesting collisions are selected by running dedicated trigger algorithms in the High Level Trigger (HLT) computing farm. This farm consists of up to roughly 25000 CPU cores in roughly 1750 physical nodes, each equipped with up to 4 TB of local storage space. This work describes the LHCb online system with an emphasis on the developments implemented during the current long shutdown (LS1). We elaborate on the architecture used to treble the available CPU power of the HLT farm and on the technicalities of determining and verifying precise calibration and alignment constants, which are fed to the HLT event selection procedure. We describe how the constants are fed into a two-stage HLT event selection facility that makes extensive use of the local disk buffering capabilities on the worker nodes. With the installed disk buffers, the CPU resources can be used during periods of up to ten days without beams. Such periods have in the past accounted for more than 70% of the total time.

  15. Tracker Readout ASIC for Proton Computed Tomography Data Acquisition.

    PubMed

    Johnson, Robert P; Dewitt, Joel; Holcomb, Cole; Macafee, Scott; Sadrozinski, Hartmut F-W; Steinberg, David

    2013-10-01

    A unique CMOS chip has been designed to serve as the front-end of the tracking detector data acquisition system of a pre-clinical prototype scanner for proton computed tomography (pCT). The scanner is to be capable of measuring one to two million proton tracks per second, so the chip must be able to digitize the data and send it out rapidly while keeping the front-end amplifiers active at all times. One chip handles 64 consecutive channels, including logic for control, calibration, triggering, buffering, and zero suppression. It outputs a formatted cluster list for each trigger, and a set of field programmable gate arrays merges those lists from many chips to build the events to be sent to the data acquisition computer. The chip design has been fabricated, and subsequent tests have demonstrated that it meets all of its performance requirements, including excellent low-noise performance. PMID:24653525

  16. Tracker Readout ASIC for Proton Computed Tomography Data Acquisition

    PubMed Central

    Johnson, Robert P.; DeWitt, Joel; Holcomb, Cole; Macafee, Scott; Sadrozinski, Hartmut F.-W.; Steinberg, David

    2014-01-01

    A unique CMOS chip has been designed to serve as the front-end of the tracking detector data acquisition system of a pre-clinical prototype scanner for proton computed tomography (pCT). The scanner is to be capable of measuring one to two million proton tracks per second, so the chip must be able to digitize the data and send it out rapidly while keeping the front-end amplifiers active at all times. One chip handles 64 consecutive channels, including logic for control, calibration, triggering, buffering, and zero suppression. It outputs a formatted cluster list for each trigger, and a set of field programmable gate arrays merges those lists from many chips to build the events to be sent to the data acquisition computer. The chip design has been fabricated, and subsequent tests have demonstrated that it meets all of its performance requirements, including excellent low-noise performance. PMID:24653525

  17. Event triggered data acquisition in the Rock Mechanics Laboratory

    SciTech Connect

    Hardy, R.D.

    1993-03-01

    Increasing complexity of experiments, coupled with limitations of the previously used computers, required improvements in both hardware and software in the Rock Mechanics Laboratories. The increasing number of input channels and the need for better graphics could no longer be met by DATAVG, an existing software package for data acquisition and display written by D. J. Holcomb in 1983. After researching the market and trying several alternatives, no commercial program was found which met our needs. The previous version of DATAVG had the basic features needed but was tied to obsolete hardware. Memory limitations on the previously used PDP-11 made it impractical to upgrade the software further. With the advances in IBM-compatible computers, it is now desirable to use them as data recording platforms. With this information in mind, it was decided to write a new version of DATAVG which would take advantage of newer hardware. The new version had to support multiple graphic display windows and increased channel counts. It also had to be easier to use.

  18. The BTeV trigger and data acquisition system

    SciTech Connect

    Butler, Joel N.; /Fermilab

    2004-10-01

    The BTeV trigger inspects every beam crossing of the Fermilab Tevatron, running at a luminosity of 2 × 10^32 cm^-2 s^-1, and selects events that have "detached vertices" from B decays occurring downstream of the main interaction. The system uses a massively parallel system of FPGAs and microprocessors to produce a trigger decision on average every 396 ns. The trigger calculations are facilitated by the 23-million-channel pixel detector that provides the input to the trigger. Front-end electronics sparsifies the remainder of the event data and sends it to large, terabyte-scale memory buffers that store it until the trigger decision can be made. This complex system presents special challenges in fault monitoring and power and cooling.

  19. Personal computer process data acquisition

    SciTech Connect

    Dworjanyn, L.O.

    1989-01-01

    A simple BASIC program was written to permit personal computer data collection of process temperatures, pressures, flows, and inline analyzer outputs for a batch-type unit operation. The field voltage outputs were read on an IEEE programmable digital multimeter using a programmable scanner to select different output lines. The data were stored in ASCII format to allow direct analysis by spreadsheet programs. 1 fig., 1 tab.

  20. Early Spelling Acquisition: Writing Beats the Computer.

    ERIC Educational Resources Information Center

    Cunningham, Anne E.; Stanovich, Keith E.

    1990-01-01

    The spelling performance of 22 male and 26 female first graders using a simultaneous oral spelling method was compared when students were writing the words, spelling them with letter tiles, or typing them on a computer keyboard. Results of two experiments indicate the superiority of handwriting in improving spelling acquisition. (SLD)

  1. Workshop on data acquisition and trigger system simulations for high energy physics

    SciTech Connect

    1992-12-31

    This report discusses the following topics: DAQSIM: A data acquisition system simulation tool; Front end and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- A Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: an Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; and A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.

  2. The Trigger and Data Acquisition System for the KM3NeT neutrino telescope

    NASA Astrophysics Data System (ADS)

    Pellegrino, Carmelo; Chiarusi, Tommaso

    2016-04-01

    KM3NeT is a large research infrastructure in the Mediterranean Sea that includes a network of deep-sea neutrino telescopes. The telescopes consist of vertical detection units carrying optical modules, whose separation is optimised according to the different ranges of neutrino energy that shall be explored. Two building blocks, each one made of 115 detection units, will be deployed at the KM3NeT-IT site, about 80 km from Capo Passero, Italy, to search for high-energy neutrino sources (ARCA); another building block will be installed at the KM3NeT-Fr site, about 40 km from Toulon, France, to study the hierarchy of neutrino masses (ORCA). The modular design of KM3NeT allows for a progressive implementation and data taking even with an incomplete detector. The same scalable design is used for the Trigger and Data Acquisition Systems (TriDAS). In order to reduce the complexity of the hardware inside the optical modules, the "all data to shore" concept is adopted. This implies that the throughput is dominated by the optical background due to the decay of 40K dissolved in the sea water and to bursts of bioluminescence, about 3 orders of magnitude larger than the physics signal, and ranging from 20 Gbps to several hundred Gbps, according to the number of detection units. In addition, information from the acoustic positioning system of the detection units must be transmitted. As a consequence of the detector construction, the on-shore DAQ infrastructure must be expanded to handle an increasing data rate and implement efficient fast data filtering for both the optical and acoustic channels. In this contribution, the Trigger and Data Acquisition System designed for Phase 1 of KM3NeT and its future expansion are presented. The network infrastructure, the shore computing resources and the developed applications for handling, filtering and monitoring the optical and acoustic data streams are described.

  3. Predictable Vertical Targets Acquisition - The Eye-Head Coordination and the Triggering Effect

    NASA Technical Reports Server (NTRS)

    Kolev, Ognyan I.; Reschke, Millard F.

    2016-01-01

    The current study was designed to investigate target acquisition in the vertical plane, with emphasis on establishing strategy differences associated with acquisition triggering methods. Eight subjects were tested. Measurements consisted of target acquisition time, eye-head latency differences, velocity of gaze, eyes and head, and head amplitude. Using a three-way repeated-measures ANOVA, the results show that the strategy for acquisition of predictable visual targets in the vertical plane with the head unrestrained significantly depended on: (i) the direction of the gaze motion with respect to the gravity vector (i.e. there is significant up-down asymmetry); (ii) the angular distance of the target and (iii) the method of triggering the command to acquire the target - external versus internal. The data also show that when vertical acquisition is compared with triggering methods in the horizontal plane there is a difference in overall strategy for the acquisition of targets with the same spatial distances from straight-ahead gaze when both the eyes and head are used. Among the factors contributing to the difference in strategy for vertical target acquisition are: the gravitational vector, the relationship of target displacement and vestibular activation, biomechanical and neural control, asymmetries and the difference in the vertical field of view.

  4. Data acquisition, triggering, and filtering at the Auger Engineering Radio Array

    NASA Astrophysics Data System (ADS)

    Kelley, J. L.

    2013-10-01

    The Auger Engineering Radio Array (AERA) is currently detecting cosmic rays of energies at and above 10^17 eV at the Pierre Auger Observatory, by triggering on the radio emission produced in the associated air showers. The radio-detection technique must cope with a significant background of man-made radio-frequency interference, but can provide information on shower development with a high duty cycle. We discuss our techniques to handle the challenges of self-triggered radio detection in a low-power autonomous array, including triggering and filtering algorithms, data acquisition design, and communication systems.

  5. Computational Modeling for Language Acquisition: A Tutorial with Syntactic Islands

    ERIC Educational Resources Information Center

    Pearl, Lisa S.; Sprouse, Jon

    2015-01-01

    Purpose: Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. Method: We provide a general…

  6. Implementation of local area network extension for instrumentation standard trigger capabilities in advanced data acquisition platforms

    NASA Astrophysics Data System (ADS)

    López, J. M.; Ruiz, M.; Barrera, E.; de Arcas, G.; Vega, J.

    2008-10-01

    Synchronization mechanisms are an essential part of the real-time distributed data acquisition systems (DASs) used in fusion experiments. Traditionally, they have been based on the use of digital signals. The approach known as local area network extension for instrumentation (LXI) provides a set of very powerful synchronization and trigger mechanisms. The Intelligent Test Measurement System (ITMS) is a new platform designed to implement distributed data acquisition and fast data processing for fusion experiments. It is based on CompactPCI technology and its extension to instrumentation (PXI). Hardware and software elements have been developed to include LXI trigger and synchronization mechanisms in this platform in order to obtain a class A LXI instrument. This paper describes the implementation of such a system, involving the following components: commercial hardware running a Linux operating system; a real-time extension to the operating system and network (RTAI and RTNET), which implements a software precision time protocol (PTP) using IEEE 1588; an ad hoc PXI module to support hardware implementation of PTP (IEEE 1588); and the multipoint, low-voltage differential signaling hardware LXI trigger bus.

  7. Computer Systems Acquisition and the Use of Letters of Credit.

    ERIC Educational Resources Information Center

    Wernick, Alan S.

    1988-01-01

    Describes letters of credit and their usefulness in financing computer systems acquisition. Relevant legal policies and the different types and applications of letters of credit are discussed. (1 reference) (MES)

  8. Investigation of the cellular reprogramming phenomenon referred to as stimulus-triggered acquisition of pluripotency (STAP)

    PubMed Central

    Niwa, Hitoshi

    2016-01-01

    In January 2014, it was reported that strong external stimuli, such as a transient low-pH stressor, were capable of inducing the reprogramming of mammalian somatic cells, resulting in the generation of pluripotent cells. This cellular reprogramming event was designated ‘stimulus-triggered acquisition of pluripotency’ (STAP) by the authors of these reports. However, after multiple instances of scientific misconduct in the handling and presentation of the data were brought to light, both reports were retracted. To investigate the actual scientific significance of the purported STAP phenomenon, we sought to repeat the original experiments based on the methods presented in the retracted manuscripts and other relevant information. As a result, we have concluded that the STAP phenomenon as described in the original studies is not reproducible. PMID:27292224

  9. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  10. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  11. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  12. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  13. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation....

  14. NOTE: Externally triggered gating of nuclear medicine acquisitions: a useful method for partitioning data

    NASA Astrophysics Data System (ADS)

    Bailey, Dale L.; Kalemis, Antonis

    2005-04-01

    Physiological gating in nuclear medicine image acquisition was introduced over 30 years ago to subdivide data from the beating heart into short time frames to minimize motion blurring and permit evaluation of contractile parameters. It has since been widely applied in planar gamma camera imaging, SPECT, positron emission tomography (PET) and anatomical modalities such as x-ray CT and MRI, mostly for cardiac or respiratory investigations. However, the gating capability of gamma cameras and PET scanners can be employed to produce multiply partitioned, statistically independent projection data that can be used in various ways, such as to study the effect of varying total acquired counts or time, or administered radioactivity, on image quality, and to provide multiple observations for statistical image analyses. Externally triggered gating essentially provides 'something for nothing' as no data are lost and a 'non-gated' data set is easily synthesized post hoc, and there are few reasons for not acquiring the data in this manner (e.g., slightly longer processing time, extra disk space, etc.). We present a number of examples where externally triggered gating and partitioning of image data has been useful.
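
    The partitioning idea above can be illustrated in a few lines: each gated bin is an independent data set, the non-gated set is just their sum, and subsets emulate reduced counts. The Python sketch below uses random Poisson arrays as stand-ins for real projection data; the bin count and array sizes are arbitrary assumptions.

    ```python
    import numpy as np

    # Illustrative only: 16 statistically independent gated projection sets,
    # each an array of counts (random Poisson data standing in for real data).
    gates = [np.random.poisson(50, size=(128, 128)) for _ in range(16)]

    ungated = sum(gates)           # the "non-gated" data set synthesized post hoc
    half_counts = sum(gates[::2])  # e.g. study image quality at half the counts
    replicates = gates             # independent observations for statistical tests
    ```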

  15. Cognitive Characteristics and Initial Acquisition of Computer Programming Competence.

    ERIC Educational Resources Information Center

    Foreman, Kim H.

    1990-01-01

    This study examined the relationship between cognitive characteristics (field-independence, spatial visualization, logical reasoning, and direction following) and the initial acquisition of computer programming competence in 29 computer programming students. Students completed tests and surveys; results suggested that individual differences be…

  16. Computational Analysis of SAXS Data Acquisition.

    PubMed

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals. PMID:26244255
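
    As a rough illustration of the forward model described above, the pair distribution function can be approximated as a weighted histogram of pairwise distances over point samples of a known density. This brute-force O(N^2) Python sketch is an assumption for illustration only; it is not the recursive spherical-Bessel scheme developed in the paper.

    ```python
    import numpy as np

    def pair_distribution(points, weights, r_max, n_bins=200):
        """Approximate p(r) for a structure represented by weighted point
        samples of its density (illustrative brute-force computation)."""
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        w = weights[:, None] * weights[None, :]
        hist, edges = np.histogram(d.ravel(), bins=n_bins,
                                   range=(0.0, r_max), weights=w.ravel())
        return 0.5 * (edges[:-1] + edges[1:]), hist
    ```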

  17. Computer Enhanced Problem Solving Skill Acquisition.

    ERIC Educational Resources Information Center

    Slotnick, Robert S.

    1989-01-01

    Discusses the implementation of interactive educational software that was designed to enhance critical thinking, scientific reasoning, and problem solving in a university psychology course. Piagetian and computer learning perspectives are explained; the courseware package, PsychWare, is described; and the use of heuristics and algorithms in…

  18. Computer Acquisition: The Carnegie-Mellon Strategy.

    ERIC Educational Resources Information Center

    McCredie, John W.

    1979-01-01

    Describes the strategy used at Carnegie-Mellon University for providing adequate services for the increasing demands of campus computing. This strategy combines the flexibility of leasing with the economy of outright purchase by acquiring smaller systems at frequent intervals and distributing processing. (CMV)

  19. High Speed Multichannel Charge Sensitive Data Acquisition System with Self-Triggered Event Timing

    PubMed Central

    Tremsin, Anton S.; Siegmund, Oswald H.W.; Vallerga, John V.; Raffanti, Rick; Weiss, Shimon; Michalet, Xavier

    2010-01-01

    A number of modern experiments require simultaneous measurement of charges on multiple channels at > MHz event rates with an accuracy of 100-1000 e− rms. One widely used data processing scheme relies on application-specific integrated circuits (ASICs) enabling multichannel analog peak detection asserted by an external trigger followed by a serial/sparsified readout. Although this configuration minimizes the back-end electronics, its counting rate capability is limited by the speed of the serial readout. Recent advances in analog-to-digital converters and FPGA devices enable fully parallel high speed multichannel data processing with digital peak detection enhanced by finite impulse response filtering. Not only can accurate charge values be obtained at high event rates, but the timing of the event on each channel can also be determined with high accuracy. We present the concept and first experimental tests of fully parallel 128-channel charge sensitive data processing electronics capable of measuring charges with an accuracy of ~1000 e− rms. Our system does not require an external trigger and, in addition to charge values, it provides the event timing with an accuracy of ~1 ns FWHM. One of the possible applications of this system is high resolution position sensitive event counting detectors with microchannel plates combined with cross strip readout. Implementation of fast data acquisition electronics increases the counting rates of those detectors to multi-MHz level, preserving their unique capability of virtually noiseless detection of both position (with accuracy of ~10 μm FWHM) and timing (~1 ns FWHM) of individual particles, including photons, electrons, ions, neutrals, and neutrons. PMID:20174482
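
    A hedged sketch of the per-channel idea described above: FIR-shape the digitized samples, then take the peak amplitude as the charge estimate and the peak position as the event time. The filter taps, threshold, and names are illustrative assumptions, not the actual firmware implementation.

    ```python
    import numpy as np

    def detect_event(samples, fir_taps, threshold):
        """Shape one channel's ADC samples with an FIR filter, then report the
        peak amplitude (charge estimate) and peak index (event time in samples)."""
        shaped = np.convolve(samples, fir_taps, mode="same")
        peak = int(np.argmax(shaped))
        if shaped[peak] < threshold:
            return None                    # no self-trigger on this channel
        return shaped[peak], peak

    # Example: simple moving-average taps as a stand-in for the real filter.
    taps = np.ones(8) / 8.0
    ```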

  20. The Trigger and Data Acquisition System for the KM3NeT-Italia towers

    NASA Astrophysics Data System (ADS)

    Favaro, M.; Chiarusi, T.; Giacomini, F.; Manzali, M.; Margiotta, A.; Pellegrino, C.

    2016-04-01

    KM3NeT-Italia is an INFN project supported with Italian PON funding for building the core of the Italian node of the KM3NeT neutrino telescope. The detector, made of 700 10'' Optical Modules (OMs) lodged along 8 vertical structures called towers, will be deployed starting from fall 2015 at the KM3NeT-Italy site, about 80 km off Capo Passero, Italy, 3500 m deep. The all-data-to-shore approach is used to reduce the complexity of the submarine detector, demanding an on-line trigger integrated in the data acquisition system running in the shore station, called TriDAS. Due to the large optical background in the sea from 40K decays and bioluminescence, the throughput from the underwater detector can range up to 30 Gbps. This puts strong constraints on the design and performance of the TriDAS and of the related network infrastructure. In this contribution the technology behind the implementation of the TriDAS infrastructure is reviewed, focusing on the relationship between the various components and their performance. The modular design of the TriDAS, which allows it to scale up to a detector larger than the 8-tower configuration, is also discussed.

  1. COMPUTER-AIDED DATA ACQUISITION FOR COMBUSTION EXPERIMENTS

    EPA Science Inventory

    The article describes the use of computer-aided data acquisition techniques to aid the research program of the Combustion Research Branch (CRB) of the U.S. EPA's Air and Energy Engineering Research Laboratory (AEERL) in Research Triangle Park, NC, in particular on CRB's bench-sca...

  2. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  3. Computer Data Acquisition Applications in the Materials Science Laboratory.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described are applications of computer data acquisition to three laboratories in materials science at the United States Naval Academy. In each laboratory, data are input to a minicomputer, scaled using previously obtained and stored calibration factors to convert the transducer signals to load, displacement, temperature, etc., and then stored on…

  4. [A novel serial port auto trigger system for MOSFET dose acquisition].

    PubMed

    Luo, Guangwen; Qi, Zhenyu

    2013-01-01

    To synchronize the radiation delivery of the microSelectron-HDR (Nucletron afterloading machine) and the measurement of the MOSFET dose system, a trigger system based on an interface circuit was designed, and a corresponding monitor and trigger program was developed on the Qt platform. This interface and control system was tested and showed stable and reliable operation. The serial-port detection technique adopted here may be extended to trigger other medical devices. PMID:23668038

  5. Data-flow coupling and data-acquisition triggers for the PreSPEC-AGATA campaign at GSI

    NASA Astrophysics Data System (ADS)

    Ralet, D.; Pietri, S.; Aubert, Y.; Bellato, M.; Bortolato, D.; Brambilla, S.; Camera, F.; Dosme, N.; Gadea, A.; Gerl, J.; Golubev, P.; Grave, X.; Johansson, H. T.; Karkour, N.; Korichi, A.; Kurz, N.; Lafay, X.; Legay, E.; Linget, D.; Pietralla, N.; Rudolph, D.; Schaffner, H.; Stezowski, O.; Travers, B.; Wieland, O.

    2015-06-01

    The PreSPEC setup for high-resolution γ-ray spectroscopy using radioactive ion beams was employed for experimental campaigns in 2012 and 2014. The setup consisted of the state of the art Advanced GAmma Tracking Array (AGATA) and the High Energy γ deteCTOR (HECTOR+) positioned around a secondary target at the final focal plane of the GSI FRagment Separator (FRS) to perform in-beam γ-ray spectroscopy of exotic nuclei. The Lund York Cologne CAlorimeter (LYCCA) was used to identify the reaction products. In this paper we report on the trigger scheme used during the campaigns. The data-flow coupling between the Multi-Branch System (MBS) based Data AcQuisition (DAQ) used for FRS-LYCCA and the "Nouvelle Acquisition temps Réel Version 1.2 Avec Linux" (NARVAL) based acquisition system used for AGATA are also described.

  6. A personal computer-based, multitasking data acquisition system

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1990-01-01

    A multitasking data acquisition system was written to simultaneously collect meteorological radar and telemetry data from two sources. This system is based on the personal computer architecture. Data are collected via two asynchronous serial ports and deposited to disk. The system is written in both the C programming language and assembler. It consists of three parts: a multitasking kernel for data collection, a shell with pull-down windows as a user interface, and a graphics processor for editing data and creating coded messages. An explanation of both system principles and program structure is presented.

  7. The electronics, online trigger system and data acquisition system of the J-PARC E16 experiment

    NASA Astrophysics Data System (ADS)

    Takahashi, T. N.; Hamada, E.; Ikeno, M.; Kawama, D.; Morino, Y.; Nakai, W.; Obara, Y.; Ozawa, K.; Sendai, H.; Tanaka, M. M.; Uchida, T.; Yokkaichi, S.

    2015-12-01

    The J-PARC E16 experiment was proposed to investigate the restoration of chiral symmetry at normal nuclear density. E16 will systematically measure the in-medium mass of vector mesons at the J-PARC Hadron Experimental Facility using a 30-GeV proton beam with an intensity of 2 × 10^10 protons per pulse. The E16 spectrometer was designed to detect e+e- pairs from slowly moving vector mesons, particularly the ϕ meson. The detector system consists of a GEM tracker, a gas Cerenkov detector based on GEM and an electromagnetic calorimeter made of lead-glass, whose number of channels reaches about 100,000 in total. The readout electronics, trigger system and data acquisition system for the detectors have been developed, for which a level-1 trigger rate of 1-2 kHz is required at an interaction rate of several tens of MHz. Preparation is underway for the first beam time in 2017.

  8. 48 CFR 52.223-16 - Acquisition of EPEAT-Registered Personal Computer Products.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-Registered Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition Regulations System... Text of Provisions and Clauses 52.223-16 Acquisition of EPEAT-Registered Personal Computer Products. As prescribed in 23.705(d)(1), insert the following clause: Acquisition of Epeat®-Registered Personal...

  9. Performance of the NOνA Data Acquisition and Trigger Systems for the full 14 kT Far Detector

    NASA Astrophysics Data System (ADS)

    Norman, A.; Davies, G. S.; Ding, P. F.; Dukes, E. C.; Duyan, H.; Frank, M. J.; R. C. Group; Habig, A.; Henderson, W.; Niner, E.; Mina, R.; Moren, A.; Mualem, L.; Oksuzian, Y.; Rebel, B.; Shanahan, P.; Sheshukov, A.; Tamsett, M.; Tomsen, K.; Vinton, L.; Wang, Z.; Zamorano, B.; Zirnstien, J.

    2015-12-01

    The NOvA experiment uses a continuous, free-running, dead-timeless data acquisition system to collect data from the 14 kT far detector. The DAQ system reads out the more than 344,000 detector channels and assembles the information into a raw, unfiltered, high-bandwidth data stream. The NOvA trigger systems operate in parallel to the readout and asynchronously to the primary DAQ readout/event building chain. The data driven triggering systems for NOvA are unique in that they examine long contiguous time windows of the high resolution readout data and enable the detector to be sensitive to a wide range of physics interactions, from those with fast, nanosecond scale signals up to processes with long delayed coincidences between hits which occur at the tens of milliseconds time scale. The trigger system is able to achieve a true 100% live time for the detector, making it sensitive to both beam-spill-related and off-spill physics.
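
    A toy stand-in for the data-driven scan described above: slide over a long, time-ordered window of hits and flag clusters that exceed a hit count within a coincidence interval. The hit format, window length, and thresholds are assumptions for illustration, not the NOvA algorithms.

    ```python
    def scan_window(hit_times_ns, coincidence_ns, min_hits):
        """Scan a long, time-sorted readout window and return start times of
        hit clusters with at least `min_hits` hits inside `coincidence_ns`."""
        hits = sorted(hit_times_ns)
        triggers, j = [], 0
        for i, t0 in enumerate(hits):
            while j < len(hits) and hits[j] - t0 <= coincidence_ns:
                j += 1                     # extend the window past hit i
            if j - i >= min_hits:
                triggers.append(t0)        # candidate trigger at this cluster
        return triggers
    ```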

  10. Computer simulation of low-frequency electromagnetic data acquisition

    SciTech Connect

    SanFilipo, W.A.; Hohmann, G.W.

    1982-02-01

    Computer simulation of low frequency electromagnetic (LFEM) digital data acquisition in the presence of natural field noise demonstrates several important limitations and considerations. Without the use of a remote reference noise removal scheme it is difficult to obtain an adequate ratio of signal to noise below 0.1 Hz for frequency domain processing and below 0.3 Hz base frequency for time domain processing for a typical source-receiver configuration. A digital high-pass filter substantially facilitates rejection of natural field noise above these frequencies but, at lower frequencies where much longer stacking times are required, it becomes ineffective. Use of a remote reference to subtract natural field noise extends these low-frequency limits a decade, but this technique is limited by the resolution and dynamic range of the instrumentation. Gathering data in short segments so that natural field drift can be offset for each segment allows a higher gain setting to minimize dynamic range problems.
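
    The remote-reference scheme mentioned above amounts to estimating how the natural-field noise seen at a distant reference channel couples into the local record and subtracting it before stacking. A minimal sketch follows; the single least-squares gain is an assumption for illustration, and the simulated processing in the record is more elaborate.

    ```python
    import numpy as np

    def remote_reference_subtract(local, remote):
        """Remove natural-field noise from a local EM record using a remote
        reference channel: fit one least-squares gain and subtract."""
        gain = np.dot(remote, local) / np.dot(remote, remote)
        return local - gain * remote
    ```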

  11. The Trigger and Data Acquisition System for the 8 tower subsystem of the KM3NeT detector

    NASA Astrophysics Data System (ADS)

    Manzali, M.; Chiarusi, T.; Favaro, M.; Giacomini, F.; Margiotta, A.; Pellegrino, C.

    2016-07-01

    KM3NeT is a deep-sea research infrastructure being constructed in the Mediterranean Sea. It will host a large Cherenkov neutrino telescope that will collect photons emitted along the path of the charged particles produced in neutrino interactions in the vicinity of the detector. The philosophy of the DAQ system of the detector foresees that all data are sent to shore after a proper sampling of the photomultiplier signals. No off-shore hardware trigger is implemented and a software selection of the data is performed with an on-line Trigger and Data Acquisition System (TriDAS) to reduce the large throughput due to the environmental light background. A first version of the TriDAS was developed to operate a prototype detection unit deployed in March 2013 in the abyssal site of Capo Passero (Sicily, Italy), about 3500 m deep. A revised and improved version has been developed to meet the requirements of the final detector, using new tools and modern design solutions. First installation and scalability tests have been performed at the Bologna Common Infrastructure, and results comparable to expectations have been observed.

  12. Compton suppression and event triggering in a commercial data acquisition system

    NASA Astrophysics Data System (ADS)

    Tabor, Samuel; Caussyn, D. D.; Tripathi, Vandana; Vonmoss, J.; Liddick, S. N.

    2012-10-01

    A number of groups are starting to use flash digitizer systems to directly convert the preamplifier signals of high-resolution Ge detectors to a stream of digital data. Some digitizers are also equipped with software constant fraction discriminator algorithms capable of operating on the resulting digital data stream to provide timing information. Because of the dropping cost per channel of these systems, it should now be possible to also connect outputs of the Bismuth Germanate (BGO) scintillators used for Compton suppression to other digitizer inputs so that BGO logic signals can also be available in the same system. This provides the possibility to perform all the Compton suppression and multiplicity trigger logic within the digital system, thus eliminating the need for separate timing filter amplifiers (TFA), constant fraction discriminators (CFD), logic units, and lots of cables. This talk will describe the performance of such a system based on Pixie16 modules from XIA LLC with custom field programmable gate array (FPGA) programming for an array of Compton-suppressed single-crystal Ge and 4-crystal "Clover" detectors, along with optional particle detectors. Initial tests of the system have produced results comparable with the current traditional system of individual electronics and peak sensing analog-to-digital converters. The advantages of the all-digital system will be discussed.
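
    In software terms, digital Compton suppression amounts to vetoing a Ge hit whenever its surrounding BGO shield fires within a coincidence window. The following sketch assumes hypothetical (time, detector_id) hit tuples and an arbitrary window; it is illustrative, not the FPGA logic used in the system described above.

    ```python
    def compton_suppress(ge_hits, bgo_hits, window_ns=500):
        """Keep Ge hits with no BGO hit in the same shield within +/- window_ns.
        Hits are (time_ns, detector_id) tuples; format and window are assumed."""
        kept = []
        for t_ge, det in ge_hits:
            vetoed = any(d == det and abs(t_ge - t_bgo) <= window_ns
                         for t_bgo, d in bgo_hits)
            if not vetoed:
                kept.append((t_ge, det))
        return kept
    ```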

  13. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    SciTech Connect

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3) Identify the technical

  14. Age of Acquisition: Its Neural and Computational Mechanisms

    ERIC Educational Resources Information Center

    Hernandez, Arturo E.; Li, Ping

    2007-01-01

    The acquisition of new skills over a life span is a remarkable human ability. This ability, however, is constrained by age of acquisition (AoA); that is, the age at which learning occurs significantly affects the outcome. This is most clearly reflected in domains such as language, music, and athletics. This article provides a perspective on the…

  15. The application of a computer data acquisition system to a new high temperature tribometer

    NASA Technical Reports Server (NTRS)

    Bonham, Charles D.; Dellacorte, Christopher

    1991-01-01

    The two data acquisition computer programs are described which were developed for a high temperature friction and wear test apparatus, a tribometer. The raw data produced by the tribometer and the methods used to sample that data are explained. In addition, the instrumentation and computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high temperature tribometer.

  16. The application of a computer data acquisition system for a new high temperature tribometer

    NASA Technical Reports Server (NTRS)

    Bonham, Charles D.; Dellacorte, Christopher

    1990-01-01

    The two data acquisition computer programs are described which were developed for a high temperature friction and wear test apparatus, a tribometer. The raw data produced by the tribometer and the methods used to sample that data are explained. In addition, the instrumentation and computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high temperature tribometer.

  17. A High-Resolution TDC-Based Board for a Fully Digital Trigger and Data Acquisition System in the NA62 Experiment at CERN

    NASA Astrophysics Data System (ADS)

    Pedreschi, Elena; Angelucci, Bruno; Avanzini, Carlo; Galeotti, Stefano; Lamanna, Gianluca; Magazzu, Guido; Pinzino, Jacopo; Piandani, Roberto; Sozzi, Marco; Spinella, Franco; Venditti, Stefano

    2015-06-01

    A Time to Digital Converter (TDC) based system, to be used for most sub-detectors in the high-flux rare-decay experiment NA62 at CERN SPS, was built as part of the NA62 fully digital Trigger and Data AcQuisition system (TDAQ), in which the TDC Board (TDCB) and a general-purpose motherboard (TEL62) will play a fundamental role. While TDCBs, housing four High Performance Time to Digital Converters (HPTDC), measure hit times from sub-detectors, the motherboard processes and stores them in a buffer, produces trigger primitives from different detectors and extracts only data related to the lowest trigger level decision, once this is taken on the basis of the trigger primitives themselves. The features of the TDCB board developed by the Pisa NA62 group are extensively discussed and performance data is presented in order to show its compliance with the experiment requirements.

  18. The Utah Educational Technology Initiative Year Two Evaluation: Program Implementation, Computer Acquisition and Placement, and Computer Use.

    ERIC Educational Resources Information Center

    Mergendoller, John R.; And Others

    This evaluation report describes program implementation, computer acquisition and placement, and computer use during the second year (1991-92) of the Utah Educational Technology Initiative (ETI). In addition, it discusses the various ways computers are used in Utah schools and reports the opinions and experiences of ETI coordinators in the 12…

  19. Data Management Standards in Computer-aided Acquisition and Logistic Support (CALS)

    NASA Technical Reports Server (NTRS)

    Jefferson, David K.

    1990-01-01

    Viewgraphs and discussion on data management standards in computer-aided acquisition and logistic support (CALS) are presented. CALS is intended to reduce cost, increase quality, and improve timeliness of weapon system acquisition and support by greatly improving the flow of technical information. The phase 2 standards, industrial environment, are discussed. The information resource dictionary system (IRDS) is described.

  20. Computer Based Data Acquisition in the Undergraduate Lab.

    ERIC Educational Resources Information Center

    Wepfer, William J.; Oehmke, Roger L. T.

    1987-01-01

    Describes a data acquisition system developed for an undergraduate engineering students' instructional laboratory at Georgia Tech. Special emphasis is placed on the design of an A/D Converter Board used to measure the viscosity and temperature of motor oil. The Simons' BASIC Program Listing for the Commodore 64 microcomputer is appended. (LRW)

  1. Micro-Computer Video Games and Spatial Visualization Acquisition.

    ERIC Educational Resources Information Center

    Lowery, Bennie R.; Knirk, Frederick G.

    1982-01-01

    Discusses the impact and effects of many hours of interaction with computerized video games on the acquisition and development of spatial visualization skills and their relationship to mathematical and scientific aptitude. Sex differences in spatial ability and learning of spatial visualization skills are discussed, and references are listed. (EAO)

  2. Computer system design description for the spare pump mini-dacs data acquisition and control system

    SciTech Connect

    Vargo, G.F. Jr.

    1994-09-29

    The attached document outlines the computer software design for the mini data acquisition and control system (DACS) that supports the testing of the spare pump for Tank 241-SY-101 at the maintenance and storage facility (MASF).

  3. Resonator stabilization, data acquisition and computing on a laser gyro

    NASA Astrophysics Data System (ADS)

    Siol, G.

    1983-03-01

    In order to make the resonator length of the laser gyro insensitive to thermal or mechanical perturbations, a control loop, consisting of piezodrive, optical sensors, readout electronics, and phase sensitive demodulation was developed. Assembly and circuit components are described. Test results indicate excellent perturbation suppression for frequencies below 1 Hz. The acquisition system for the data of the stabilized gyro consists of a multichannel analyzer, floppy disk unit, and controller. Data processing programs in FORTRAN, BASIC and Assembler are given. The system is shown to be particularly useful for large data series and day-to-day measurements.
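    As a rough illustration of the phase-sensitive demodulation used in such a length-stabilization loop, the sketch below recovers the amplitude of a small modulation buried in noise by multiplying with a reference and low-pass filtering (here a simple mean). All signal parameters are invented for the example and are not taken from the report.

```python
import numpy as np

fs = 10_000.0              # sample rate (Hz), illustrative
f_mod = 100.0              # modulation frequency applied to the piezo drive
t = np.arange(0, 1.0, 1.0 / fs)

# Simulated detector signal: a small in-phase component plus noise
amplitude = 0.02
signal = amplitude * np.sin(2 * np.pi * f_mod * t) + 0.2 * np.random.randn(t.size)

# Phase-sensitive demodulation: multiply by the reference and average.
# The factor 2 restores the amplitude of the in-phase component.
reference = np.sin(2 * np.pi * f_mod * t)
recovered = 2.0 * np.mean(signal * reference)

print(f"recovered in-phase amplitude ~ {recovered:.3f} (true value {amplitude})")
```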

  4. Resonator stabilization, data acquisition and computing on a laser gyro

    NASA Astrophysics Data System (ADS)

    Siol, G.

    1981-04-01

    In order to make the resonator length of the laser gyro insensitive to thermal or mechanical perturbations, a control loop, consisting of piezodrive, optical sensors, readout electronics, and phase sensitive demodulation was developed. The problems encountered, assembly, and the circuit components are described. Test results indicate excellent perturbation suppression for frequencies below 1 Hz. The acquisition system for the data of the stabilized gyro is described. It consists of a multichannel analyzer, floppy disk unit, and controller. The various programs for the data processing in FORTRAN, BASIC and Assembler are given. The system is shown to be particularly useful for large data series and day-to-day measurements.

  5. Adult Learning in a Computer-Based ESL Acquisition Program

    ERIC Educational Resources Information Center

    Sanchez, Karen Renee

    2013-01-01

    This study explores the self-efficacy of students learning English as a Second Language on the computer-based Rosetta Stone program. The research uses a qualitative approach to explore how a readily available computer-based learning program, Rosetta Stone, can help adult immigrant students gain some English competence and so acquire a greater…

  6. Rural women with chronic illness: computer use and skill acquisition.

    PubMed

    Weinert, Clarann; Hill, Wade G

    2005-01-01

    Chronically ill rural women must manage complex illness without easy access to health care resources including support and health information. The Women to Women project is a technology-based program with an overarching aim to assist rural women in the day-to-day management of their illnesses. An important aspect of the Women to Women program is teaching the women how to use the Internet to meet their support and informational needs. The purposes of this article are to examine changes in 1) the level of computer skills, 2) degree of comfort in using the computer, and 3) knowledge of Internet functions for the participants in the Women to Women computer-based intervention. Results of the initial analysis of data from 63 women (intervention group n = 29, control group n = 34) indicate that women participating in the intervention reported greater computer skills and computer comfort and greater knowledge of specific aspects of Internet use than women in the control group. These findings were further strengthened considering that intervention and control group differentials were sustained 8 months after the end of the women's participation in the computer intervention. With the attainment of computer and Internet skills, it is expected that these rural women will have a sustained ability to access quality Internet information that will allow them to better manage and adapt to their chronic illnesses. PMID:16165009

  7. The Processing Cost of Reference Set Computation: Acquisition of Stress Shift and Focus

    ERIC Educational Resources Information Center

    Reinhart, Tanya

    2004-01-01

    Reference set computation -- the construction of a (global) comparison set to determine whether a given derivation is appropriate in context -- comes with a processing cost. I argue that this cost is directly visible at the acquisition stage: In those linguistic areas in which it has been independently established that such computation is indeed…

  8. The Advantages and Disadvantages of Computer Technology in Second Language Acquisition

    ERIC Educational Resources Information Center

    Lai, Cheng-Chieh; Kritsonis, William Allan

    2006-01-01

    The purpose of this article is to discuss the advantages and disadvantages of computer technology and Computer Assisted Language Learning (CALL) programs for current second language learning. According to the National Clearinghouse for English Language Acquisition & Language Instruction Educational Programs' report (2002), more than nine million…

  9. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems, non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex, by generating numerous large files from programs coded in FORTRAN that are required for the real time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  10. Improved understanding of aftershock triggering by waveform detection of aftershocks with GPU computing

    NASA Astrophysics Data System (ADS)

    Peng, Z.; Meng, X.; Hong, B.; Yu, X.

    2012-12-01

    Large shallow earthquakes are generally followed by increased seismic activities around the mainshock rupture zone, known as "aftershocks". Whether static or dynamic triggering is responsible for triggering aftershocks is still under debate. However, aftershocks listed in standard earthquake catalogs are generally incomplete immediately after the mainshock, which may result in inaccurate estimation of seismic rate changes. Recent studies have used waveforms of existing earthquakes as templates to scan through continuous waveforms to detect potential missing aftershocks, which is termed the 'matched filter technique'. However, this kind of data mining is computationally intensive, which raises new challenges when applied to large data sets with tens of thousands of templates, hundreds of seismic stations and years of continuous waveforms. The waveform matched filter technique exhibits parallelism at multiple levels, which allows us to use GPU-based computation to achieve significant acceleration. By dividing the procedure into several routines and processing them in parallel, we have achieved a ~40 times speedup for one Nvidia GPU card compared to sequential CPU code, and further scaled the code to multiple GPUs. We apply this parallelized code to detect potential missing aftershocks around the 2003 Mw 6.5 San Simeon and 2004 Mw 6.0 Parkfield earthquakes in Central California, and around the 2010 Mw 7.2 El Mayor-Cucapah earthquake in southern California. In all these cases, we can detect several tens of times more earthquakes immediately after the mainshocks as compared with those listed in the catalogs. These newly identified earthquakes are revealing new information about the physical mechanisms responsible for triggering aftershocks in the near field. We plan to improve our code so that it can be executed on large-scale GPU clusters. Our work has the long-term goal of developing scalable methods for seismic data analysis in the context of "Big Data" challenges.
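    A minimal, CPU-only sketch of the matched-filter idea described above: a template waveform is slid across continuous data and a normalized correlation coefficient is computed at each lag, with peaks above a threshold flagging candidate detections. This is a toy illustration, not the authors' GPU implementation, and the waveform and noise level are invented.

```python
import numpy as np

def matched_filter(continuous, template):
    """Normalized cross-correlation of a template with continuous data.

    Returns one correlation coefficient per lag.
    """
    continuous = np.asarray(continuous, dtype=float)
    template = np.asarray(template, dtype=float)
    n = template.size
    template = (template - template.mean()) / (template.std() * n)
    coeffs = []
    for lag in range(continuous.size - n + 1):
        window = continuous[lag:lag + n]
        std = window.std()
        if std == 0.0:
            coeffs.append(0.0)
            continue
        window = (window - window.mean()) / std
        coeffs.append(float(np.dot(template, window)))
    return np.array(coeffs)

# Toy example: a template waveform hidden in noise starting at sample 300
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 4 * np.pi, 50))
data = 0.3 * rng.standard_normal(1000)
data[300:350] += template
cc = matched_filter(data, template)
print("best match at lag", int(np.argmax(cc)), "with coefficient", round(cc.max(), 2))
```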

  11. Computer-assisted knowledge acquisition for hypermedia systems

    NASA Technical Reports Server (NTRS)

    Steuck, Kurt

    1990-01-01

    The usage of procedural and declarative knowledge to set up the structure or 'web' of a hypermedia environment is described. An automated knowledge acquisition tool was developed that helps a knowledge engineer elicit and represent an expert's knowledge involved in performing procedural tasks. The tool represents both procedural and prerequisite, declarative knowledge that supports each activity performed by the expert. This knowledge is output and subsequently read by a hypertext scripting language to generate links between blank but labeled cards. Each step of the expert's activity and each piece of supporting declarative knowledge is set up as an empty node. An instructional developer can then enter detailed instructional material concerning each step and declarative knowledge into these empty nodes. Other research is also described that facilitates the translation of knowledge from one form into a form more readily usable by computerized systems.

  12. Computer-Mediated Negotiated Interaction and Lexical Acquisition

    ERIC Educational Resources Information Center

    Smith, Bryan

    2004-01-01

    This paper reports a paired-groups experimental study, which tests the Interaction Hypothesis in a computer-mediated communicative environment. Pairs of intermediate-level nonnative speakers of English (n = 24) interacted with one another in a synchronous mode over a local area network while attempting to jointly complete jigsaw and…

  13. A Computational Model of Early Argument Structure Acquisition

    ERIC Educational Resources Information Center

    Alishahi, Afra; Stevenson, Suzanne

    2008-01-01

    How children go about learning the general regularities that govern language, as well as keeping track of the exceptions to them, remains one of the challenging open questions in the cognitive science of language. Computational modeling is an important methodology in research aimed at addressing this issue. We must determine appropriate learning…

  14. A COMPUTER CONTROL AND ACQUISITION SYSTEM FOR ATOMIC ABSORPTION DATA

    EPA Science Inventory

    A system is presented that controls and acquires data from a Perkin-Elmer 603 or similar atomic absorption spectrophotometer operating in the flame mode and equipped with a 200 place auto-sampler. The hardware consists of a PDP11 computer with minimum peripheral equipment and a s...

  15. Computer-based video instructions for acquisition of technical skills.

    PubMed

    Dubrowski, Adam; Xeroulis, George

    2005-12-01

    This study aimed to assess which type of information presented in an interactive computer-based video instruction was most frequently used by novice medical students during a 1-hour training session in instrument suturing and knot-tying skills. Custom-designed instructional software enabled tracking when a given segment of the video was accessed. The results suggest that, in the early stages of learning, trainees require guidance in proper looping techniques and placement of the knots. In accordance with motor learning theory, when setting up CD-ROM or Web-based curricula, instructors should, therefore, emphasize these steps during early stages of learning. PMID:16503567

  16. Computation and Communication Evaluation of an Authentication Mechanism for Time-Triggered Networked Control Systems.

    PubMed

    Martins, Goncalo; Moondra, Arul; Dubey, Abhishek; Bhattacharjee, Anirban; Koutsoukos, Xenofon D

    2016-01-01

    In modern networked control applications, confidentiality and integrity are important features to address in order to prevent against attacks. Moreover, network control systems are a fundamental part of the communication components of current cyber-physical systems (e.g., automotive communications). Many networked control systems employ Time-Triggered (TT) architectures that provide mechanisms enabling the exchange of precise and synchronous messages. TT systems have computation and communication constraints, and with the aim to enable secure communications in the network, it is important to evaluate the computational and communication overhead of implementing secure communication mechanisms. This paper presents a comprehensive analysis and evaluation of the effects of adding a Hash-based Message Authentication (HMAC) to TT networked control systems. The contributions of the paper include (1) the analysis and experimental validation of the communication overhead, as well as a scalability analysis that utilizes the experimental result for both wired and wireless platforms and (2) an experimental evaluation of the computational overhead of HMAC based on a kernel-level Linux implementation. An automotive application is used as an example, and the results show that it is feasible to implement a secure communication mechanism without interfering with the existing automotive controller execution times. The methods and results of the paper can be used for evaluating the performance impact of security mechanisms and, thus, for the design of secure wired and wireless TT networked control systems. PMID:27463718
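    For readers unfamiliar with HMAC, the fragment below shows the kind of per-message authentication tag whose cost the paper evaluates, using Python's standard hmac library. The key, message layout, and tag truncation length are illustrative assumptions, not the authors' parameters.

```python
import hmac
import hashlib

SHARED_KEY = b"replace-with-a-provisioned-secret"   # illustrative key only

def tag_message(payload: bytes, truncate_to: int = 8) -> bytes:
    """Append a (truncated) HMAC-SHA256 tag to a time-triggered message payload."""
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + tag[:truncate_to]

def verify_message(frame: bytes, truncate_to: int = 8) -> bool:
    """Recompute the tag on the receiver and compare it in constant time."""
    payload, tag = frame[:-truncate_to], frame[-truncate_to:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()[:truncate_to]
    return hmac.compare_digest(tag, expected)

frame = tag_message(b"\x01\x02steering-angle=12.5")
print(verify_message(frame))                                   # True
print(verify_message(frame[:-1] + bytes([frame[-1] ^ 0xFF])))  # False (tampered tag)
```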

  17. Behavioral and computational aspects of language and its acquisition

    NASA Astrophysics Data System (ADS)

    Edelman, Shimon; Waterfall, Heidi

    2007-12-01

    One of the greatest challenges facing the cognitive sciences is to explain what it means to know a language, and how the knowledge of language is acquired. The dominant approach to this challenge within linguistics has been to seek an efficient characterization of the wealth of documented structural properties of language in terms of a compact generative grammar-ideally, the minimal necessary set of innate, universal, exception-less, highly abstract rules that jointly generate all and only the observed phenomena and are common to all human languages. We review developmental, behavioral, and computational evidence that seems to favor an alternative view of language, according to which linguistic structures are generated by a large, open set of constructions of varying degrees of abstraction and complexity, which embody both form and meaning and are acquired through socially situated experience in a given language community, by probabilistic learning algorithms that resemble those at work in other cognitive modalities.

  18. Examining the acquisition of phonological word forms with computational experiments.

    PubMed

    Vitevitch, Michael S; Storkel, Holly L

    2013-12-01

    It has been hypothesized that known words in the lexicon strengthen newly formed representations of novel words, resulting in words with dense neighborhoods being learned more quickly than words with sparse neighborhoods. Tests of this hypothesis in a connectionist network showed that words with dense neighborhoods were learned better than words with sparse neighborhoods when the network was exposed to the words all at once (Experiment 1), or gradually over time, like human word-learners (Experiment 2). This pattern was also observed despite variation in the availability of processing resources in the networks (Experiment 3). A learning advantage for words with sparse neighborhoods was observed only when the network was initially exposed to words with sparse neighborhoods and exposed to dense neighborhoods later in training (Experiment 4). The benefits of computational experiments for increasing our understanding of language processes and for the treatment of language processing disorders are discussed. PMID:24597275

  19. Automatic data-acquisition and communications computer network for fusion experiments

    SciTech Connect

    Kemper, C.O.

    1981-01-01

    A network of more than twenty computers serves the data acquisition, archiving, and analysis requirements of the ISX, EBT, and beam-line test facilities at the Fusion Division of Oak Ridge National Laboratory. The network includes PDP-8, PDP-12, PDP-11, PDP-10, and Interdata 8-32 processors, and is unified by a variety of high-speed serial and parallel communications channels. While some processors are dedicated to experimental data acquisition, and others are dedicated to later analysis and theoretical work, many processors perform a combination of acquisition, real-time analysis and display, and archiving and communications functions. A network software system has been developed which runs in each processor and automatically transports data files from point of acquisition to point or points of analysis, display, and storage, providing conversion and formatting functions as required.

  20. Experimental validation of A-mode ultrasound acquisition system for computer assisted orthopaedic surgery

    NASA Astrophysics Data System (ADS)

    De Lorenzo, Danilo; De Momi, Elena; Beretta, Elisa; Cerveri, Pietro; Perona, Franco; Ferrigno, Giancarlo

    2009-02-01

    Computer Assisted Orthopaedic Surgery (CAOS) systems improve the results and the standardization of surgical interventions. Detection of anatomical landmarks and bone surfaces is needed both to register the surgical space with the pre-operative imaging space and to compute biomechanical parameters for prosthesis alignment. Surface point acquisition increases the invasiveness of the intervention and can be influenced by the interposed soft tissue layer (7-15 mm localization errors). This study is aimed at evaluating the accuracy of a custom-made A-mode ultrasound (US) system for non-invasive detection of anatomical landmarks and surfaces. A-mode solutions eliminate the need for US image segmentation, offer real-time signal processing and require less invasive equipment. The system consists of an optically tracked single-transducer US probe, a pulser/receiver, an FPGA-based board responsible for logic control command generation and for real-time signal processing, and three custom-made boards (signal acquisition, blanking and synchronization). We propose a new calibration method for the US system. The experimental validation was then performed by measuring the length of known-shape polymethylmethacrylate boxes filled with pure water and by acquiring bone surface points on a bovine bone phantom covered with soft-tissue-mimicking materials. Measurement errors were computed from MR and CT image acquisitions of the phantom. Bone surface point acquisition with the US system demonstrated lower errors (1.2 mm) than standard pointer acquisition (4.2 mm).
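    At its core, A-mode ranging reduces to a time-of-flight conversion; the minimal sketch below illustrates that step. The speed of sound is an assumed nominal soft-tissue value rather than the calibration described in the paper.

```python
# Minimal A-mode range calculation: the echo travels to the interface and
# back, so depth = c * t / 2.  The speed of sound below is an assumed
# nominal soft-tissue value; a real system would calibrate it.

SPEED_OF_SOUND_M_PER_S = 1540.0   # assumed nominal value for soft tissue

def echo_depth_mm(round_trip_time_us: float) -> float:
    """Convert a round-trip echo time (microseconds) into depth (mm)."""
    round_trip_time_s = round_trip_time_us * 1e-6
    depth_m = SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0
    return depth_m * 1000.0

print(echo_depth_mm(26.0))   # ~20 mm of tissue for a 26 microsecond round trip
```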

  1. The Upgrade Path from Legacy VME to VXS Dual Star Connectivity for Large Scale Data Acquisition and Trigger Systems

    SciTech Connect

    Cuevas, C; Barbosa, F J; Dong, H; Gu, W; Jastrzembski, E; Kaneta, S R; Moffitt, B; Nganga, N; Raydo, B J; Somov, A; Taylor, W M; Wilson, J

    2011-10-01

    New instrumentation modules have been designed by Jefferson Lab to take advantage of the higher performance and elegant backplane connectivity of the VITA 41 VXS standard. These new modules are required to meet the 200 kHz trigger rates envisioned for the 12 GeV experimental program. Upgrading legacy VME designs to the high-speed gigabit serial extensions that VXS offers comes with significant challenges, including electronic engineering design plus firmware and software development issues. This paper will detail our system design approach, including the critical system requirement stages, and explain the pipeline design techniques and the selection criteria for the FPGAs that require embedded gigabit serial transceivers. The entire trigger system is synchronous and operates on a 250 MHz clock, with synchronization signals and the global trigger signals distributed to each front-end readout crate via the second switch slot in the 21-slot, dual-star VXS backplane. The readout of the buffered detector signals relies on 2eSST over the standard VME64x path at >200 MB/s. We have achieved a 20 Gb/s transfer rate of trigger information within one VXS crate and will present results using production modules in a two-crate test configuration with both VXS crates fully populated. The VXS trigger modules that reside in the front-end crates will be ready for production orders by the end of the 2011 fiscal year. VXS global trigger modules are in the design stage now, and will be completed to meet the installation schedule for the 12 GeV physics program.

  2. Automation and Schema Acquisition in Learning Elementary Computer Programming: Implications for the Design of Practice.

    ERIC Educational Resources Information Center

    Van Merrienboer, Jeroen J. G.; Paas, Fred G. W. C.

    1990-01-01

    Discussion of computer programing at the secondary level focuses on automation and schema acquisition as two processes important in learning cognitive skills such as programing. Their effects on learning outcomes and transfer of training are examined, the importance of worked examples is highlighted, and instructional design principles are…

  3. General purpose interface bus for personal computers used in wind tunnel data acquisition

    NASA Technical Reports Server (NTRS)

    Puram, Chith K.

    1993-01-01

    The use of the general purpose interface bus IEEE-488 to meet the special requirements of wind tunnel testing involving PCs is discussed. The gearing of instrumentation to minimize test time, the choice of software to meet computer memory constraints, the use of graphics for improved use of tunnel run time, and the choice of data acquisition equipment to remove bottlenecks are addressed.

  4. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  5. Acquisition and Generalization of Purchasing Skills Using a Video Enhanced Computer-Based Instructional Program.

    ERIC Educational Resources Information Center

    Ayres, Kevin M.; Langone, John

    2002-01-01

    Three elementary students with mental retardation used a computer-based instructional package to practice purchasing skills and the dollar plus strategy. The instructional package utilized video footage and a constant time delay procedure to facilitate skill acquisition. Skills did not generalize, although changes in purchasing behavior in the…

  6. 77 FR 44064 - Federal Acquisition Regulation: Clarification of Standards for Computer Generation of Forms

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-26

    .... Background DoD, GSA, and NASA published a proposed rule in the Federal Register at 76 FR 79609 on December 22... Federal Register at 73 FR 51276 on September 2, 2008, by the Department of Commerce. This FIPS requirement... Federal Acquisition Regulation: Clarification of Standards for Computer Generation of Forms...

  7. 76 FR 79609 - Federal Acquisition Regulation; Clarification of Standards for Computer Generation of Forms

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... 161. FIPS 161 is being removed based on the notice posted in the Federal Register (73 FR 51276) on... Federal Acquisition Regulation; Clarification of Standards for Computer Generation of Forms AGENCY... clarifying the use of American National Standards Institute X12, as the valid standard to use for...

  8. Computer-Based Acquisitions Procedures at Tarrant County Junior College District.

    ERIC Educational Resources Information Center

    Corbin, John, Ed.

    1974-01-01

    The computer-based procedures described in this report form the basis of book acquisitions performed by the Automation and Technical Services Division in serving the Learning Resources Centers of the multi-campus Tarrant County Junior College District. The procedures, which are off-line in a batch mode, have been operational since 1968. Since 1970…

  9. Computer software design description for the integrated control and data acquisition system LDUA system

    SciTech Connect

    Aftanas, B.L.

    1998-08-12

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components.

  10. Vocabulary acquisition and verbal short-term memory: computational and neural bases.

    PubMed

    Gupta, P; MacWhinney, B

    1997-09-01

    In this paper, we explore the hypothesis that human vocabulary acquisition processes and verbal short-term memory abilities utilize a common cognitive and neural system. We begin by reviewing behavioral evidence for a shared set of processes. Next, we examine what the computational bases of such a shared system might be and how vocabulary acquisition and verbal short-term memory might be related in mechanistic terms. We examine existing computational models of vocabulary acquisition and of verbal short-term memory, concluding that they fail to adequately relate these two domains. We then propose an alternative model which accounts not only for the relationship between word learning and verbal short-term memory, but also for a wide range of phenomena in verbal short-term memory. Furthermore, this new account provides a clear statement of the relationship between the proposed system and mechanisms of language processing more generally. We then consider possible neural substrates for this cognitive system. We begin by reviewing what is known of the neural substrates of speech processing and outline a conceptual framework within which a variety of seemingly contradictory neurophysiological and neuropsychological findings can be accommodated. The linkage of the shared system for vocabulary acquisition and verbal short-term memory to neural areas specifically involved in speech processing lends further support to our functional-level identification of the mechanisms of vocabulary acquisition and verbal short-term memory with those of language processing. The present work thus relates vocabulary acquisition and verbal short-term memory to each other and to speech processing, at a cognitive, computational, and neural level. PMID:9299067

  11. Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire

    NASA Astrophysics Data System (ADS)

    Gladyshev, Pavel; Almansoori, Afrah

    RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.

  12. The digital data acquisition chain and the cosmic ray trigger system for the SLD Warm Iron Calorimeter

    SciTech Connect

    Benvenuti, A.; Piemontese, L.; Calcaterra, A.; De Sangro, R.; De Simone, P.; Burrows, P.N.; Cartwright, S.L.; Gonzales, S.; Lath, A.; Schneekloth, U.; Williams, D.C.; Yamartino, J.M.; Bacchetta, N.; Bisello, D.; Castro, A.; Galvagni, S.; Loreti, M.; Pescara, L.; Wyss, J.; Alpat, B.; Bilei, G.M.; Checcucci, B.; Dell'Orso, R.; Pauluzzi, M.; Servoli, L.; Carpinelli, M.; Castaldi, R.; Cazzola, U.; Vannini, C.; Verdini, P.G.; Messn

    1989-08-01

    The entire data-acquisition chain, from the custom-made front-end electronics to the Fastbus readout and data-reduction module, for the digital readout of the SLD limited streamer tube Warm Iron Calorimeter and Muon Identifier is described. Also described is a Fastbus Cosmic Logic Unit being developed to achieve the capability of reading cosmic ray events, also during the inter-crossing time, for apparatus monitoring and calibration purposes. 9 refs., 9 figs.

  13. High Speed Data Acquisition System for Three-Dimensional X-Ray and Neutron Computed Tomography

    SciTech Connect

    Davis, A.W.; Claytor, T.N.; Sheats, M.J.

    1999-07-01

    Computed tomography for nondestructive evaluation applications has been limited by system cost, resolution, and time requirements for three-dimensional data sets. FlashCT (Flat panel Amorphous Silicon High-Resolution Computed Tomography) is a system developed at Los Alamos National Laboratory to address these three problems. Developed around a flat panel amorphous silicon detector array, FlashCT is suitable for low to medium energy x-ray and neutron computed tomography at 127-micron resolution. Overall system size is small, allowing rapid transportation to a variety of radiographic sources. System control software was developed in LabVIEW for Windows NT to allow multithreading of data acquisition, data correction, and staging motor control. The system control software simplifies data collection and allows fully automated control of the data acquisition process, leading toward remote or unattended operation. The first generation of the FlashCT Data Acquisition System was completed in August 1998, and since that time the system has been tested using x-ray sources ranging in energy from 60 kV to 20 MV. The system has also been used to collect data for thermal neutron computed tomography at Los Alamos Neutron Science Center (LANSCE). System improvements have been proposed to provide faster data collection and greater dynamic range during data collection.

  14. Step-and-shoot data acquisition and reconstruction for cardiac x-ray computed tomography

    SciTech Connect

    Hsieh Jiang; Londt, John; Vass, Melissa; Li, Jay; Tang Xiangyang; Okerlund, Darin

    2006-11-15

    Coronary artery imaging with x-ray computed tomography (CT) is one of the most recent advancements in CT clinical applications. Although existing "state-of-the-art" clinical protocols today utilize helical data acquisition, this approach suffers from a limited ability to handle irregular heart rates and from a relatively high x-ray dose to patients. In this paper, we propose a step-and-shoot data acquisition protocol that significantly overcomes these shortcomings. The key to the proposed protocol is the large volume coverage (40 mm) enabled by the cone beam CT scanner, which allows the coverage of the entire heart in 3 to 4 steps. In addition, we propose a gated complementary reconstruction algorithm that overcomes the longitudinal truncation problem resulting from the cone beam geometry. Computer simulations, phantom experiments, and clinical studies were conducted to validate our approach.

  15. Detector array control and triggering

    SciTech Connect

    Aiello, S.; Anzalone, A.; Bartolucci, M. |

    1998-08-01

    A commercial DSP-based board installed in a host PC was employed for the fast, on-line, real-time computation of special algorithms, in order to perform event selection and operate as a 2nd-level trigger. Moreover, an ad hoc interface built with PLDs to connect the DSP board to the ADCs and to the data acquisition system has been tested, in order to evaluate the performance of these programmable devices when used as a look-up table and as the decision stage of a 1st-level trigger.
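    A look-up-table trigger of the kind implemented in the PLDs can be pictured as a precomputed array indexed by a digitized detector word; the sketch below is a purely illustrative software analogue, with an invented 8-bit hit pattern and multiplicity cut rather than the detector's actual trigger condition.

```python
import numpy as np

# Build a 1st-level-trigger look-up table: for every possible 8-bit hit
# pattern, precompute whether the event passes (here: at least 3 channels
# fired).  The pattern width and multiplicity cut are illustrative only.
N_BITS = 8
lut = np.array([bin(pattern).count("1") >= 3 for pattern in range(2 ** N_BITS)],
               dtype=bool)

def first_level_trigger(hit_pattern: int) -> bool:
    """Single-lookup trigger decision, analogous to what a PLD LUT provides."""
    return bool(lut[hit_pattern])

print(first_level_trigger(0b00010110))  # True: three channels fired
print(first_level_trigger(0b00000010))  # False: only one channel fired
```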

  16. GPUs for real-time processing in HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Deri, L.; Fiorini, M.; Frezza, O.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Messina, A.; Sozzi, M.; Pantaleo, F.; Paolucci, Ps; Rossetti, D.; Simula, F.; Tosoratto, L.; Vicini, P.; Gap Collaboration

    2014-06-01

    We describe a pilot project (GAP - GPU Application Project) for the use of GPUs (Graphics processing units) for online triggering applications in High Energy Physics experiments. Two major trends can be identified in the development of trigger and DAQ systems for particle physics experiments: the massive use of general-purpose commodity systems such as commercial multicore PC farms for data acquisition, and the reduction of trigger levels implemented in hardware, towards a fully software data selection system ("trigger-less"). The innovative approach presented here aims at exploiting the parallel computing power of commercial GPUs to perform fast computations in software not only at high trigger levels but also in early trigger stages. General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughputs, the use of such devices for real-time applications in high energy physics data acquisition and trigger systems is becoming relevant. We discuss in detail the use of online parallel computing on GPUs for synchronous low-level triggers with fixed latency. In particular we show preliminary results on a first test in the CERN NA62 experiment. The use of GPUs in high level triggers is also considered, the CERN ATLAS experiment being taken as a case study of possible applications.
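    As a schematic of the software-based selection the project targets, the sketch below applies a vectorized per-event cut to a batch of events; on a GPU the same data-parallel pattern would run with one thread per event. The energy observable and threshold are invented for illustration and are not taken from NA62 or ATLAS.

```python
import numpy as np

def software_trigger(event_energies_gev, threshold_gev=5.0):
    """Return the indices of events passing a simple energy cut.

    Each event is tested independently, so the operation is embarrassingly
    data-parallel, which is what makes it a natural fit for GPU threads.
    """
    energies = np.asarray(event_energies_gev, dtype=float)
    return np.nonzero(energies > threshold_gev)[0]

batch = np.array([1.2, 7.8, 4.9, 12.3, 0.4])
print(software_trigger(batch))   # -> [1 3]
```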

  17. GPUs for real-time processing in HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Lamanna, G.; Ammendola, R.; Bauce, M.; Biagioni, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Graverini, E.; Lamanna, G.; Lonardo, A.; Messina, A.; Pantaleo, F.; Paolucci, P. S.; Piandani, R.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.

    2014-06-01

    We describe a pilot project for the use of Graphics Processing Units (GPUs) for online triggering applications in High Energy Physics (HEP) experiments. Two major trends can be identified in the development of trigger and DAQ systems for HEP experiments: the massive use of general-purpose commodity systems such as commercial multicore PC farms for data acquisition, and the reduction of trigger levels implemented in hardware, towards a pure software selection system (trigger-less). The very innovative approach presented here aims at exploiting the parallel computing power of commercial GPUs to perform fast computations in software both at low- and high-level trigger stages. General-purpose computing on GPUs is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughputs, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming very attractive. We discuss in detail the use of online parallel computing on GPUs for synchronous low-level triggers with fixed latency. In particular we show preliminary results on a first test in the NA62 experiment at CERN. The use of GPUs in high-level triggers is also considered; the ATLAS experiment (and in particular the muon trigger) at CERN will be taken as a case study of possible applications.

  18. Exploitation of realistic computational anthropomorphic phantoms for the optimization of nuclear imaging acquisition and processing protocols.

    PubMed

    Loudos, George K; Papadimitroulas, Panagiotis G; Kagadis, George C

    2014-01-01

    Monte Carlo (MC) simulations play a crucial role in nuclear medical imaging since they can provide the ground truth for clinical acquisitions, by integrating and quantifying all physical parameters that affect image quality. Over the last decade a number of realistic computational anthropomorphic models have been developed to serve imaging, as well as other biomedical engineering applications. The combination of MC techniques with realistic computational phantoms can provide a powerful tool for pre- and post-processing in imaging, data analysis and dosimetry. This work aims to create a global database for simulated Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) exams; the methodology, as well as its first elements, is presented. Simulations are performed using the well-validated open-source GATE toolkit, standard anthropomorphic phantoms and activity distributions of various radiopharmaceuticals derived from the literature. The resulting images, projections and sinograms of each study are provided in the database and can be further exploited to evaluate processing and reconstruction algorithms. Patient studies with different characteristics are included in the database and different computational phantoms were tested for the same acquisitions. These include the XCAT, Zubal and the Virtual Family phantoms, some of which are used for the first time in nuclear imaging. The created database will be freely available and our current work is towards its extension by simulating additional clinical pathologies. PMID:25570355

  19. Graphics Processing Units for HEP trigger systems

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.; Neri, I.; Paolucci, P. S.; Piandani, R.; Pontisso, L.; Rescigno, M.; Simula, F.; Sozzi, M.; Vicini, P.

    2016-07-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPUs for a synchronous low-level trigger, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher-level trigger systems is also briefly considered.

  20. Big Dee upgrade of the Doublet III diagnostic data acquisition computer system

    SciTech Connect

    McHarg, B.B. Jr.

    1983-12-01

    The Big Dee upgrade of the Doublet III tokamak facility will begin operation in 1986 with an initial quantity of data expected to be 10 megabytes per shot and eventually attaining 20 to 25 megabytes per shot. This is in comparison to the 4 to 5 megabytes of data currently acquired. To handle this greater quantity of data and to serve physics needs for significantly improved between-shot processing of data will require a substantial upgrade of the existing data acquisition system. The key points of the philosophy that have been adopted for the upgraded system to handle the greater quantity of data are (1) preserve existing hardware; (2) preserve existing software; (3) configure the system in a modular fashion; and (4) distribute the data acquisition over multiple computers. The existing system using ModComp CLASSIC 16 bit minicomputers is capable of handling 5 megabytes of data per shot.

  1. Computer control and data acquisition system for the R. F. Test Facility

    SciTech Connect

    Stewart, K.A.; Burris, R.D.; Mankin, J.B.; Thompson, D.H.

    1986-01-01

    The Radio Frequency Test Facility (RFTF) at Oak Ridge National Laboratory, used to test and evaluate high-power ion cyclotron resonance heating (ICRH) systems and components, is monitored and controlled by a multicomponent computer system. This data acquisition and control system consists of three major hardware elements: (1) an Allen-Bradley PLC-3 programmable controller; (2) a VAX 11/780 computer; and (3) a CAMAC serial highway interface. Operating in LOCAL as well as REMOTE mode, the programmable logic controller (PLC) performs all the control functions of the test facility. The VAX computer acts as the operator's interface to the test facility by providing color mimic panel displays and allowing input via a trackball device. The VAX also provides archiving of trend data acquired by the PLC. Communications between the PLC and the VAX are via the CAMAC serial highway. Details of the hardware, software, and the operation of the system are presented in this paper.

  2. Acquisition of electroencephalographic data in a large regional hospital - Bringing the brain waves to the computer.

    NASA Technical Reports Server (NTRS)

    Low, M. D.; Baker, M.; Ferguson, R.; Frost, J. D., Jr.

    1972-01-01

    This paper describes a complete electroencephalographic acquisition and transmission system, designed to meet the needs of a large hospital with multiple critical care patient monitoring units. The system provides rapid and prolonged access to a centralized recording and computing area from remote locations within the hospital complex, and from locations in other hospitals and other cities. The system includes quick-on electrode caps, amplifier units and cable transmission for access from within the hospital, and EEG digitization and telephone transmission for access from other hospitals or cities.

  3. Computer system design description for SY-101 hydrogen mitigation test project data acquisition and control system (DACS-1). Revision 1

    SciTech Connect

    Truitt, R.W.

    1994-08-24

    This document provides descriptions of components and tasks that are involved in the computer system for the data acquisition and control of the mitigation tests conducted on waste tank SY-101 at the Hanford Nuclear Reservation. The system was designed and implemented by Los Alamos National Laboratory and supplied to Westinghouse Hanford Company. The computers (both personal computers and specialized data-taking computers) and the software programs of the system will hereafter collectively be referred to as the DACS (Data Acquisition and Control System).

  4. A computer-controlled, on-board data acquisition system for wind-tunnel testing

    NASA Technical Reports Server (NTRS)

    Finger, H. J.; Cambra, J. M.

    1974-01-01

    A computer-controlled data acquisition system has been developed for the 40x80-foot wind tunnel at Ames Research Center. The system, consisting of several small onboard units installed in the model and a data-managing, data-displaying ground station, is capable of sampling up to 256 channels of raw data at a total sample rate of 128,000 samples/sec. Complete signal conditioning is contained within the on-board units. The sampling sequence and channel gain selection is completely random and under total control of the ground station. Outputs include a bar-graph display, digital-to-analog converters, and digital interface to the tunnel's central computer, an SEL 840MP. The system can be run stand-alone or under the control of the SEL 840MP.
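    The quoted aggregate rate implies a per-channel budget that is easy to check; the short computation below simply divides the stated total sample rate by the channel count.

```python
# Per-channel sampling budget implied by the figures quoted in the abstract.
total_samples_per_s = 128_000   # aggregate rate of the on-board units
channels = 256                  # maximum number of raw-data channels

print(total_samples_per_s / channels, "samples/s per channel")  # -> 500.0
```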

  5. The economics of data acquisition computers for ST and MST radars

    NASA Technical Reports Server (NTRS)

    Watkins, B. J.

    1983-01-01

    Some low cost options for data acquisition computers for ST (stratosphere, troposphere) and MST (mesosphere, stratosphere, troposphere) radars are presented. The particular equipment discussed reflects choices made by the University of Alaska group but of course many other options exist. The low cost microprocessor and array processor approach presented here has several advantages because of its modularity. An inexpensive system may be configured for a minimum performance ST radar, whereas a multiprocessor and/or a multiarray processor system may be used for a higher performance MST radar. This modularity is important for a network of radars because the initial cost is minimized while future upgrades will still be possible at minimal expense. This modularity also aids in lowering the cost of software development because system expansions should require few software changes. The functions of the radar computer will be to obtain Doppler spectra in near real time with some minor analysis such as vector wind determination.
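    A minimal numpy sketch of the near-real-time Doppler processing mentioned above: coherently sampled complex voltages from one range gate are Fourier transformed, and the power spectrum gives the Doppler (radial-velocity) distribution. The pulse repetition frequency and the simulated Doppler shift are invented values, not parameters of the radars described.

```python
import numpy as np

prf_hz = 1000.0                      # pulse repetition frequency (assumed)
n_pulses = 256
t = np.arange(n_pulses) / prf_hz

# Simulated complex voltage series from one range gate: a 120 Hz Doppler
# line plus complex noise.
rng = np.random.default_rng(1)
doppler_hz = 120.0
voltages = np.exp(2j * np.pi * doppler_hz * t) + 0.5 * (
    rng.standard_normal(n_pulses) + 1j * rng.standard_normal(n_pulses))

# Doppler power spectrum and its frequency axis.
spectrum = np.abs(np.fft.fftshift(np.fft.fft(voltages))) ** 2
freqs = np.fft.fftshift(np.fft.fftfreq(n_pulses, d=1.0 / prf_hz))

print("peak Doppler shift ~", round(float(freqs[np.argmax(spectrum)]), 1), "Hz")
```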

  6. Computer-based supervisory control and data acquisition system for the radioactive waste evaporator

    SciTech Connect

    Pope, N.G.; Schreiber, S.B.; Yarbro, S.L.; Gomez, B.G.; Nekimken, H.L.; Sanchez, D.E.; Bibeau, R.A.; Macdonald, J.M.

    1994-12-01

    The evaporator process at TA-55 reduces the amount of transuranic liquid radioactive waste by separating radioactive salts from relatively low-level radioactive nitric acid solution. A computer-based supervisory control and data acquisition (SCADA) system has been installed on the process that allows the operators to easily interface with process equipment. Individual single-loop controllers in the SCADA system allow more precise process operation with less human intervention. With this system, process data can be archived in computer files for later analysis. Data are distributed throughout the TA-55 site through a local area network so that real-time process conditions can be monitored at multiple locations. The entire system has been built using commercially available hardware and software components.

  7. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants.

    PubMed

    Navarro, Pedro J; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-01-01

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions, and they were trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR images. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation. PMID:27164103
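    A compact scikit-learn sketch of the kind of pixel-level classification compared in the paper (kNN versus SVM, with feature standardisation). The synthetic features and labels below stand in for RGB/NIR pixel values and plant/background classes; they are not the authors' data, and the accuracies printed will differ from theirs.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic stand-in for pixel features (e.g. R, G, B intensities) and
# background (0) / plant (1) labels.
rng = np.random.default_rng(42)
background = rng.normal(loc=[60, 60, 60], scale=15, size=(500, 3))
plant = rng.normal(loc=[40, 120, 50], scale=15, size=(500, 3))
X = np.vstack([background, plant])
y = np.array([0] * 500 + [1] * 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, model in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                    ("SVM", SVC(kernel="rbf"))]:
    clf = make_pipeline(StandardScaler(), model)   # standardise, then classify
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: accuracy = {acc:.3f}")
```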

  8. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants

    PubMed Central

    Navarro, Pedro J.; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-01-01

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions, and they were trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR images. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation. PMID:27164103

  9. Peer Review-Based Scripted Collaboration to Support Domain-Specific and Domain-General Knowledge Acquisition in Computer Science

    ERIC Educational Resources Information Center

    Demetriadis, Stavros; Egerter, Tina; Hanisch, Frank; Fischer, Frank

    2011-01-01

    This study investigates the effectiveness of using peer review in the context of scripted collaboration to foster both domain-specific and domain-general knowledge acquisition in the computer science domain. Using a one-factor design with a script and a control condition, students worked in small groups on a series of computer science problems…

  10. Second Language Vocabulary Acquisition Using a Diglot Reader or a Computer-Based Drill and Practice Program

    ERIC Educational Resources Information Center

    Christensen, Elizabeth; Merrill, Paul; Yanchar, Stephen

    2007-01-01

    This research study compares the impact of a computer-based diglot reader with that of a sophisticated, computer-based, drill and practice program on second language acquisition. The affective benefits as well as depth and breadth of vocabulary development were examined. The diglot method, originally conceived by Burling, introduces second…

  11. Quantifying the image quality and dose reduction of respiratory triggered 4D cone-beam computed tomography with patient-measured breathing

    NASA Astrophysics Data System (ADS)

    Cooper, Benjamin J.; O'Brien, Ricky T.; Kipritidis, John; Shieh, Chun-Chien; Keall, Paul J.

    2015-12-01

    Respiratory triggered four dimensional cone-beam computed tomography (RT 4D CBCT) is a novel technique that uses a patient’s respiratory signal to drive the image acquisition with the goal of imaging dose reduction without degrading image quality. This work investigates image quality and dose using patient-measured respiratory signals for RT 4D CBCT simulations. Studies were performed that simulate a 4D CBCT image acquisition using both the novel RT 4D CBCT technique and a conventional 4D CBCT technique. A set containing 111 free breathing lung cancer patient respiratory signal files was used to create 111 pairs of RT 4D CBCT and conventional 4D CBCT image sets from realistic simulations of a 4D CBCT system using a Rando phantom and the digital phantom, XCAT. Each of these image sets was compared to a ground truth dataset from which a mean absolute pixel difference (MAPD) metric was calculated to quantify the degradation of image quality. The number of projections used in each simulation was counted and was assumed as a surrogate for imaging dose. Based on 111 breathing traces, when comparing RT 4D CBCT with conventional 4D CBCT, the average image quality was reduced by 7.6% (Rando study) and 11.1% (XCAT study). However, the average imaging dose reduction was 53% based on needing fewer projections (617 on average) than conventional 4D CBCT (1320 projections). These simulation studies, which used a wide range of patient-measured breathing traces, have demonstrated that the RT 4D CBCT method can potentially offer an average 53% saving in imaging dose compared to conventional 4D CBCT, with a minimal impact on image quality.
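    The two quantities compared in the study reduce to simple computations: a mean absolute pixel difference against the ground-truth reconstruction, and the projection count used as a dose surrogate. A minimal numpy sketch is given below, with random arrays standing in for reconstructed volumes; only the projection counts (617 versus 1320) come from the abstract.

```python
import numpy as np

def mean_absolute_pixel_difference(recon, ground_truth):
    """MAPD between a reconstructed volume and the ground-truth volume."""
    recon = np.asarray(recon, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    return float(np.mean(np.abs(recon - ground_truth)))

def dose_saving(projections_rt, projections_conventional):
    """Relative imaging-dose saving, using projection counts as a surrogate."""
    return 1.0 - projections_rt / projections_conventional

# Toy volumes standing in for 4D CBCT reconstructions.
rng = np.random.default_rng(0)
truth = rng.random((64, 64, 32))
rt_recon = truth + 0.01 * rng.standard_normal(truth.shape)

print("MAPD:", round(mean_absolute_pixel_difference(rt_recon, truth), 4))
print("dose saving:", round(dose_saving(617, 1320), 2))   # ~0.53, as reported
```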

  12. SU-E-J-183: Quantifying the Image Quality and Dose Reduction of Respiratory Triggered 4D Cone-Beam Computed Tomography with Patient- Measured Breathing

    SciTech Connect

    Cooper, B; OBrien, R; Kipritidis, J; Keall, P

    2014-06-01

    Purpose: Respiratory triggered four dimensional cone-beam computed tomography (RT 4D CBCT) is a novel technique that uses a patient's respiratory signal to drive the image acquisition with the goal of imaging dose reduction without degrading image quality. This work investigates image quality and dose using patient-measured respiratory signals for RT 4D CBCT simulations instead of synthetic sinusoidal signals used in previous work. Methods: Studies were performed that simulate a 4D CBCT image acquisition using both the novel RT 4D CBCT technique and a conventional 4D CBCT technique from a database of oversampled Rando phantom CBCT projections. A database containing 111 free breathing lung cancer patient respiratory signal files was used to create 111 RT 4D CBCT and 111 conventional 4D CBCT image datasets from realistic simulations of a 4D RT CBCT system. Each of these image datasets were compared to a ground truth dataset from which a root mean square error (RMSE) metric was calculated to quantify the degradation of image quality. The number of projections used in each simulation is counted and was assumed as a surrogate for imaging dose. Results: Based on 111 breathing traces, when comparing RT 4D CBCT with conventional 4D CBCT the average image quality was reduced by 7.6%. However, the average imaging dose reduction was 53% based on needing fewer projections (617 on average) than conventional 4D CBCT (1320 projections). Conclusion: The simulation studies using a wide range of patient breathing traces have demonstrated that the RT 4D CBCT method can potentially offer a substantial saving of imaging dose of 53% on average compared to conventional 4D CBCT in simulation studies with a minimal impact on image quality. A patent application (PCT/US2012/048693) has been filed which is related to this work.

  13. Design and implementation of photoelectric rotary table data acquisition and analysis system host computer software based on VC++ and MFC

    NASA Astrophysics Data System (ADS)

    Yang, Dawei; Yang, Xiufang; Han, Junfeng; Yan, Xiaoxu

    2015-02-01

    Photoelectric rotary tables are used mainly in the defense industry and military fields, and play an important role in shooting ranges, target tracking, target acquisition, and aerospace applications. To meet the field-test requirements of range photoelectric measuring equipment, and in combination with a portable photoelectric rotary table data acquisition hardware system, host computer software was developed on a VC++ programming platform with an MFC-based user interface to perform data acquisition, analysis, processing, and debugging control for the photoelectric turntable. The host computer software design covers serial communication and its protocol, real-time data acquisition and display, real-time data curve drawing, analog acquisition, a debugging guide, and an error analysis program, and the specific design method of each is given. Finally, experimental results obtained with the photoelectric rotary table data acquisition hardware system show that the host computer software handles data transmission with the lower machine, data acquisition, control, and analysis well and achieves the desired effect; the entire software system runs stably and flexibly, with strong practicality, reliability, and good scalability.

  14. On-line acquisition of data from Raman and infrared spectrometers with a time-sharing computer.

    PubMed

    Scherer, J R; Kint, S

    1970-07-01

    Considerations leading to the interface and software design of an on-line data acquisition system are presented. The system makes extensive use of conversational-mode programming, and the nature and extent of the operator-computer program interactions are discussed. The system employs closed-loop control of the acquisition process with the possibility of operator intervention at any stage. Control actions and data reduction may be initiated asynchronously to the data acquisition process. An illustration of the use of the system for correcting relative Raman intensities is given. PMID:20076431

  15. An interactive computer simulator of the circulation for knowledge acquisition in cardio-anesthesia.

    PubMed

    Popp, H J; Schecke, T; Rau, G; Käsmacher, H; Kalff, G

    1991-01-01

    Knowledge-based decision support systems for use in cardio-anesthesia can provide online support to the anesthesiologist by generating intelligent alarms. However, the acquisition and validation of a consistent knowledge base for this application poses problems related to the transfer of clinical experience into a rule system. An interactive simulator of the human circulation is presented that supports the process of knowledge acquisition and testing. The simulator can be controlled in real time by an anesthesiologist during the simulation run, thus providing a basis for interdisciplinary discussion of routine as well as critical situations. The output data can be transferred to a knowledge-based system for test purposes. The simulator is currently being used for the development of the Anesthesia Expert Assist System AES-2. For this particular application, a model of heart function was integrated that enables the simulation of heart insufficiency. Simulation runs under various conditions are presented and discussed. The simulator was implemented on an ATARI ST personal computer. PMID:1779177

  16. Computer programs for the acquisition and analysis of eddy-current array probe data

    SciTech Connect

    Pate, J.R.; Dodd, C.V.

    1996-07-01

    The objective of the Improved Eddy-Current ISI (in-service inspection) for Steam Generators Tubing program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques for ISI of new, used, and repaired steam generator tubes; to improve defect detection, classification, and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheet, tube supports, copper and sludge deposits, even when defect types and other variables occur in combination; and to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report documents computer programs that were developed for acquisition of eddy-current data from specially designed 16-coil array probes. Complete code as well as instructions for use are provided.

  17. A computational model associating learning process, word attributes, and age of acquisition.

    PubMed

    Hidaka, Shohei

    2013-01-01

    We propose a new model-based approach linking word learning to the age of acquisition (AoA) of words; a new computational tool for understanding the relationships among word learning processes, psychological attributes, and word AoAs as measures of vocabulary growth. The computational model developed describes the distinct statistical relationships between three theoretical factors underpinning word learning and AoA distributions. Simply put, this model formulates how different learning processes, characterized by change in learning rate over time and/or by the number of exposures required to acquire a word, likely result in different AoA distributions depending on word type. We tested the model in three respects. The first analysis showed that the proposed model accounts for empirical AoA distributions better than a standard alternative. The second analysis demonstrated that the estimated learning parameters well predicted the psychological attributes, such as frequency and imageability, of words. The third analysis illustrated that the developmental trend predicted by our estimated learning parameters was consistent with relevant findings in the developmental literature on word learning in children. We further discuss the theoretical implications of our model-based approach. PMID:24223699
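
    To make the exposure-based part of the model concrete, the sketch below simulates ages of acquisition when a word must be heard a fixed number of times and exposures arrive as a constant-rate Poisson process. This is only an illustrative special case, since the paper's model also lets the learning rate change over time, and all parameter values are assumptions.

```python
import numpy as np

def simulate_aoa(n_children, exposures_needed, exposures_per_month, seed=0):
    """Simulate ages of acquisition (in months) when a word is acquired after a
    fixed number of exposures and exposures arrive as a Poisson process."""
    rng = np.random.default_rng(seed)
    # Waiting time for the k-th exposure is the sum of k exponential gaps.
    gaps = rng.exponential(1.0 / exposures_per_month,
                           size=(n_children, exposures_needed))
    return gaps.sum(axis=1)

# A harder word (more exposures needed) shifts and widens the AoA distribution.
easy = simulate_aoa(10_000, exposures_needed=10, exposures_per_month=2.0)
hard = simulate_aoa(10_000, exposures_needed=40, exposures_per_month=2.0)
print(easy.mean(), hard.mean())   # roughly 5 and 20 months
```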

  18. Motivation in a Computer-Supported Collaborative Learning Scenario and Its Impact on Learning Activities and Knowledge Acquisition

    ERIC Educational Resources Information Center

    Schoor, Cornelia; Bannert, Maria

    2011-01-01

    Addressing a drawback in current research on computer-supported collaborative learning (CSCL), this study investigated the influence of motivation on learning activities and knowledge acquisition during CSCL. Participants' (N = 200 university students) task was to develop a handout for which they had first an individual preparing phase followed by…

  19. Pulmonary Venous Anatomy Imaging with Low-Dose, Prospectively ECG-Triggered, High-Pitch 128-Slice Dual Source Computed Tomography

    PubMed Central

    Thai, Wai-ee; Wai, Bryan; Lin, Kaity; Cheng, Teresa; Heist, E. Kevin; Hoffmann, Udo; Singh, Jagmeet; Truong, Quynh A.

    2012-01-01

    Background Efforts to reduce radiation from cardiac computed tomography (CT) are essential. Using a prospectively triggered, high-pitch dual source CT (DSCT) protocol, we aim to determine the radiation dose and image quality (IQ) in patients undergoing pulmonary vein (PV) imaging. Methods and Results In 94 patients (61±9 years, 71% male) who underwent 128-slice DSCT (pitch 3.4), radiation dose and IQ were assessed and compared between 69 patients in sinus rhythm (SR) and 25 in atrial fibrillation (AF). Radiation dose was compared in a subset of 19 patients with prior retrospective or prospectively triggered CT PV scans without high-pitch. In a subset of 18 patients with prior magnetic resonance imaging (MRI) for PV assessment, PV anatomy and scan duration were compared to high-pitch CT. Using the high-pitch protocol, total effective radiation dose was 1.4 [1.3, 1.9] mSv, with no difference between SR and AF (1.4 vs 1.5 mSv, p=0.22). No high-pitch CT scans were non-diagnostic or had poor IQ. Radiation dose was reduced with high-pitch (1.6 mSv) compared to standard protocols (19.3 mSv, p<0.0001). This radiation dose reduction was seen with SR (1.5 vs 16.7 mSv, p<0.0001) but was more profound with AF (1.9 vs 27.7 mSv, p=0.039). There was excellent agreement of PV anatomy (kappa 0.84, p<0.0001), and a shorter CT scan duration (6 minutes) compared to MRI (41 minutes, p<0.0001). Conclusions Using a high-pitch DSCT protocol, PV imaging can be performed with minimal radiation dose, short scan acquisition, and excellent IQ in patients with SR or AF. This protocol highlights the success of new cardiac CT technology to minimize radiation exposure, giving clinicians a new low-dose imaging alternative to assess PV anatomy. PMID:22586259

  20. THE EFFECT OF ANGLE SLICE ACQUISITION ON COMPUTED TOMOGRAPHIC CERVICAL VERTEBRAL COLUMN MORPHOMETRY IN GREAT DANES.

    PubMed

    Jurkoshek, Amanda M; da Costa, Ronaldo C; Martin-Vaquero, Paula

    2015-01-01

    Computed tomography (CT) scans can be acquired with the transverse images aligned either parallel to the endplates or perpendicular to the vertebral canal. The purpose of this prospective cross-sectional study was to determine the effect of angle acquisition on CT morphometric evaluation of the cervical vertebral column of Great Danes with and without cervical spondylomyelopathy. Twenty-eight Great Danes (13 normal, 15 affected) were sampled. For each dog, a set of CT images was acquired with the transverse slices aligned parallel to the endplates and another one with the transverse images aligned perpendicular to the vertebral canal. For each different set, transverse slices from the cranial, middle, and caudal aspects of the individual vertebral bodies C2-C7 were measured. Height, width, transverse area, right dorsal to left ventral height (RDLV), and left dorsal to right ventral height (LDRV) were recorded by a single observer at each location. For both affected and control dogs, significant differences between the measurements obtained from the two sets of transverse images were found only at the cranial aspect of the vertebrae (P = 0.005, P < 0.001, P < 0.001, P = 0.005, and P = 0.010 for height, width, area, RDLV, and LDRV, respectively). Measurements for the middle and caudal aspects did not differ. The funnel-shape morphology of the cervical vertebral foramina in Great Danes with stenosis of their cranial aspect may be responsible for the significant differences found. Considering that the morphometric parameters were significantly affected by CT slice angle in the current study, authors recommend that a standardized scanning protocol be followed when morphometric evaluations using CT are planned. PMID:25872964

  1. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    NASA Astrophysics Data System (ADS)

    Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.

    2008-07-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a 360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.

  2. Data of NODDI diffusion metrics in the brain and computer simulation of hybrid diffusion imaging (HYDI) acquisition scheme

    PubMed Central

    Kodiweera, Chandana; Wu, Yu-Chien

    2016-01-01

    This article provides NODDI diffusion metrics in the brains of 52 healthy participants and computer simulation data to support the compatibility of the hybrid diffusion imaging (HYDI), “Hybrid diffusion imaging” [1], acquisition scheme with fitting the neurite orientation dispersion and density imaging (NODDI) model, “NODDI: practical in vivo neurite orientation dispersion and density imaging of the human brain” [2]. HYDI is an extremely versatile diffusion magnetic resonance imaging (dMRI) technique that enables various analysis methods using a single diffusion dataset. One of these analysis methods is the NODDI computation, which models the brain tissue with three compartments: fast isotropic diffusion (e.g., cerebrospinal fluid), anisotropic hindered diffusion (e.g., extracellular space), and anisotropic restricted diffusion (e.g., intracellular space). The NODDI model produces microstructural metrics in the developing brain, the aging brain, or the human brain with neurologic disorders. The first dataset provided here comprises the means and standard deviations of NODDI metrics in 48 white matter regions of interest (ROIs), averaged across the 52 healthy participants. The second dataset provided here is the computer simulation, with initial conditions guided by the first dataset as inputs and as the gold standard for model fitting. The computer simulation data provide a direct comparison of NODDI indices computed from the HYDI acquisition [1] to NODDI indices computed from the originally proposed acquisition [2]. These data are related to the accompanying research article “Age Effects and Sex Differences in Human Brain White Matter of Young to Middle-Aged Adults: A DTI, NODDI, and q-Space Study” [3]. PMID:27115027

  3. Data of NODDI diffusion metrics in the brain and computer simulation of hybrid diffusion imaging (HYDI) acquisition scheme.

    PubMed

    Kodiweera, Chandana; Wu, Yu-Chien

    2016-06-01

    This article provides NODDI diffusion metrics in the brains of 52 healthy participants and computer simulation data to support the compatibility of the hybrid diffusion imaging (HYDI), "Hybrid diffusion imaging" [1], acquisition scheme with fitting the neurite orientation dispersion and density imaging (NODDI) model, "NODDI: practical in vivo neurite orientation dispersion and density imaging of the human brain" [2]. HYDI is an extremely versatile diffusion magnetic resonance imaging (dMRI) technique that enables various analysis methods using a single diffusion dataset. One of these analysis methods is the NODDI computation, which models the brain tissue with three compartments: fast isotropic diffusion (e.g., cerebrospinal fluid), anisotropic hindered diffusion (e.g., extracellular space), and anisotropic restricted diffusion (e.g., intracellular space). The NODDI model produces microstructural metrics in the developing brain, the aging brain, or the human brain with neurologic disorders. The first dataset provided here comprises the means and standard deviations of NODDI metrics in 48 white matter regions of interest (ROIs), averaged across the 52 healthy participants. The second dataset provided here is the computer simulation, with initial conditions guided by the first dataset as inputs and as the gold standard for model fitting. The computer simulation data provide a direct comparison of NODDI indices computed from the HYDI acquisition [1] to NODDI indices computed from the originally proposed acquisition [2]. These data are related to the accompanying research article "Age Effects and Sex Differences in Human Brain White Matter of Young to Middle-Aged Adults: A DTI, NODDI, and q-Space Study" [3]. PMID:27115027

  4. Hyperspeed data acquisition for 3D computer vision metrology as applied to law enforcement

    NASA Astrophysics Data System (ADS)

    Altschuler, Bruce R.

    1997-02-01

    cycling at 1 millisecond, each pattern is projected and recorded in a cycle time of 1/500th second. An entire set of patterns can then be recorded within 1/60th second. This pattern set contains all the information necessary to calculate a 3-D map. The use of hyper-speed parallel video cameras in conjunction with high-speed modulators enables video-data-rate acquisition of all data necessary to calculate numerical digital 3-D metrological surface data. Thus a 3-D video camera can operate at the rate of a conventional 2-D video camera. The speed of actual 3-D output information is a function of the speed of the computer, a parallel processor being preferred for the task. With video-rate 3-D data acquisition, law enforcement could survey crime scenes, obtain evidence, watch and record people, packages, and suitcases, and record disaster scenes very rapidly.

  5. Application of Chang's attenuation correction technique for single-photon emission computed tomography partial angle acquisition of Jaszczak phantom

    PubMed Central

    Saha, Krishnendu; Hoyt, Sean C.; Murray, Bryon M.

    2016-01-01

    The acquisition and processing of the Jaszczak phantom is a test recommended by the American College of Radiology for evaluation of gamma camera system performance. To produce the reconstructed phantom image for quality evaluation, attenuation correction is applied. The attenuation of counts originating from the center of the phantom is greater than that originating from the periphery of the phantom, causing an artifactual appearance of inhomogeneity in the reconstructed image and complicating phantom evaluation. Chang's mathematical formulation is a common method of attenuation correction applied on most gamma cameras that does not require an external transmission source such as computed tomography, radionuclide sources installed within the gantry of the camera, or a flood source. Tomographic acquisition can be obtained in two different acquisition modes for a dual-detector gamma camera: one where the two detectors are in a 180° configuration and acquire projection images for a full 360°, and the other where the two detectors are positioned in a 90° configuration and acquire projections for only 180°. Though Chang's attenuation correction method has been used for 360° angle acquisition, its applicability for 180° angle acquisition remains a question, with one vendor's camera software producing artifacts in the images. This work investigates whether Chang's attenuation correction technique can be applied to both acquisition modes through the development of a Chang formulation-based algorithm that is applicable to both modes. Assessment of attenuation correction performance by phantom uniformity analysis illustrates improved uniformity with the proposed algorithm (22.6%) compared to the camera software (57.6%). PMID:27051167

  6. Application of Chang's attenuation correction technique for single-photon emission computed tomography partial angle acquisition of Jaszczak phantom.

    PubMed

    Saha, Krishnendu; Hoyt, Sean C; Murray, Bryon M

    2016-01-01

    The acquisition and processing of the Jaszczak phantom is a test recommended by the American College of Radiology for evaluation of gamma camera system performance. To produce the reconstructed phantom image for quality evaluation, attenuation correction is applied. The attenuation of counts originating from the center of the phantom is greater than that originating from the periphery of the phantom, causing an artifactual appearance of inhomogeneity in the reconstructed image and complicating phantom evaluation. Chang's mathematical formulation is a common method of attenuation correction applied on most gamma cameras that does not require an external transmission source such as computed tomography, radionuclide sources installed within the gantry of the camera, or a flood source. Tomographic acquisition can be obtained in two different acquisition modes for a dual-detector gamma camera: one where the two detectors are in a 180° configuration and acquire projection images for a full 360°, and the other where the two detectors are positioned in a 90° configuration and acquire projections for only 180°. Though Chang's attenuation correction method has been used for 360° angle acquisition, its applicability for 180° angle acquisition remains a question, with one vendor's camera software producing artifacts in the images. This work investigates whether Chang's attenuation correction technique can be applied to both acquisition modes through the development of a Chang formulation-based algorithm that is applicable to both modes. Assessment of attenuation correction performance by phantom uniformity analysis illustrates improved uniformity with the proposed algorithm (22.6%) compared to the camera software (57.6%). PMID:27051167
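
    Chang's first-order correction, referred to in both entries above, multiplies each reconstructed pixel by the reciprocal of its attenuation factor averaged over the projection angles. The sketch below implements that idea for a uniform circular phantom such as the Jaszczak; it is a generic illustration with assumed parameter names, not the vendor's software or the authors' modified algorithm. For a 180° partial-angle acquisition, the averaged angles would span π rather than 2π.

```python
import numpy as np

def chang_correction_map(shape, radius_px, mu_per_px, n_angles=64):
    """First-order Chang correction factors for a uniform circular phantom.

    shape      : (rows, cols) of the reconstructed slice
    radius_px  : phantom radius in pixels
    mu_per_px  : linear attenuation coefficient per pixel
    n_angles   : number of projection angles averaged over
    """
    rows, cols = shape
    y, x = np.mgrid[0:rows, 0:cols]
    x = x - (cols - 1) / 2.0
    y = y - (rows - 1) / 2.0
    inside = x**2 + y**2 < radius_px**2

    atten_sum = np.zeros(shape)
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        ux, uy = np.cos(theta), np.sin(theta)
        r_dot_u = x * ux + y * uy
        # Path length from each interior pixel to the circle boundary along (ux, uy)
        d = -r_dot_u + np.sqrt(np.maximum(radius_px**2 - (x**2 + y**2) + r_dot_u**2, 0.0))
        atten_sum += np.exp(-mu_per_px * d)

    corr = np.ones(shape)
    corr[inside] = n_angles / atten_sum[inside]   # reciprocal of mean attenuation factor
    return corr

# corrected = reconstructed_slice * chang_correction_map(reconstructed_slice.shape, 100, 0.015)
```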

  7. RALPH: An online computer program for acquisition and reduction of pulse height data

    NASA Technical Reports Server (NTRS)

    Davies, R. C.; Clark, R. S.; Keith, J. E.

    1973-01-01

    A background/foreground data acquisition and analysis system incorporating a high level control language was developed for acquiring both singles and dual parameter coincidence data from scintillation detectors at the Radiation Counting Laboratory at the NASA Manned Spacecraft Center in Houston, Texas. The system supports acquisition of gamma ray spectra in a 256 x 256 coincidence matrix (utilizing disk storage) and simultaneous operation of any of several background support and data analysis functions. In addition to special instruments and interfaces, the hardware consists of a PDP-9 with 24K core memory, 256K words of disk storage, and Dectape and Magtape bulk storage.

  8. The Relationship between Previous Training in Computer Science and the Acquisition of Word Processing Skills.

    ERIC Educational Resources Information Center

    Monahan, Brian D.

    1986-01-01

    This study investigated whether computer science educational background makes secondary students more adept at using word processing capabilities, and compared computer science and non-computer science students' writing improvement with word processing use. Computer science students used more sophisticated program features but student writing did…

  9. Evaluation of a modified Fitts law brain-computer interface target acquisition task in able and motor disabled individuals

    NASA Astrophysics Data System (ADS)

    Felton, E. A.; Radwin, R. G.; Wilson, J. A.; Williams, J. C.

    2009-10-01

    A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals.

  10. Evaluation of a modified Fitts law brain-computer interface target acquisition task in able and motor disabled individuals.

    PubMed

    Felton, E A; Radwin, R G; Wilson, J A; Williams, J C

    2009-10-01

    A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals. PMID:19700814
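
    Fitts' law underlies the throughput comparison described above: movement time grows with an index of difficulty determined by target distance and width, and the information transfer rate is that index divided by the acquisition time. The sketch below uses the common Shannon formulation; the study's modified formulation may differ in detail, so treat the exact expression as an assumption.

```python
import math

def index_of_difficulty(distance, width):
    """Index of difficulty in bits (Shannon formulation: log2(D/W + 1))."""
    return math.log2(distance / width + 1.0)

def information_transfer_rate(distance, width, movement_time_s):
    """Throughput in bits per second: index of difficulty over acquisition time."""
    return index_of_difficulty(distance, width) / movement_time_s

# Example: a cursor travelling 400 px to a 50 px wide target, acquired in 1.8 s
print(information_transfer_rate(400, 50, 1.8))   # ~1.76 bits/s
```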

  11. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  12. The Probabilistic Analysis of Language Acquisition: Theoretical, Computational, and Experimental Analysis

    ERIC Educational Resources Information Center

    Hsu, Anne S.; Chater, Nick; Vitanyi, Paul M. B.

    2011-01-01

    There is much debate over the degree to which language learning is governed by innate language-specific biases, or acquired through cognition-general principles. Here we examine the probabilistic language acquisition hypothesis on three levels: We outline a novel theoretical result showing that it is possible to learn the exact "generative model"…

  13. Synopsis of a computer program designed to interface a personal computer with the fast data acquisition system of a time-of-flight mass spectrometer

    NASA Technical Reports Server (NTRS)

    Bechtel, R. D.; Mateos, M. A.; Lincoln, K. A.

    1988-01-01

    Briefly described are the essential features of a computer program designed to interface a personal computer with the fast, digital data acquisition system of a time-of-flight mass spectrometer. The instrumentation was developed to provide a time-resolved analysis of individual vapor pulses produced by the incidence of a pulsed laser beam on an ablative material. The high repetition rate spectrometer coupled to a fast transient recorder captures complete mass spectra every 20 to 35 microsecs, thereby providing the time resolution needed for the study of this sort of transient event. The program enables the computer to record the large amount of data generated by the system in short time intervals, and it provides the operator the immediate option of presenting the spectral data in several different formats. Furthermore, the system does this with a high degree of automation, including the tasks of mass labeling the spectra and logging pertinent instrumental parameters.

  14. In-Depth Analysis of Computer Memory Acquisition Software for Forensic Purposes.

    PubMed

    McDown, Robert J; Varol, Cihan; Carvajal, Leonardo; Chen, Lei

    2016-01-01

    The comparison studies on random access memory (RAM) acquisition tools are either limited in metrics or the selected tools were designed to be executed in older operating systems. Therefore, this study evaluates seven widely used shareware or freeware/open source RAM acquisition forensic tools that are compatible with the latest 64-bit Windows operating systems. These tools' user interface capabilities, platform limitations, reporting capabilities, total execution time, shared and proprietary DLLs, modified registry keys, and invoked files during processing were compared. We observed that Windows Memory Reader and Belkasoft's Live Ram Capturer leave the fewest fingerprints in memory when loaded. On the other hand, ProDiscover and FTK Imager perform poorly in memory usage, processing time, DLL usage, and unwanted artifacts introduced to the system. While Belkasoft's Live Ram Capturer is the fastest to obtain an image of the memory, ProDiscover takes the longest time to do the same job. PMID:27405017

  15. Computer system requirements specification for 101-SY hydrogen mitigation test project data acquisition and control system (DACS-1)

    SciTech Connect

    McNeece, S.G.; Truitt, R.W.

    1994-10-12

    The system requirements specification for the SY-101 hydrogen mitigation test project (HMTP) data acquisition and control system (DACS-1) documents the system requirements for the DACS-1 project. The purpose of the DACS is to provide data acquisition and control capabilities for the hydrogen mitigation testing of Tank SY-101. Mitigation testing uses a pump immersed in the waste, directed at varying angles and operated at different speeds and time durations. Tank and supporting instrumentation is brought into the DACS to monitor the status of the tank and to provide information on the effectiveness of the mitigation test. Instrumentation is also provided for closed-loop control of the pump operation. The DACS is also capable of being expanded to control and monitor other mitigation testing. The intended audience for the computer system requirements specification includes the SY-101 hydrogen mitigation test data acquisition and control system designers: analysts, programmers, instrument engineers, operators, and maintainers. It is also intended for the data users: tank farm operations, mitigation test engineers, the Test Review Group (TRG), data management support staff, data analysts, Hanford data stewards, and external reviewers.

  16. A flexible high-rate USB2 data acquisition system for PET and SPECT imaging

    SciTech Connect

    J. Proffitt, W. Hammond, S. Majewski, V. Popov, R.R. Raylman, A.G. Weisenberger, R. Wojcik

    2006-02-01

    A new flexible data acquisition system has been developed to instrument gamma-ray imaging detectors designed by the Jefferson Lab Detector and Imaging Group. Hardware consists of 16-channel data acquisition modules installed on USB2 carrier boards. Carriers have been designed to accept one, two, and four modules. Application trigger rate and channel density determine the number of acquisition boards and readout computers used. Each channel has an independent trigger, gated integrator, and a 2.5 MHz 12-bit ADC. Each module has an FPGA for analog control and signal processing. Processing includes a 5 ns 40-bit trigger time stamp and programmable triggering, gating, ADC timing, offset and gain correction, charge and pulse-width discrimination, sparsification, event counting, and event assembly. The carrier manages global triggering and transfers module data to a USB buffer. High-granularity time-stamped triggering is suitable for modular detectors. Time-stamped events permit dynamic studies, complex offline event assembly, and high-rate distributed data acquisition. A sustained USB data rate of 20 Mbytes/s, a sustained trigger rate of 300 kHz for 32 channels, and a peak trigger rate of 2.5 MHz to FIFO memory were achieved. Different trigger, gating, processing, and event assembly techniques were explored. Target applications include >100 kHz coincidence rate PET detectors, dynamic SPECT detectors, and miniature and portable gamma detectors for small-animal and clinical use.
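
    The offline event assembly mentioned above reduces, at its core, to grouping time-stamped singles that fall within a coincidence window. Below is a minimal, generic Python sketch of that step, assuming hits carry the 40-bit time stamp in 5 ns ticks described in the abstract; the record layout and window value are illustrative, not the system's actual data format.

```python
from collections import namedtuple

# Hypothetical single-hit record: timestamp in 5 ns ticks, channel id, ADC value.
Hit = namedtuple("Hit", "timestamp channel adc")

def assemble_coincidences(hits, window_ticks=4):
    """Group time-stamped singles (sorted by timestamp) into coincidence events
    whose hits all lie within window_ticks of the first hit in the group."""
    events, current = [], []
    for hit in hits:
        if current and hit.timestamp - current[0].timestamp > window_ticks:
            if len(current) > 1:        # keep only true coincidences
                events.append(current)
            current = []
        current.append(hit)
    if len(current) > 1:
        events.append(current)
    return events

singles = [Hit(100, 0, 512), Hit(101, 7, 480), Hit(250, 3, 300), Hit(252, 9, 310)]
print(assemble_coincidences(singles))   # two 2-hit coincidence events
```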

  17. A Computational Study of the Factors Influencing the PVC-Triggering Ability of a Cluster of Early Afterdepolarization-Capable Myocytes

    PubMed Central

    Zimik, Soling; Nayak, Alok Ranjan; Pandit, Rahul

    2015-01-01

    Premature ventricular complexes (PVCs), which are abnormal impulse propagations in cardiac tissue, can develop because of various reasons including early afterdepolarizations (EADs). We show how a cluster of EAD-generating cells (EAD clump) can lead to PVCs in a model of cardiac tissue, and also investigate the factors that assist such clumps in triggering PVCs. In particular, we study, through computer simulations, the effects of the following factors on the PVC-triggering ability of an EAD clump: (1) the repolarization reserve (RR) of the EAD cells; (2) the size of the EAD clump; (3) the coupling strength between the EAD cells in the clump; and (4) the presence of fibroblasts in the EAD clump. We find that, although a low value of RR is necessary to generate EADs and hence PVCs, a very low value of RR leads to low-amplitude EAD oscillations that decay with time and do not lead to PVCs. We demonstrate that a certain threshold size of the EAD clump, or a reduction in the coupling strength between the EAD cells, in the clump, is required to trigger PVCs. We illustrate how randomly distributed inexcitable obstacles, which we use to model collagen deposits, affect PVC-triggering by an EAD clump. We show that the gap-junctional coupling of fibroblasts with myocytes can either assist or impede the PVC-triggering ability of an EAD clump, depending on the resting membrane potential of the fibroblasts and the coupling strength between the myocyte and fibroblasts. We also find that the triggering of PVCs by an EAD clump depends sensitively on factors like the pacing cycle length and the distribution pattern of the fibroblasts. PMID:26675670

  18. Calorimetry Triggering in ATLAS

    SciTech Connect

    Igonkina, O.; Achenbach, R.; Adragna, P.; Aharrouche, M.; Alexandre, G.; Andrei, V.; Anduaga, X.; Aracena, I.; Backlund, S.; Baines, J.; Barnett, B.M.; Bauss, B.; Bee, C.; Behera, P.; Bell, P.; Bendel, M.; Benslama, K.; Berry, T.; Bogaerts, A.; Bohm, C.; Bold, T.; et al.

    2011-12-08

    The ATLAS experiment is preparing for data taking at 14 TeV collision energy. A rich discovery physics program is being prepared in addition to the detailed study of Standard Model processes which will be produced in abundance. The ATLAS multi-level trigger system is designed to accept one event in 2×10^5 to enable the selection of rare and unusual physics events. The ATLAS calorimeter system is a precise instrument, which includes liquid argon electromagnetic and hadronic components as well as a scintillator-tile hadronic calorimeter. All these components are used in the various levels of the trigger system. A wide physics coverage is ensured by inclusively selecting events with candidate electrons, photons, taus, jets, or those with large missing transverse energy. The commissioning of the trigger system is being performed with cosmic ray events and by replaying simulated Monte Carlo events through the trigger and data acquisition system.

  19. A synchronization method for wireless acquisition systems, application to brain computer interfaces.

    PubMed

    Foerster, M; Bonnet, S; van Langhenhove, A; Porcherot, J; Charvet, G

    2013-01-01

    A synchronization method for wireless acquisition systems has been developed and implemented on a wireless ECoG recording implant and on a wireless EEG recording helmet. The presented algorithm and hardware implementation allow the precise synchronization of several data streams from several sensor nodes for applications where timing is critical like in event-related potential (ERP) studies. The proposed method has been successfully applied to obtain visual evoked potentials and compared with a reference biosignal amplifier. The control over the exact sampling frequency allows reducing synchronization errors that will otherwise accumulate during a recording. The method is scalable to several sensor nodes communicating with a shared base station. PMID:24109816
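
    The core of such a synchronization scheme is mapping each wireless node's local sample clock onto the base station's time base using shared synchronization events, so that clock drift does not accumulate over a recording. The sketch below is a generic illustration of that idea (a linear clock model fitted to sync-event timestamps, followed by resampling); it is not the implant's or helmet's actual protocol, and all names are assumptions.

```python
import numpy as np

def align_stream(samples, node_sync_times, ref_sync_times, fs_nominal):
    """Resample one node's uniformly sampled signal onto the base station clock.

    samples        : 1-D array of node samples
    node_sync_times: sync-event times on the node clock (s, relative to first sample)
    ref_sync_times : the same events on the base-station clock (s)
    fs_nominal     : nominal sampling rate of the node (Hz)
    """
    # Fit a linear clock model ref_time ~= a * node_time + b from the sync events.
    a, b = np.polyfit(node_sync_times, ref_sync_times, 1)
    t_node = np.arange(len(samples)) / fs_nominal      # sample times, node clock
    t_ref = a * t_node + b                             # mapped to reference clock
    t_uniform = np.arange(t_ref[0], t_ref[-1], 1.0 / fs_nominal)
    return t_uniform, np.interp(t_uniform, t_ref, samples)
```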

  20. HARD REAL TIME QUICK EXAFS DATA ACQUISITION WITH ALL OPEN SOURCE SOFTWARE ON A COMMODITY PERSONAL COMPUTER.

    SciTech Connect

    SO, I.; SIDDONS, D.P.; CALIEBE, W.A.; KHALID, S.

    2007-04-25

    We describe here the data acquisition subsystem of the Quick EXAFS (QEXAFS) experiment at the National Synchrotron Light Source of Brookhaven National Laboratory. For ease of future growth and flexibility, almost all software components are open source with very active maintainers. Among them are Linux running on an x86 desktop computer, RTAI for real-time response, the COMEDI driver for the data acquisition hardware, Qt and PyQt for the graphical user interface, PyQwt for plotting, and Python for scripting. The signal (A/D) and energy-reading (IK220 encoder) devices in the PCI computer are also EPICS enabled. The control system scans the monochromator energy through a networked EPICS motor. With the real-time kernel, the system is capable of a deterministic data-sampling period of tens of microseconds with typical timing jitter of several microseconds. At the same time, Linux runs other non-real-time processes handling the user interface. A modern Qt-based controls front end enhances productivity. The fast plotting and zooming of data in time or energy coordinates lets the experimenters verify the quality of the data before detailed analysis. Python scripting is built in for automation. The typical data rate for continuous runs is around ten megabytes per minute.

  1. A Human-Computer Partnership: The Tutor/Child/Computer Triangle Promoting the Acquisition of Early Literacy Skills

    ERIC Educational Resources Information Center

    Schmid, Richard F.; Miodrag, Nancy; Di Francesco, Nathalie

    2008-01-01

    This study involved the analysis of the complex interactions that take place between tutors and preschool children using a computer during early literacy tutoring sessions. Eight five-year-old pre- and early-readers attending a childcare centre participated in daily 20-minute tutoring sessions for two weeks. The literacy software (a beta version)…

  2. Neurolinguistics and psycholinguistics as a basis for computer acquisition of natural language

    SciTech Connect

    Powers, D.M.W.

    1983-04-01

    Research into natural language understanding systems for computers has concentrated on implementing particular grammars and grammatical models of the language concerned. This paper presents a rationale for research into natural language understanding systems based on neurological and psychological principles. Important features of the approach are that it seeks to place the onus of learning the language on the computer, and that it seeks to make use of the vast wealth of relevant psycholinguistic and neurolinguistic theory. 22 references.

  3. TOTEM Trigger System Firmware

    NASA Astrophysics Data System (ADS)

    Kopal, Josef

    2014-06-01

    This paper describes the TOTEM Trigger System Firmware, which has been operational at the LHC since 2009. The TOTEM experiment is devoted to forward hadronic physics at collision energies from 2.7 to 14 TeV. It is composed of three different subdetectors placed at 9, 13.5, and 220 m from Interaction Point 5. Time-critical logic firmware is implemented in FPGA circuits to review collisions and to select the relevant ones to be stored by the data acquisition (DAQ) system. The trigger system was modified for the 2012-2013 LHC runs, allowing the experiment to take data in cooperation with CMS.

  4. Experiments with a low-cost system for computer graphics material model acquisition

    NASA Astrophysics Data System (ADS)

    Rushmeier, Holly; Lockerman, Yitzhak; Cartwright, Luke; Pitera, David

    2015-03-01

    We consider the design of an inexpensive system for acquiring material models for computer graphics rendering applications in animation, games and conceptual design. To be useful in these applications a system must be able to model a rich range of appearances in a computationally tractable form. The range of appearance of interest in computer graphics includes materials that have spatially varying properties, directionality, small-scale geometric structure, and subsurface scattering. To be computationally tractable, material models for graphics must be compact, editable, and efficient to numerically evaluate for ray tracing importance sampling. To construct appropriate models for a range of interesting materials, we take the approach of separating out directly and indirectly scattered light using high spatial frequency patterns introduced by Nayar et al. in 2006. To acquire the data at low cost, we use a set of Raspberry Pi computers and cameras clamped to miniature projectors. We explore techniques to separate out surface and subsurface indirect lighting. This separation would allow the fitting of simple, and so tractable, analytical models to features of the appearance model. The goal of the system is to provide models for physically accurate renderings that are visually equivalent to viewing the original physical materials.
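
    The separation of directly and indirectly scattered light with high-spatial-frequency patterns (Nayar et al., 2006) has a very compact per-pixel form: with a pattern that lights roughly half the scene, the direct component is approximately the per-pixel maximum minus the minimum over the pattern stack, and the global component is twice the minimum. Below is a minimal NumPy sketch of that step, assuming a radiometrically linear image stack; it illustrates the published technique, not the authors' acquisition code.

```python
import numpy as np

def separate_direct_global(images):
    """Per-pixel direct/global separation from images captured under shifted
    high-frequency patterns with ~50% of the scene lit (after Nayar et al. 2006).

    images : array of shape (n_patterns, H, W), radiometrically linear
    """
    stack = np.asarray(images, dtype=np.float64)
    l_max = stack.max(axis=0)
    l_min = stack.min(axis=0)
    direct = l_max - l_min        # only directly lit pixels swing with the pattern
    global_ = 2.0 * l_min         # factor 2: half the sources are on in any pattern
    return direct, global_
```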

  5. 3D object optonumerical acquisition methods for CAD/CAM and computer graphics systems

    NASA Astrophysics Data System (ADS)

    Sitnik, Robert; Kujawinska, Malgorzata; Pawlowski, Michal E.; Woznicki, Jerzy M.

    1999-08-01

    The creation of a virtual object for CAD/CAM and computer graphics on the basis of data gathered by full-field optical measurement of a 3D object is presented. The experimental coordinates are obtained either by a combined fringe projection/photogrammetry based system or by a fringe projection/virtual markers setup. A new and fully automatic procedure which processes the cloud of measured points into a triangular mesh accepted by CAD/CAM and computer graphics systems is presented. Its applicability to various classes of objects is tested, including error analysis of the generated virtual objects. The usefulness of the method is proved by applying the virtual object in a rapid prototyping system and in a computer graphics environment.
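
    The conversion from a measured point cloud to a triangular mesh can be illustrated, for a single-view, height-field-like surface, by a 2-D Delaunay triangulation of the (x, y) coordinates. The paper's own automatic meshing procedure is not given here, so the sketch below is only a generic stand-in using SciPy, with assumed array names.

```python
import numpy as np
from scipy.spatial import Delaunay

def cloud_to_mesh(points_xyz):
    """Triangulate a measured point cloud (N x 3 array) into a mesh by Delaunay
    triangulation of its xy projection; adequate when each (x, y) has one z."""
    pts = np.asarray(points_xyz, dtype=np.float64)
    tri = Delaunay(pts[:, :2])          # connectivity from the xy projection
    return pts, tri.simplices           # vertices and triangle index list

# Example: a small synthetic patch of measured points
xy = np.random.default_rng(1).uniform(0, 10, size=(200, 2))
z = np.sin(xy[:, 0]) * np.cos(xy[:, 1])
vertices, faces = cloud_to_mesh(np.column_stack([xy, z]))
print(faces.shape)   # (n_triangles, 3)
```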

  6. Dynamic gas temperature measurements using a personal computer for data acquisition and reduction

    NASA Technical Reports Server (NTRS)

    Fralick, Gustave C.; Oberle, Lawrence G.; Greer, Lawrence C., III

    1993-01-01

    This report describes a dynamic gas temperature measurement system. It has frequency response to 1000 Hz, and can be used to measure temperatures in hot, high pressure, high velocity flows. A personal computer is used for collecting and processing data, which results in a much shorter wait for results than previously. The data collection process and the user interface are described in detail. The changes made in transporting the software from a mainframe to a personal computer are described in appendices, as is the overall theory of operation.

  7. Technical drilling data acquisition and processing with an integrated computer system

    SciTech Connect

    Chevallier, J.J.; Quetier, F.P.; Marshall, D.W.

    1986-04-01

    Sedco Forex has developed an integrated computer system to enhance the technical performance of the company at various operational levels and to increase the understanding and knowledge of the drill crews. This paper describes the system and how it is used for recording and processing drilling data at the rig site, for associated technical analyses, and for well design, planning, and drilling performance studies at the operational centers. Some capabilities related to the statistical analysis of the company's operational records are also described, and future development of rig computing systems for drilling applications and management tasks is discussed.

  8. The Relationship between Second Language Acquisition Theory and Computer-Assisted Language Learning

    ERIC Educational Resources Information Center

    Chapelle, Carol A.

    2009-01-01

    The point of departure for this article is the contrast between the theoretical landscape within view of language teaching professionals in 1991 and that of today. I argue that the pragmatic goal of computer-assisted language learning (CALL) developers and researchers to create and evaluate learning opportunities pushes them to consider a variety…

  9. Efficacy of a Computer-Based Program on Acquisition of Reading Skills of Incarcerated Youth

    ERIC Educational Resources Information Center

    Shippen, Margaret E.; Morton, Rhonda Collins; Flynt, Samuel W.; Houchins, David E.; Smitherman, Tracy

    2012-01-01

    Despite the importance of literacy skill training for incarcerated youth, a very limited number of empirically based research studies have examined reading instruction in correctional facilities. The purpose of this study was to determine whether the Fast ForWord computer-assisted reading program improved the reading and spelling abilities of…

  10. Benefits of Computer-Assisted Instruction to Support Reading Acquisition in English Language Learners

    ERIC Educational Resources Information Center

    Macaruso, Paul; Rodman, Alyson

    2011-01-01

    Young children who are English language learners (ELLs) face major challenges in learning to read English. This study examined whether computer-assisted instruction (CAI) can be beneficial to ELL kindergartners enrolled in bilingual classes. The CAI programs provided systematic and structured exercises in developing phonological awareness and…

  11. Task-Based Oral Computer-Mediated Communication and L2 Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Yanguas, Inigo

    2012-01-01

    The present study adds to the computer-mediated communication (CMC) literature by exploring oral learner-to-learner interaction using Skype, a free and widely used Internet software program. In particular, this task-based study has a two-fold goal. Firstly, it explores possible differences between two modes of oral CMC (audio and video) and…

  12. Prompting Conceptual Understanding with Computer-Mediated Peer Discourse and Knowledge Acquisition Techniques

    ERIC Educational Resources Information Center

    Liu, Chen-Chung; Lee, Jia-Hsung

    2005-01-01

    Numerous computer-mediated communication (CMC) tools have been designed to facilitate group discourses on the Web. However, previous studies noted that participants did not value online conferencing as a method for conducting in-depth discussions, and instead considered this method as merely scratching the surface of the issues involved.…

  13. Exploring the Relationship between Emotions and the Acquisition of Computer Knowledge

    ERIC Educational Resources Information Center

    Kay, Robin H.

    2008-01-01

    Most computer users have to deal with major software upgrades every 6-18 months. Given the pressure of having to adjust so quickly and so often, it is reasonable to assume that users will express emotional reactions such as anger, desperation, anxiety, or relief during the learning process. To date, the primary emotion studied with respect to…

  14. Computer- and Video-Based Instruction of Food-Preparation Skills: Acquisition, Generalization, and Maintenance

    ERIC Educational Resources Information Center

    Ayres, Kevin; Cihak, David

    2010-01-01

    The purpose of this study was to evaluate the effects of a computer-based video instruction (CBVI) program to teach life skills. Three middle school-aged students with intellectual disabilities were taught how to make a sandwich, use a microwave, and set the table with a CBVI software package. A multiple probe across behaviors design was used to…

  15. Acquisition of Basic Computer Programming Concepts by Children. Technical Report No. 14.

    ERIC Educational Resources Information Center

    Nachmias, Rafi; And Others

    The process through which the basic principles and concepts of computer programming language are acquired by children was investigated via the development and testing of a teaching unit for fourth and sixth grade students. This unit had three components: (1) basic programming without variables; (2) basic programming with variables; and (3)…

  16. Inquiry Based-Computational Experiment, Acquisition of Threshold Concepts and Argumentation in Science and Mathematics Education

    ERIC Educational Resources Information Center

    Psycharis, Sarantos

    2016-01-01

    Computational experiment approach considers models as the fundamental instructional units of Inquiry Based Science and Mathematics Education (IBSE) and STEM Education, where the model take the place of the "classical" experimental set-up and simulation replaces the experiment. Argumentation in IBSE and STEM education is related to the…

  17. Acquisition and Generalization of Chained Tasks Taught with Computer Based Video Instruction to Children with Autism

    ERIC Educational Resources Information Center

    Ayres, Kevin M.; Maguire, Amy; McClimon, Desiree

    2009-01-01

    Three elementary aged students with autism participated in an evaluation of computer based video instruction that targeted functional life skills. The effects of the software were analyzed in the context of a multiple probe design across and replicated across participants. This study represents a departure from more traditional video based…

  18. Reflections on the Use of Computers in Second-Language Acquisition.

    ERIC Educational Resources Information Center

    Marty, Fernand

    1981-01-01

    Conditions under which using computers can help improve the study of foreign languages are discussed. Attention is limited to a consideration of a language course that aims at giving students a high level of accuracy in listening comprehension, oral expression, reading comprehension, and written expression. The following questions are addressed:…

  19. CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak

    SciTech Connect

    VanderLaan, J.F.; Cummings, J.W.

    1993-10-01

    The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPUs. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software will be chronicled, including observation of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and computer I/O speeds are expected to also increase data rates.

  20. CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak

    NASA Astrophysics Data System (ADS)

    Vanderlaan, J. F.; Cummings, J. W.

    1993-10-01

    The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPUs. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software will be chronicled, including observation of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and computer I/O speeds are expected to also increase data rates.

  1. Computer-Assisted Vocabulary Acquisition: The CSLU Vocabulary Tutor in Oral-Deaf Education.

    PubMed

    Barker, Lecia J.

    2003-01-01

    Deficits in vocabulary have a negative impact on literacy and interpersonal interaction for deaf children. As part of an evaluation, an outcomes assessment was conducted to determine the effectiveness of a computer-based vocabulary tutor in an elementary auditory/oral program. Participants were 19 children, 16 profoundly deaf and 3 hearing. The vocabulary tutor displays line drawings or photographs of the words to be learned while a computer-generated avatar of a "talking head" provides synthesized audiovisual speech driven from text. The computer system also generates printed words corresponding to the imaged items. Through audiovisual reception, children memorized up to 218 new words for everyday household items. After 4 weeks, their receptive vocabulary was tested, using the avatar to speak the name of each item. Most of the students retained more than half of the new words. The freely available vocabulary tutor, whose characteristics can be tailored to individual need, can provide a language-intensive, independent learning environment to supplement classroom teaching in content areas. PMID:15448067

  2. Multispectral data acquisition and classification - Computer modeling for smart sensor design

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Davis, R. E.; Huck, F. O.; Arduini, R. F.

    1980-01-01

    In this paper a model of the processes involved in multispectral remote sensing and data classification is developed as a tool for designing and evaluating smart sensors. The model has both stochastic and deterministic elements and accounts for solar radiation, atmospheric radiative transfer, surface reflectance, sensor spectral responses, and classification algorithms. Preliminary results are presented which indicate the validity and usefulness of this approach. Future capabilities of smart sensors will ultimately be limited by the accuracy with which multispectral remote sensing processes and their error sources can be computationally modeled.

  3. Respirator triggering of electron beam computed tomography (EBCT): evaluation of dynamic changes during mechanical expiration in the traumatized patient

    NASA Astrophysics Data System (ADS)

    Recheis, Wolfgang A.; Kleinsasser, Axel; Hatschenberger, Robert; Knapp, Rudolf; zur Nedden, Dieter; Hoermann, Christoph

    1999-05-01

    The purpose of this project is to evaluate the dynamic changes during expiration at different levels of positive end-expiratory pressure (PEEP) in the ventilated patient. We wanted to discriminate between normal lung function and acute respiratory distress syndrome (ARDS). After approval by the local Ethics Committee we studied two ventilated patients: (1) one with normal lung function; (2) one with ARDS. We used the 50 ms scan mode of the EBCT. The beam was positioned 1 cm above the diaphragm. The table position remained unchanged. An electronic trigger was developed that utilizes the respirator's synchronizing signal to start the EBCT at the onset of expiration. During controlled mechanical expiration at two levels of PEEP (0 and 15 cm H2O), pulmonary aeration was rated as: well-aerated (-900 HU to -500 HU), poorly-aerated (-500 HU to -100 HU) and non-aerated (-100 HU to +100 HU). Pathological and normal lung function showed different dynamic changes (Figs. 4-12). The different PEEP levels resulted in a significant change of pulmonary aeration in the same patient. Although we studied only a very limited number of patients, respirator-triggered EBCT may be accurate in discriminating pathological changes due to abnormal lung function in the mechanically ventilated patient.

  4. Data acquisition and control of a Raman spectrometer using a DEC PDP 11/34 computer

    SciTech Connect

    Armstrong, D.P.; Fletcher, W.H.; Trimble, D.S.

    1987-06-01

    The Raman spectrometer system, located at Building K-1004L, which is operated by the members of the Process Chemistry section of the Materials and Chemistry Technology Department, has recently been extensively overhauled and upgraded. A significant portion of the efforts involved in the upgrading was in the conversion to a DEC PDP 11/34 mini-computer. The necessary changes and improvements made to the laser, the optical path, the monochromator and the signal collection unit are described. The primary objective of this report is to describe the actual interfacing of the spectrometer and its interface unit to the DEC computer. In addition, the seven operating routines which were adapted for or written especially for this system are described and examples of the system's performance and flexibility are included. The resulting spectrometer system has a markedly improved performance and reliability and allows the Process Chemistry section to readily examine virtually any class of samples which may be analyzed at room temperature with argon ion laser emission lines.

  5. DQE(f) of four generations of computed radiography acquisition devices.

    PubMed

    Dobbins, J T; Ergun, D L; Rutz, L; Hinshaw, D A; Blume, H; Clark, D C

    1995-10-01

    Measurements were made of the MTF(f), NPS(f), and DQE(f) of four generations of computed radiography (CR) imaging plates and three generations of CR image readers. The MTF generally showed only a minor change between generations of plates and readers, but the DQE(f) has improved substantially from a very early plate/reader combination to a more recent one. The DQE in the more recent plate/reader combination is 1.3X greater at low frequencies and about 3X greater at high frequencies than the much earlier versions. Thus there has been substantial improvement in the imaging performance obtainable with CR since some of the early observer studies which indicated poorer performance with CR than with screen-film. PMID:8551982
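
    For context, the detective quantum efficiency reported above is conventionally computed from the measured MTF(f) and NPS(f); a standard form of this relation, stated here only as background and not taken from the article, is

        DQE(f) = \frac{\bar{S}^{\,2}\,\mathrm{MTF}^{2}(f)}{q\,\mathrm{NPS}(f)}

    where \bar{S} is the mean large-area (linearized) signal, q is the incident x-ray fluence in quanta per unit area, and NPS(f) is the noise power spectrum measured in the same signal units.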

  6. Image analysis in modern ophthalmology: from acquisition to computer assisted diagnosis and telemedicine

    NASA Astrophysics Data System (ADS)

    Marrugo, Andrés G.; Millán, María S.; Cristóbal, Gabriel; Gabarda, Salvador; Sorel, Michal; Sroubek, Filip

    2012-06-01

    Medical digital imaging has become a key element of modern health care procedures. It provides visual documentation and a permanent record for patients and, most importantly, the ability to extract information about many diseases. Modern ophthalmology thrives and develops on the advances in digital imaging and computing power. In this work we present an overview of recent image processing techniques proposed by the authors in the area of digital eye fundus photography. Our applications range from retinal image quality assessment to image restoration via blind deconvolution and visualization of structural changes in time between patient visits. All proposed within a framework for improving and assisting the medical practice and the forthcoming scenario of the information chain in telemedicine.

  7. Examining the acquisition of phonological word-forms with computational experiments

    PubMed Central

    Vitevitch, Michael S.; Storkel, Holly L.

    2012-01-01

    It has been hypothesized that known words in the lexicon strengthen newly formed representations of novel words, resulting in words with dense neighborhoods being learned more quickly than words with sparse neighborhoods. Tests of this hypothesis in a connectionist network showed that words with dense neighborhoods were learned better than words with sparse neighborhoods when the network was exposed to the words all at once (Experiment 1), or gradually over time, like human word-learners (Experiment 2). This pattern was also observed despite variation in the availability of processing resources in the networks (Experiment 3). A learning advantage for words with sparse neighborhoods was observed only when the network was initially exposed to words with sparse neighborhoods and exposed to dense neighborhoods later in training (Experiment 4). The benefits of computational experiments for increasing our understanding of language processes and for the treatment of language processing disorders are discussed. PMID:24597275

  8. Diagnostic Phase of Calcium Scoring Scan Applied as the Center of Acquisition Window of Coronary Computed Tomography Angiography Improves Image Quality in Minimal Acquisition Window Scan (Target CTA Mode) Using the Second Generation 320-Row CT

    PubMed Central

    Maeda, Eriko; Kanno, Shigeaki; Ino, Kenji; Tomizawa, Nobuo; Akahane, Masaaki; Torigoe, Rumiko; Ohtomo, Kuni

    2016-01-01

    Objective. To compare the image quality of coronary computed tomography angiography (CCTA) acquired under two conditions: 75% fixed as the acquisition window center (Group 75%) and the diagnostic phase for calcium scoring scan as the center (CS; Group CS). Methods. 320-row cardiac CT with a minimal acquisition window (scanned using “Target CTA” mode) was performed on 81 patients. In Group 75% (n = 40), CS was obtained and reconstructed at 75% and the center of the CCTA acquisition window was set at 75%. In Group CS (n = 41), CS was obtained at 75% and the diagnostic phase showing minimal artifacts was applied as the center of the CCTA acquisition window. Image quality was evaluated using a four-point scale (4-excellent) and the mean scores were compared between groups. Results. The CCTA scan diagnostic phase occurred significantly earlier in CS (75.7 ± 3.2% vs. 73.6 ± 4.5% for Groups 75% and CS, resp., p = 0.013). The mean Group CS image quality score (3.58 ± 0.63) was also higher than that for Group 75% (3.19 ± 0.66, p < 0.0001). Conclusions. The image quality of CCTA in Target CTA mode was significantly better when the center of acquisition window is adjusted using CS. PMID:26977449

  9. Operation of the Upgraded ATLAS Level-1 Central Trigger System

    NASA Astrophysics Data System (ADS)

    Glatzer, Julian

    2015-12-01

    The ATLAS Level-1 Central Trigger (L1CT) system is a central part of ATLAS data-taking and has undergone a major upgrade for Run 2 of the LHC, in order to cope with the expected increase of instantaneous luminosity of a factor of two with respect to Run 1. The upgraded hardware offers more flexibility in the trigger decisions due to the factor of two increase in the number of trigger inputs and usable trigger channels. It also provides an interface to the new topological trigger system. Operationally - particularly useful for commissioning, calibration and test runs - it allows concurrent running of up to three different subdetector combinations. An overview of the operational software framework of the L1CT system with particular emphasis on the configuration, controls and monitoring aspects is given. The software framework allows a consistent configuration with respect to the ATLAS experiment and the LHC machine, upstream and downstream trigger processors, and the data acquisition system. Trigger and dead-time rates are monitored coherently at all stages of processing and are logged by the online computing system for physics analysis, data quality assurance and operational debugging. In addition, the synchronisation of trigger inputs is watched based on bunch-by-bunch trigger information. Several software tools allow for efficient display of the relevant information in the control room in a way useful for shifters and experts. The design of the framework aims at reliability, flexibility, and robustness of the system and takes into account the operational experience gained during Run 1. The Level-1 Central Trigger was successfully operated with high efficiency during the cosmic-ray, beam-splash and first Run 2 data taking with the full ATLAS detector.

  10. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    NASA Technical Reports Server (NTRS)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.

  11. The Effect of Emphasizing Mathematical Structure in the Acquisition of Whole Number Computation Skills (Addition and Subtraction) By Seven- and Eight-Year Olds: A Clinical Investigation.

    ERIC Educational Resources Information Center

    Uprichard, A. Edward; Collura, Carolyn

    This investigation sought to determine the effect of emphasizing mathematical structure in the acquisition of computational skills by seven- and eight-year-olds. The meaningful development-of-structure approach emphasized closure, commutativity, associativity, and the identity element of addition; the inverse relationship between addition and…

  12. Diagnostic Value of Prospective Electrocardiogram-triggered Dual-source Computed Tomography Angiography for Infants and Children with Interrupted Aortic Arch

    PubMed Central

    Li, Hai-Ou; Wang, Xi-Ming; Nie, Pei; Ji, Xiao-Peng; Cheng, Zhao-Ping; Chen, Jiu-Hong; Xu, Zhuo-Dong

    2015-01-01

    Background: Accurate assessment of intra- as well as extra-cardiac malformations and radiation dosage concerns are especially crucial to infants and children with interrupted aortic arch (IAA). The purpose of this study is to investigate the value of prospective electrocardiogram (ECG)-triggered dual-source computed tomography (DSCT) angiography with low-dosage techniques in the diagnosis of IAA. Methods: Thirteen patients with suspected IAA underwent prospective ECG-triggered DSCT scan and transthoracic echocardiography (TTE). Surgery was performed on all the patients. A five-point scale was used to assess image quality. The diagnostic accuracy of DSCT angiography and TTE was compared with the surgical findings as the reference standard. A nonparametric Chi-square test was used for comparative analysis. P <0.05 was considered as a significant difference. The mean effective radiation dose (ED) was calculated. Results: Diagnostic DSCT images were obtained for all the patients. Thirteen IAA cases with 60 separate cardiovascular anomalies were confirmed by surgical findings. The diagnostic accuracy of TTE and DSCT for total cardiovascular malformations was 93.7% and 97.9% (P > 0.05), and that for extra-cardiac vascular malformations was 92.3% and 99.0% (P < 0.05), respectively. The mean score of image quality was 3.77 ± 0.83. The mean ED was 0.30 ± 0.04 mSv (range from 0.23 mSv to 0.39 mSv). Conclusions: In infants and children with IAA, prospective ECG-triggered DSCT with low radiation exposure and high diagnostic efficiency has higher accuracy compared to TTE in detection of extra-cardiac vascular anomalies. PMID:25947401

  13. Triggering Klystrons

    SciTech Connect

    Stefan, Kelton D.; /Purdue U. /SLAC

    2010-08-25

    To determine if klystrons will perform to the specifications of the LCLS (Linac Coherent Light Source) project, a new digital trigger controller is needed for the Klystron/Microwave Department Test Laboratory. The controller needed to be programmed and Windows based user interface software needed to be written to interface with the device over a USB (Universal Serial Bus). Programming the device consisted of writing logic in VHDL (VHSIC (Very High Speed Integrated Circuits) hardware description language), and the Windows interface software was written in C++. Xilinx ISE (Integrated Software Environment) was used to compile the VHDL code and program the device, and Microsoft Visual Studio 2005 was used to compile the C++ based Windows software. The device was programmed in such a way as to easily allow read/write operations to it using a simple addressing model, and Windows software was developed to interface with the device over a USB connection. A method of setting configuration registers in the trigger device is absolutely necessary to the development of a new triggering system, and the method developed will fulfill this need adequately. More work is needed before the new trigger system is ready for use. The configuration registers in the device need to be fully integrated with the logic that will generate the RF signals, and this system will need to be tested extensively to determine if it meets the requirements for low noise trigger outputs.

  14. The CMS high level trigger

    NASA Astrophysics Data System (ADS)

    Gori, Valentina

    2014-05-01

    The CMS experiment has been designed with a 2-level trigger system: the Level 1 Trigger, implemented on custom-designed electronics, and the High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms running on the available computing power, the sustainable output rate, and the selection efficiency. Here we will present the performance of the main triggers used during the 2012 data taking, ranging from simpler single-object selections to more complex algorithms combining different objects, and applying analysis-level reconstruction and selection. We will discuss the optimisation of the triggers and the specific techniques to cope with the increasing LHC pile-up, reducing its impact on the physics performance.

  15. The CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Trocino, Daniele

    2014-06-01

    The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger, implemented in custom-designed electronics, and the High-Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms running with the available computing power, the sustainable output rate, and the selection efficiency. We present the performance of the main triggers used during the 2012 data taking, ranging from simple single-object selections to more complex algorithms combining different objects, and applying analysis-level reconstruction and selection. We discuss the optimisation of the trigger and the specific techniques to cope with the increasing LHC pile-up, reducing its impact on the physics performance.

  16. Computational Evidence that Frequency Trajectory Theory Does Not Oppose but Emerges from Age-of-Acquisition Theory

    ERIC Educational Resources Information Center

    Mermillod, Martial; Bonin, Patrick; Meot, Alain; Ferrand, Ludovic; Paindavoine, Michel

    2012-01-01

    According to the age-of-acquisition hypothesis, words acquired early in life are processed faster and more accurately than words acquired later. Connectionist models have begun to explore the influence of the age/order of acquisition of items (and also their frequency of encounter). This study attempts to reconcile two different methodological and…

  17. Increasing session-to-session transfer in a brain-computer interface with on-site background noise acquisition

    NASA Astrophysics Data System (ADS)

    Cho, Hohyun; Ahn, Minkyu; Kim, Kiwoong; Jun, Sung Chan

    2015-12-01

    Objective. A brain-computer interface (BCI) usually requires a time-consuming training phase during which data are collected and used to generate a classifier. Because brain signals vary dynamically over time (and even over sessions), this training phase may be necessary each time the BCI system is used, which is impractical. However, the variability in background noise, which is less dependent on a control signal, may dominate the dynamics of brain signals. Therefore, we hypothesized that an understanding of variations in background noise may allow existing data to be reused by incorporating the noise characteristics into the feature extraction framework; in this way, new session data are not required each time and this increases the feasibility of the BCI systems. Approach. In this work, we collected background noise during a single, brief on-site acquisition session (approximately 3 min) immediately before a new session, and we found that variations in background noise were predictable to some extent. Then we implemented this simple session-to-session transfer strategy with a regularized spatiotemporal filter (RSTF), and we tested it with a total of 20 cross-session datasets collected over multiple days from 12 subjects. We also proposed and tested a bias correction (BC) in the RSTF. Main results. We found that our proposed session-to-session strategies yielded slightly lower or comparable performance relative to the conventional paradigm (in which each session requires its own training phase with an on-site training dataset). Furthermore, using an RSTF only and an RSTF with a BC outperformed existing approaches in session-to-session transfers. Significance. We inferred from our results that, with an on-site background noise suppression feature extractor and pre-existing training data, further training time may be unnecessary.

  18. Cone-Beam Computed Tomography: Imaging Dose during CBCT Scan Acquisition and Accuracy of CBCT Based Dose Calculations

    NASA Astrophysics Data System (ADS)

    Giles, David Matthew

    Cone beam computed tomography (CBCT) is a recent development in radiotherapy for use in image guidance. Image guided radiotherapy using CBCT allows visualization of soft tissue targets and critical structures prior to treatment. Dose escalation is made possible by accurately localizing the target volume while reducing normal tissue toxicity. The kilovoltage x-rays of the cone beam imaging system contribute additional dose to the patient. In this study a 2D reference radiochromic film dosimetry method employing GAFCHROMIC(TM) model XR-QA film is used to measure point skin doses and dose profiles from the Elekta XVI CBCT system integrated onto the Synergy linac. The soft tissue contrast of the daily CBCT images makes adaptive radiotherapy possible in the clinic. In order to track dose to the patient or utilize on-line replanning for adaptive radiotherapy the CBCT images must be used to calculate dose. A Hounsfield unit calibration method for scatter correction is investigated for heterogeneity corrected dose calculation in CBCT images. Three Hounsfield unit to density calibration tables are used for each of four cases including patients and an anthropomorphic phantom, and the calculated dose from each is compared to results from the clinical standard fan beam CT. The dose from the scan acquisition is reported and the effect of scan geometry and total output of the x-ray tube on dose magnitude and distribution is shown. The ability to calculate dose with CBCT is shown to improve with the use of patient specific density tables for scatter correction, and for high beam energies the calculated dose agreement is within 1%.

  19. CEBAF Distributed Data Acquisition System

    SciTech Connect

    Trent Allison; Thomas Powers

    2005-05-01

    There are thousands of signals distributed throughout Jefferson Lab's Continuous Electron Beam Accelerator Facility (CEBAF) that are useful for troubleshooting and identifying instabilities. Many of these signals are only available locally or monitored by systems with small bandwidths that cannot identify fast transients. The Distributed Data Acquisition (Dist DAQ) system will sample and record these signals simultaneously at rates up to 40 Msps. Its primary function will be to provide waveform records from signals throughout CEBAF to the Experimental Physics and Industrial Control System (EPICS). The waveforms will be collected after the occurrence of an event trigger. These triggers will be derived from signals such as periodic timers or accelerator faults. The waveform data can then be processed to quickly identify beam transport issues, thus reducing down time and increasing CEBAF performance. The Dist DAQ system will be comprised of multiple standalone chassis distributed throughout CEBAF. They will be interconnected via a fiber optic network to facilitate the global triggering of events. All of the chassis will also be connected directly to the CEBAF Ethernet and run EPICS locally. This allows for more flexibility than the typical configuration of a single board computer and other custom printed circuit boards (PCB) installed in a card cage.

  20. Design and development of an automated, portable and handheld tablet personal computer-based data acquisition system for monitoring electromyography signals during rehabilitation.

    PubMed

    Ahamed, Nizam U; Sundaraj, Kenneth; Poo, Tarn S

    2013-03-01

    This article describes the design of a robust, inexpensive, easy-to-use, small, and portable online electromyography acquisition system for monitoring electromyography signals during rehabilitation. This single-channel (one-muscle) system was connected via the universal serial bus port to a programmable Windows operating system handheld tablet personal computer for storage and analysis of the data by the end user. The raw electromyography signals were amplified in order to convert them to an observable scale. The inherent 50 Hz noise (the mains frequency in Malaysia) from power-line electromagnetic interference was then eliminated using a single hybrid IC notch filter. These signals were sampled by a signal processing module and converted into 24-bit digital data. An algorithm was developed and programmed to transmit the digital data to the computer, where it was reassembled and displayed using software. Finally, the device was furnished with a graphical user interface to display the streaming muscle-strength signal online on the handheld tablet personal computer. This battery-operated system was tested on the biceps brachii muscles of 20 healthy subjects, and the results were compared to those obtained with a commercial single-channel (one-muscle) electromyography acquisition system. For activities involving muscle contractions, the results obtained using the developed device were comparable, across various statistical parameters, to those obtained from a commercially available physiological signal monitoring system, for both male and female subjects. In addition, the key advantage of this developed system over conventional desktop personal computer-based acquisition systems is its portability due to the use of a tablet personal computer, in which the results are accessible graphically as well as stored in text (comma-separated value) form. PMID:23662342
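
    As a rough illustration of the mains-rejection stage described above (implemented in that work with an analogue hybrid IC notch filter), a minimal software equivalent is sketched below; the sampling rate, quality factor, and synthetic signal are assumptions for illustration, not values from the article.

      # Sketch: digital 50 Hz notch filtering of an EMG channel (illustrative only).
      # Assumed parameters: 1 kHz sampling rate, Q = 30; the article's filter is analogue.
      import numpy as np
      from scipy.signal import iirnotch, filtfilt

      fs = 1000.0          # assumed sampling rate in Hz
      f_mains = 50.0       # mains frequency in Malaysia
      b, a = iirnotch(f_mains, Q=30.0, fs=fs)

      t = np.arange(0, 2.0, 1.0 / fs)
      emg = 0.1 * np.random.randn(t.size)                        # stand-in for raw EMG
      emg_noisy = emg + 0.5 * np.sin(2 * np.pi * f_mains * t)    # add mains interference
      emg_clean = filtfilt(b, a, emg_noisy)                      # zero-phase notch filtering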

  1. Computers in the Instructional Process in Distance Education--Examining Relationships between Usage, Expectations and Software Acquisition.

    ERIC Educational Resources Information Center

    Clayton, Debbie; Arger, Geoff

    1989-01-01

    Presents results of a survey that was conducted to examine the use of computers in the instructional process of Australian distance education. Highlights include computer-managed learning; computer-marked tests; computer-aided learning; expert systems; and relationships between the type of computer usage, the instructors' expectations, and…

  2. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    SciTech Connect

    Castillo, S; Castillo, R; Castillo, E; Pan, T; Ibbott, G; Balter, P; Hobbs, B; Dai, J; Guerrero, T

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly by 27% and 28%, for the first cohort and second cohort respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross scan average at the 0.05 significance level. Artifact presence for both clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p-value<0.032 clinical, ρ=0.296, p-value<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p-value<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase
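
    The abstract above optimizes an artifact metric over the oversampled image data; one minimal way such a correlation-based selection can be posed as a shortest-path problem over couch positions is sketched below. The cost definition, array shapes, and the plain dynamic-programming formulation are assumptions for illustration, not the authors' implementation.

      # Sketch: pick one candidate image per couch position so that adjacent selections
      # are maximally correlated (shortest path via dynamic programming).
      import numpy as np

      def neg_correlation(a, b):
          """Cost between two candidate slabs: 1 - Pearson correlation (illustrative)."""
          a = a.ravel() - a.mean()
          b = b.ravel() - b.mean()
          return 1.0 - float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      def select_images(candidates):
          """candidates[p] is a list of 2D arrays acquired at couch position p."""
          n_pos = len(candidates)
          cost = [np.zeros(len(candidates[0]))]
          back = []
          for p in range(1, n_pos):
              prev, cur = candidates[p - 1], candidates[p]
              step = np.array([[cost[-1][i] + neg_correlation(prev[i], cur[j])
                                for i in range(len(prev))] for j in range(len(cur))])
              back.append(step.argmin(axis=1))
              cost.append(step.min(axis=1))
          # Trace the cheapest path back from the last couch position.
          picks = [int(np.argmin(cost[-1]))]
          for p in range(n_pos - 2, -1, -1):
              picks.append(int(back[p][picks[-1]]))
          return picks[::-1]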

  3. QR-on-a-chip: a computer-recognizable micro-pattern engraved microfluidic device for high-throughput image acquisition.

    PubMed

    Yun, Kyungwon; Lee, Hyunjae; Bang, Hyunwoo; Jeon, Noo Li

    2016-02-21

    This study proposes a novel way to achieve high-throughput image acquisition based on a computer-recognizable micro-pattern implemented on a microfluidic device. We integrated the QR code, a two-dimensional barcode system, onto the microfluidic device to simplify imaging of multiple ROIs (regions of interest). A standard QR code pattern was modified to arrays of cylindrical structures of polydimethylsiloxane (PDMS). Utilizing the recognition of the micro-pattern, the proposed system enables: (1) device identification, which allows referencing additional information of the device, such as device imaging sequences or the ROIs and (2) composing a coordinate system for an arbitrarily located microfluidic device with respect to the stage. Based on these functionalities, the proposed method performs one-step high-throughput imaging for data acquisition in microfluidic devices without further manual exploration and locating of the desired ROIs. In our experience, the proposed method significantly reduced the time for the preparation of an acquisition. We expect that the method will innovatively improve the prototype device data acquisition and analysis. PMID:26728124
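
    To illustrate the two functionalities described above (device identification and composing a stage coordinate system for an arbitrarily placed chip), a short sketch follows; the device registry, mark layout, ROI lists, and function names are hypothetical examples, not the authors' software.

      # Sketch: use a decoded device ID plus the detected stage positions of known
      # reference marks to map stored ROI coordinates into stage coordinates.
      import numpy as np

      DEVICE_REGISTRY = {  # hypothetical registry keyed by decoded QR content
          "CHIP-0001": {
              "marks_device": np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]]),  # mm
              "rois_device": np.array([[2.5, 3.0], [5.0, 7.5], [8.0, 1.0]]),     # mm
          }
      }

      def affine_from_marks(marks_device, marks_stage):
          """Least-squares affine map (matrix A, offset t) with stage = A @ device + t."""
          X = np.hstack([marks_device, np.ones((len(marks_device), 1))])
          params, *_ = np.linalg.lstsq(X, marks_stage, rcond=None)
          return params[:2].T, params[2]

      def rois_on_stage(device_id, marks_stage):
          entry = DEVICE_REGISTRY[device_id]
          A, t = affine_from_marks(entry["marks_device"], np.asarray(marks_stage))
          return entry["rois_device"] @ A.T + t

      # Example: device placed arbitrarily on the stage, marks located by image analysis.
      stage_marks = [[52.0, 18.0], [61.8, 19.7], [50.3, 27.8]]
      print(rois_on_stage("CHIP-0001", stage_marks))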

  4. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer...

  5. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer...

  6. 48 CFR 1552.239-103 - Acquisition of Energy Star Compliant Microcomputers, Including Personal Computers, Monitors and...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Compliant Microcomputers, Including Personal Computers, Monitors and Printers. 1552.239-103 Section 1552.239... Star Compliant Microcomputers, Including Personal Computers, Monitors and Printers. As prescribed in... Personal Computers, Monitors, and Printers (APR 1996) (a) The Contractor shall provide computer...

  7. The polar phase response property of monopolar ECG voltages using a Computer-Aided Design and Drafting (CAD)-based data acquisition system.

    PubMed

    Goswami, B; Mitra, M; Nag, B; Mitra, T K

    1993-11-01

    The present paper discusses a Computer-Aided Design and Drafting (CAD) based data acquisition and polar phase response study of the ECG. The scalar ECG does not show vector properties although such properties are embedded in it. In the present paper the polar phase response property of monopolar chest lead (V1 to V6) ECG voltages has been studied. A software tool has been used to evaluate the relative phase response of ECG voltages. The data acquisition of monopolar ECG records of chest leads V1 to V6 from the chart recorder has been done with the help of the AutoCAD application package. The spin harmonic constituents of ECG voltages are evaluated at each harmonic plane and the polar phase responses are studied at each plane. Some interesting results have been observed in some typical cases which are discussed in the paper. PMID:8307653

  8. Optimization of acquisition parameters and accuracy of target motion trajectory for four-dimensional cone-beam computed tomography with a dynamic thorax phantom.

    PubMed

    Shimohigashi, Yoshinobu; Araki, Fujio; Maruyama, Masato; Nakaguchi, Yuji; Nakato, Kengo; Nagasue, Nozomu; Kai, Yudai

    2015-01-01

    Our purpose in this study was to evaluate the performance of four-dimensional cone-beam computed tomography (4D-CBCT) and to optimize the acquisition parameters. We evaluated the relationship between the acquisition parameters of 4D-CBCT and the accuracy of the target motion trajectory using a dynamic thorax phantom. The target motion was created three dimensionally using target sizes of 2 and 3 cm, respiratory cycles of 4 and 8 s, and amplitudes of 1 and 2 cm. The 4D-CBCT data were acquired under two detector configurations: "small mode" and "medium mode". The projection data acquired with scan times ranging from 1 to 4 min were sorted into 2, 5, 10, and 15 phase bins. The accuracy of the measured target motion trajectories was evaluated by means of the root mean square error (RMSE) from the setup values. For the respiratory cycle of 4 s, the measured trajectories were within 2 mm of the setup values for all acquisition times and target sizes. Similarly, the errors for the respiratory cycle of 8 s were <4 mm. When we used 10 or more phase bins, the measured trajectory errors were within 2 mm of the setup values. The trajectory errors for the two detector configurations showed similar trends. The acquisition times for achieving an RMSE of 1 mm for target sizes of 2 and 3 cm were 2 and 1 min, respectively, for respiratory cycles of 4 s. The results obtained in this study enable optimization of the acquisition parameters for target size, respiratory cycle, and desired measurement accuracy. PMID:25287015
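
    The trajectory accuracy figure used above is a root mean square error against the programmed (setup) trajectory; a minimal sketch of that computation with made-up sample arrays follows (the sinusoidal setup trajectory and noise level are illustrative assumptions).

      # Sketch: RMSE between a measured target trajectory and the setup trajectory.
      import numpy as np

      def rmse(measured, setup):
          measured, setup = np.asarray(measured), np.asarray(setup)
          return float(np.sqrt(np.mean((measured - setup) ** 2)))

      phases = np.linspace(0.0, 2.0 * np.pi, 10, endpoint=False)   # 10 phase bins
      setup_si = 10.0 * np.sin(phases)                             # 1 cm amplitude, in mm
      measured_si = setup_si + np.random.normal(0.0, 0.5, phases.size)
      print(f"SI-axis RMSE: {rmse(measured_si, setup_si):.2f} mm")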

  9. Assisting People with Multiple Disabilities by Improving Their Computer Pointing Efficiency with an Automatic Target Acquisition Program

    ERIC Educational Resources Information Center

    Shih, Ching-Hsiang; Shih, Ching-Tien; Peng, Chin-Ling

    2011-01-01

    This study evaluated whether two people with multiple disabilities would be able to improve their pointing performance through an Automatic Target Acquisition Program (ATAP) and a newly developed mouse driver (i.e. a new mouse driver replaces standard mouse driver, and is able to monitor mouse movement and intercept click action). Initially, both…

  10. Firearm trigger assembly

    DOEpatents

    Crandall, David L.; Watson, Richard W.

    2010-02-16

    A firearm trigger assembly for use with a firearm includes a trigger mounted to a forestock of the firearm so that the trigger is movable between a rest position and a triggering position by a forwardly placed support hand of a user. An elongated trigger member operatively associated with the trigger operates a sear assembly of the firearm when the trigger is moved to the triggering position. An action release assembly operatively associated with the firearm trigger assembly and a movable assembly of the firearm prevents the trigger from being moved to the triggering position when the movable assembly is not in the locked position.

  11. Data acquisition system

    DOEpatents

    Shapiro, Stephen L.; Mani, Sudhindra; Atlas, Eugene L.; Cords, Dieter H. W.; Holbrook, Britt

    1997-01-01

    A data acquisition circuit for a particle detection system that allows for time tagging of particles detected by the system. The particle detection system screens out background noise and discriminates between hits from scattered and unscattered particles. The detection system can also be adapted to detect a wide variety of particle types. The detection system utilizes a particle detection pixel array, each pixel containing a back-biased PIN diode, and a data acquisition pixel array. Each pixel in the particle detection pixel array is in electrical contact with a pixel in the data acquisition pixel array. In response to a particle hit, the affected PIN diodes generate a current, which is detected by the corresponding data acquisition pixels. This current is integrated to produce a voltage across a capacitor, the voltage being related to the amount of energy deposited in the pixel by the particle. The current is also used to trigger a read of the pixel hit by the particle.
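
    The energy-to-voltage relation mentioned above (charge integrated onto a capacitor, V = Q / C) can be made concrete with a short calculation; the pair-creation energy for silicon and the capacitance below are assumed illustrative values, not parameters from the patent.

      # Sketch: relate energy deposited in a silicon PIN diode to the voltage
      # developed on the integrating capacitor (V = Q / C).
      E_PAIR_EV = 3.6          # mean energy per electron-hole pair in silicon, eV
      Q_E = 1.602e-19          # elementary charge, C
      C_INT = 100e-15          # assumed integration capacitance, F

      def pixel_voltage(deposited_energy_kev):
          n_pairs = deposited_energy_kev * 1e3 / E_PAIR_EV
          charge = n_pairs * Q_E
          return charge / C_INT

      for e_kev in (10.0, 60.0, 120.0):
          print(f"{e_kev:6.1f} keV  ->  {pixel_voltage(e_kev) * 1e3:.2f} mV")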

  12. The use of ultrasound in acquisition of the anterior pelvic plane in computer-assisted total hip replacement: a cadaver study.

    PubMed

    Parratte, S; Kilian, P; Pauly, V; Champsaur, P; Argenson, J-N A

    2008-02-01

    We have evaluated in vitro the accuracy of percutaneous and ultrasound registration as measured in terms of errors in rotation and version relative to the bony anterior pelvic plane in computer-assisted total hip replacement, and analysed the intra- and inter-observer reliability of manual or ultrasound registration. Four clinicians were asked to perform registration of the landmarks of the anterior pelvic plane on two cadavers. Registration was performed under four different conditions of acquisition. Errors in rotation were not significant. Version errors were significant with percutaneous methods (16.2 degrees; p < 0.001 and 19.25 degrees with surgical draping; p < 0.001), but not with the ultrasound acquisition (6.2 degrees, p = 0.13). Intra-observer repeatability was achieved for all the methods. Inter-observer analysis showed acceptable agreement in the sagittal but not in the frontal plane. Ultrasound acquisition of the anterior pelvic plane was more reliable in vitro than the cutaneous digitisation currently used. PMID:18256101

  13. The NA62 trigger system

    NASA Astrophysics Data System (ADS)

    Krivda, M.; NA62 Collaboration

    2013-08-01

    The main aim of the NA62 experiment (NA62 Technical Design Report, [1]) is to study ultra-rare Kaon decays. In order to select rare events over the overwhelming background, central systems with high performance, high bandwidth, flexibility and configurability are necessary, that minimize dead time while maximizing data collection reliability. The NA62 experiment consists of 12 sub-detector systems and several trigger and control systems, for a total channel count of less than 100,000. The GigaTracKer (GTK) has the largest number of channels (54,000), and the Liquid Krypton (LKr) calorimeter shares with it the largest raw data rate (19 GB/s). The NA62 trigger system works with 3 trigger levels. The first trigger level is based on a hardware central trigger unit, the so-called L0 Trigger Processor (L0TP), and Local Trigger Units (LTU), which are all located in the experimental cavern. The other two trigger levels are implemented in software and run on a computer farm located on the surface. The L0TP receives information from triggering sub-detectors asynchronously via Ethernet; it processes the information, and then transmits a final trigger decision synchronously to each sub-detector through the Trigger and Timing Control (TTC) system. The interface between L0TP and the TTC system, which is used for trigger and clock distribution, is provided by the Local Trigger Unit board (LTU). The LTU can work in two modes: global and stand-alone. In the global mode, the LTU provides an interface between L0TP and TTC system. In the stand-alone mode, the LTU can fully emulate L0TP and so provides an independent way for each sub-detector for testing or calibration purposes. In addition to the emulation functionality, a further functionality is implemented that allows to synchronize the clock of the LTU with the L0TP and the TTC system. For testing and debugging purposes, a Snap Shot Memory (SSM) interface is implemented, that can work

  14. Location and acquisition of objects in unpredictable locations. [a teleoperator system with a computer for manipulator control

    NASA Technical Reports Server (NTRS)

    Sword, A. J.; Park, W. T.

    1975-01-01

    A teleoperator system with a computer for manipulator control to combine the capabilities of both man and computer to accomplish a task is described. This system allows objects in unpredictable locations to be successfully located and acquired. By using a method of characterizing the work-space together with man's ability to plan a strategy and coarsely locate an object, the computer is provided with enough information to complete the tedious part of the task. In addition, the use of voice control is shown to be a useful component of the man/machine interface.

  15. EIAGRID: In-field optimization of seismic data acquisition by real-time subsurface imaging using a remote GRID computing environment.

    NASA Astrophysics Data System (ADS)

    Heilmann, B. Z.; Vallenilla Ferrara, A. M.

    2009-04-01

    The constant growth of contaminated sites, the unsustainable use of natural resources, and, last but not least, the hydrological risk related to extreme meteorological events and increased climate variability are major environmental issues of today. Finding solutions for these complex problems requires an integrated cross-disciplinary approach, providing a unified basis for environmental science and engineering. In computer science, grid computing is emerging worldwide as a formidable tool allowing distributed computation and data management with administratively-distant resources. Utilizing these modern High Performance Computing (HPC) technologies, the GRIDA3 project bundles several applications from different fields of geoscience aiming to support decision making for reasonable and responsible land use and resource management. In this abstract we present a geophysical application called EIAGRID that uses grid computing facilities to perform real-time subsurface imaging by on-the-fly processing of seismic field data and fast optimization of the processing workflow. Even though, seismic reflection profiling has a broad application range spanning from shallow targets in a few meters depth to targets in a depth of several kilometers, it is primarily used by the hydrocarbon industry and hardly for environmental purposes. The complexity of data acquisition and processing poses severe problems for environmental and geotechnical engineering: Professional seismic processing software is expensive to buy and demands large experience from the user. In-field processing equipment needed for real-time data Quality Control (QC) and immediate optimization of the acquisition parameters is often not available for this kind of studies. As a result, the data quality will be suboptimal. In the worst case, a crucial parameter such as receiver spacing, maximum offset, or recording time turns out later to be inappropriate and the complete acquisition campaign has to be repeated. The

  16. The CMS High Level Trigger System: Experience and Future Development

    NASA Astrophysics Data System (ADS)

    Bauer, G.; Behrens, U.; Bowen, M.; Branson, J.; Bukowiec, S.; Cittolin, S.; Coarasa, J. A.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Flossdorf, A.; Gigi, D.; Glege, F.; Gomez-Reino, R.; Hartl, C.; Hegeman, J.; Holzner, A.; Hwong, Y. L.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Polese, G.; Racz, A.; Raginel, O.; Sakulin, H.; Sani, M.; Schwick, C.; Shpakov, D.; Simon, S.; Spataru, A. C.; Sumorok, K.

    2012-12-01

    The CMS experiment at the LHC features a two-level trigger system. Events accepted by the first level trigger, at a maximum rate of 100 kHz, are read out by the Data Acquisition system (DAQ), and subsequently assembled in memory in a farm of computers running a software high-level trigger (HLT), which selects interesting events for offline storage and analysis at a rate of order a few hundred Hz. The HLT algorithms consist of sequences of offline-style reconstruction and filtering modules, executed on a farm of O(10000) CPU cores built from commodity hardware. Experience from the operation of the HLT system in the collider run 2010/2011 is reported. The current architecture of the CMS HLT, its integration with the CMS reconstruction framework and the CMS DAQ, are discussed in the light of future development. The possible short- and medium-term evolution of the HLT software infrastructure to support extensions of the HLT computing power, and to address remaining performance and maintenance issues, are discussed.

  17. Standard GANIL data acquisition system

    NASA Astrophysics Data System (ADS)

    Raine, B.; Tripon, M.; Piquet, B.

    1994-02-01

    We report on the GANIL general data acquisition system based on VME crates distributed in several experimental areas, linked to a VAX cluster by optical fibers and Ethernet for control and storage. Acquisition buses are CAMAC, FERA and VXI. We present the system configuration, the experiment description procedure, and the adaptation for VXI and remote controls for the 4π INDRA detector. We also discuss the INDRA asynchronous electronics trigger.

  18. Computer system design description for SY-101 hydrogen mitigation test project data acquisition and control system (DACS-1)

    SciTech Connect

    Ermi, A.M.

    1997-05-01

    Description of the Proposed Activity/REPORTABLE OCCURRENCE or PIAB: This ECN changes the computer system design description support document describing the computer system used to control, monitor and archive the processes and outputs associated with the Hydrogen Mitigation Test Pump installed in SY-101. There is no new activity or procedure associated with the updating of this reference document. The updating of this computer system design description maintains an agreed-upon documentation program initiated within the test program and carried into operations at time of turnover to maintain configuration control as outlined by design authority practicing guidelines. There are no new credible failure modes associated with the updating of information in a support description document. The failure analysis of each change was reviewed at the time of implementation of the Systems Change Request for all the processes changed. This document simply provides a history of implementation and current system status.

  19. Early Intervention with Children of Dyslexic Parents: Effects of Computer-Based Reading Instruction at Home on Literacy Acquisition

    ERIC Educational Resources Information Center

    Regtvoort, Anne G. F. M.; van der Leij, Aryan

    2007-01-01

    The hereditary basis of dyslexia makes it possible to identify children at risk early on. Pre-reading children genetically at risk received during 14 weeks a home- and computer-based training in phonemic awareness and letter-sound relationships in the context of reading instruction. At posttest training effects were found for both phonemic…

  20. Peer-Monitoring vs. Micro-Script Fading for Enhancing Knowledge Acquisition when Learning in Computer-Supported Argumentation Environments

    ERIC Educational Resources Information Center

    Bouyias, Yannis; Demetriadis, Stavros

    2012-01-01

    Research on computer-supported collaborative learning (CSCL) has strongly emphasized the value of providing student support with micro-scripts, which should withdraw (fade-out) allowing students to practice the acquired skills. However, research on fading shows conflicting results and some researchers suggest that the impact of fading is enhanced…

  1. The non-invasive Berlin Brain-Computer Interface: fast acquisition of effective performance in untrained subjects.

    PubMed

    Blankertz, Benjamin; Dornhege, Guido; Krauledat, Matthias; Müller, Klaus-Robert; Curio, Gabriel

    2007-08-15

    Brain-Computer Interface (BCI) systems establish a direct communication channel from the brain to an output device. These systems use brain signals recorded from the scalp, the surface of the cortex, or from inside the brain to enable users to control a variety of applications. BCI systems that bypass conventional motor output pathways of nerves and muscles can provide novel control options for paralyzed patients. One classical approach to establish EEG-based control is to set up a system that is controlled by a specific EEG feature which is known to be susceptible to conditioning and to let the subjects learn the voluntary control of that feature. In contrast, the Berlin Brain-Computer Interface (BBCI) uses well established motor competencies of its users and a machine learning approach to extract subject-specific patterns from high-dimensional features optimized for detecting the user's intent. Thus the long subject training is replaced by a short calibration measurement (20 min) and machine learning (1 min). We report results from a study in which 10 subjects, who had no or little experience with BCI feedback, controlled computer applications by voluntary imagination of limb movements: these intentions led to modulations of spontaneous brain activity specifically, somatotopically matched sensorimotor 7-30 Hz rhythms were diminished over pericentral cortices. The peak information transfer rate was above 35 bits per minute (bpm) for 3 subjects, above 23 bpm for two, and above 12 bpm for 3 subjects, while one subject could achieve no BCI control. Compared to other BCI systems which need longer subject training to achieve comparable results, we propose that the key to quick efficiency in the BBCI system is its flexibility due to complex but physiologically meaningful features and its adaptivity which respects the enormous inter-subject variability. PMID:17475513
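
    The BBCI itself relies on subject-specific spatial filtering and machine learning beyond what is stated above; purely to illustrate the calibrate-then-classify workflow the abstract describes, a much simplified generic motor-imagery pipeline (band-power features in the 7-30 Hz range plus a linear classifier) is sketched below. The sampling rate, band edges, and synthetic data are assumptions, not the paper's method.

      # Sketch: simplified motor-imagery pipeline -- band-pass to the sensorimotor band,
      # log band-power features per channel, linear classifier.
      import numpy as np
      from scipy.signal import butter, filtfilt
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      fs = 100.0
      b, a = butter(4, [7.0, 30.0], btype="bandpass", fs=fs)

      def band_power_features(epochs):
          """epochs: (n_trials, n_channels, n_samples) -> log variance per channel."""
          filtered = filtfilt(b, a, epochs, axis=-1)
          return np.log(filtered.var(axis=-1) + 1e-12)

      rng = np.random.default_rng(0)
      calib = rng.standard_normal((40, 16, 400))       # synthetic calibration epochs
      labels = rng.integers(0, 2, size=40)             # imagined left vs. right hand
      clf = LinearDiscriminantAnalysis().fit(band_power_features(calib), labels)

      new_epoch = rng.standard_normal((1, 16, 400))
      print("predicted class:", int(clf.predict(band_power_features(new_epoch))[0]))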

  2. XSTREAM: A Highly Efficient High Speed Real-time Satellite Data Acquisition and Processing System using Heterogeneous Computing

    NASA Astrophysics Data System (ADS)

    Pramod Kumar, K.; Mahendra, P.; Ramakrishna rReddy, V.; Tirupathi, T.; Akilan, A.; Usha Devi, R.; Anuradha, R.; Ravi, N.; Solanki, S. S.; Achary, K. K.; Satish, A. L.; Anshu, C.

    2014-11-01

    In the last decade, the remote sensing community has observed a significant growth in number of satellites, sensors and their resolutions, thereby increasing the volume of data to be processed each day. Satellite data processing is a complex and time consuming activity. It consists of various tasks, such as decode, decrypt, decompress, radiometric normalization, stagger corrections, ephemeris data processing for geometric corrections etc., and finally writing of the product in the form of an image file. Each task in the processing chain is sequential in nature and has different computing needs. Conventionally the processes are cascaded in a well organized workflow to produce the data products, which are executed on general purpose high-end servers / workstations in an offline mode. Hence, these systems are considered to be ineffective for real-time applications that require quick response and just-in-time decision making such as disaster management, home land security and so on. This paper discusses a novel approach to process the data online (as the data is being acquired) using a heterogeneous computing platform namely XSTREAM which has COTS hardware of CPUs, GPUs and FPGA. This paper focuses on the process architecture, re-engineering aspects and mapping of tasks to the right computing device within the XSTREAM system, which makes it an ideal cost-effective platform for acquiring, processing satellite payload data in real-time and displaying the products in original resolution for quick response. The system has been tested for IRS CARTOSAT and RESOURCESAT series of satellites which have maximum data downlink speed of 210 Mbps.

  3. Using Computers to Survey the Epidemiological, Environmental and Genetic Factors Involved in the Process of Bacteria Resistance Acquisition

    PubMed Central

    Baccala, Luiz Antonio; Nicolelis, Miguel A.L.

    1989-01-01

    The sensitivity behavior in time of several species (S. aureus, E. coli, K. pneumoniae and P. mirabilis, in a total of 16334 positive cultures collected at our hospital from July 1981 to December 1986) to amikacin and gentamicin is shown to be periodic. The implications of this finding, and parameters, both epidemiological and genetic, that might be of relevance in its understanding, are discussed as being necessary characteristics of a nosocomial survey-and-control computer system in which time-series analysis techniques are of central importance.

  4. Metrology data reduction system: PDP-11/34 computer system. [MOUNT, for thermistor mount data acquisition, in FORTRAN IV

    SciTech Connect

    Cable, J.W.

    1980-04-01

    A PDP-11/34 computer system was installed to acquire, correct, scale, display, store, and process data obtained by microwave and dc area coupler controllers. The system is used to calibrate power sensors and impedance and dc measurement standards on a real-time basis. The microwave data input is from a digital voltmeter and scanner arrangement, which also may be controlled from a P-ROM control board. The dc data input is from a passive device interface or from a terminal keyboard. 5 figures, 8 tables.

  5. Jefferson Lab's Distributed Data Acquisition

    SciTech Connect

    Trent Allison; Thomas Powers

    2006-05-01

    Jefferson Lab's Continuous Electron Beam Accelerator Facility (CEBAF) occasionally experiences fast intermittent beam instabilities that are difficult to isolate and result in downtime. The Distributed Data Acquisition (Dist DAQ) system is being developed to detect and quickly locate such instabilities. It will consist of multiple Ethernet based data acquisition chassis distributed throughout the seven-eighths-of-a-mile CEBAF site. Each chassis will monitor various control system signals that are only available locally and/or monitored by systems with small bandwidths that cannot identify fast transients. The chassis will collect data at rates up to 40 Msps in circular buffers that can be frozen and unrolled after an event trigger. These triggers will be derived from signals such as periodic timers or accelerator faults and be distributed via a custom fiber optic event trigger network. This triggering scheme will allow all the data acquisition chassis to be triggered simultaneously and provide a snapshot of relevant CEBAF control signals. The data will then be automatically analyzed for frequency content and transients to determine if and where instabilities exist.
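
    The freeze-and-unroll behavior of the circular buffers described above can be illustrated with a short sketch; the buffer depth and sample source are assumptions for illustration, not parameters of the Dist DAQ hardware.

      # Sketch: circular buffer that keeps the most recent samples and, on a trigger,
      # "freezes" and returns them unrolled into time order.
      import numpy as np

      class RingBuffer:
          def __init__(self, depth):
              self.buf = np.zeros(depth)
              self.idx = 0
              self.frozen = None

          def push(self, sample):
              if self.frozen is None:                  # ignore new data once frozen
                  self.buf[self.idx] = sample
                  self.idx = (self.idx + 1) % self.buf.size

          def trigger(self):
              """Freeze the buffer and unroll it so the oldest sample comes first."""
              self.frozen = np.roll(self.buf, -self.idx).copy()
              return self.frozen

      rb = RingBuffer(depth=8)
      for i in range(100):
          rb.push(float(i))
          if i == 20:                                  # stand-in for a fault trigger
              print(rb.trigger())                      # last 8 samples, oldest first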

  6. The D0 level three data acquisition system

    SciTech Connect

    D. Chapin et al.

    2004-03-17

    The DZERO experiment located at Fermilab has recently started RunII with an upgraded detector. The RunII physics program requires the Data Acquisition to read out the detector at a rate of 1 kHz. Event fragments, totaling 250 KB, are read out from approximately 60 front-end crates and sent to a particular farm node for Level 3 Trigger processing. A scalable system, capable of complex event routing, has been designed and implemented based on commodity components: VMIC 7750 Single Board Computers for readout, a Cisco 6509 switch for data flow, and close to 100 Linux-based PCs for high-level event filtering.
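
    The core of the readout described above is event building: collecting one fragment per front-end crate for each event number and routing the complete event to a farm node. A minimal sketch follows; the node count, modulo routing rule, and fragment sizes are assumptions for illustration, not the DZERO implementation.

      # Sketch: event building -- gather fragments keyed by event number, then
      # dispatch the complete event to a farm node.
      from collections import defaultdict

      N_CRATES = 60
      N_NODES = 100
      pending = defaultdict(dict)          # event_number -> {crate_id: fragment}

      def dispatch_to_node(node, event_number, event):
          size_kb = sum(len(f) for f in event.values()) / 1024
          print(f"event {event_number}: {size_kb:.0f} KB -> node {node}")

      def receive_fragment(event_number, crate_id, fragment):
          pending[event_number][crate_id] = fragment
          if len(pending[event_number]) == N_CRATES:   # event is complete
              event = pending.pop(event_number)
              node = event_number % N_NODES            # simple assumed routing rule
              dispatch_to_node(node, event_number, event)

      # Example: all crates report for event 42 (~4 KB per fragment).
      for crate in range(N_CRATES):
          receive_fragment(42, crate, bytes(4096))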

  7. Radiation dose reduction in computed tomography (CT) using a new implementation of wavelet denoising in low tube current acquisitions

    NASA Astrophysics Data System (ADS)

    Tao, Yinghua; Brunner, Stephen; Tang, Jie; Speidel, Michael; Rowley, Howard; VanLysel, Michael; Chen, Guang-Hong

    2011-03-01

    Radiation dose reduction remains at the forefront of research in computed tomography. X-ray tube parameters such as tube current can be lowered to reduce dose; however, images become prohibitively noisy when the tube current is too low. Wavelet denoising is one of many noise reduction techniques. However, traditional wavelet techniques have the tendency to create an artificial noise texture, due to the nonuniform denoising across the image, which is undesirable from a diagnostic perspective. This work presents a new implementation of wavelet denoising that is able to achieve noise reduction, while still preserving spatial resolution. Further, the proposed method has the potential to improve those unnatural noise textures. The technique was tested on both phantom and animal datasets (Catphan phantom and time-resolved swine heart scan) acquired on a GE Discovery VCT scanner. A number of tube currents were used to investigate the potential for dose reduction.
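
    For contrast with the new implementation above, the conventional wavelet denoising it improves upon can be sketched as plain soft thresholding of detail coefficients; the wavelet, decomposition level, threshold, and synthetic image below are illustrative assumptions and this is not the authors' spatially uniform method.

      # Sketch: conventional wavelet denoising of a noisy CT slice by soft thresholding
      # of detail coefficients (PyWavelets).
      import numpy as np
      import pywt

      def wavelet_denoise(image, wavelet="db4", level=3, threshold=20.0):
          coeffs = pywt.wavedec2(image, wavelet, level=level)
          denoised = [coeffs[0]]                       # keep the approximation untouched
          for detail_level in coeffs[1:]:
              denoised.append(tuple(pywt.threshold(d, threshold, mode="soft")
                                    for d in detail_level))
          return pywt.waverec2(denoised, wavelet)

      noisy = 100.0 + 30.0 * np.random.randn(256, 256)   # stand-in for a low-mA slice
      clean = wavelet_denoise(noisy)
      print("std before / after:", noisy.std().round(1), clean.std().round(1))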

  8. An on-line method for the acquisition of medical information for computer processing as applied to radiotherapy.

    PubMed

    Möller, T R; Gustafsson, T

    1977-06-01

    Based on a structured medical record, specially designed for patients with malignant disease, an on-line data capture system has been developed. This enables the collection of virtually any type of information contained in the patient's case notes. The structure of the record is described, with actual examples. The record is typed on a typewriter terminal linked to a mini-computer. Data is recorded as code + heading + value string. The headings are identified automatically, and an internal code generated, describing the type of information. Record keeping according to the principles described was introduced in clinical routine at the department in 1971. Data collection was implemented later that year, using an off-line magnetic tape encoder (IBM MT72). The system has been developed further and converted to a versatile on-line system. The data base, collected with these systems, now contains data on about 20,000 patients. PMID:862391

  9. Myofascial trigger point pain.

    PubMed

    Jaeger, Bernadette

    2013-01-01

    Myofascial trigger point pain is an extremely prevalent cause of persistent pain disorders in all parts of the body, not just the head, neck, and face. Features include deep aching pain in any structure, referred from focally tender points in taut bands of skeletal muscle (the trigger points). Diagnosis depends on accurate palpation with 2-4 kg/cm2 of pressure for 10 to 20 seconds over the suspected trigger point to allow the referred pain pattern to develop. In the head and neck region, cervical muscle trigger points (key trigger points) often incite and perpetuate trigger points (satellite trigger points) and referred pain from masticatory muscles. Management requires identification and control of as many perpetuating factors as possible (posture, body mechanics, psychological stress or depression, poor sleep or nutrition). Trigger point therapies such as spray and stretch or trigger point injections are best used as adjunctive therapy. PMID:24864393

  10. HYPERCP data acquisition system

    SciTech Connect

    Kaplan, D.M.; Luebke, W.R.; Chakravorty, A.

    1997-12-31

    For the HyperCP experiment at Fermilab, we have assembled a data acquisition system that records on up to 45 Exabyte 8505 tape drives in parallel at up to 17 MB/s. During the beam spill, data are acquired from the front-end digitization systems at ~60 MB/s via five parallel data paths. The front-end systems achieve a typical readout deadtime of ~1 μs per event, allowing operation at a 75-kHz trigger rate with ≲30% deadtime. Event building and tape-writing are handled by 15 Motorola MVME167 processors in 5 VME crates.

  11. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
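
    A simplified software analogue of this scheme (a sketch under assumed parameters, not the patented circuit) re-measures the quiescent baseline periodically, offsets the thresholds accordingly, and uses a qualification-width counter so that a single noisy sample cannot start a recording:

        import numpy as np

        def adaptive_trigger(samples, offset=5.0, recalc_every=1000, qual_width=3):
            """Return sample indices at which a recording event would start.
            The threshold tracks the drifting quiescent level; a sample must
            exceed it qual_width times in a row before a trigger is declared."""
            baseline = np.mean(samples[:recalc_every])   # initial quiescent level
            threshold = baseline + offset
            run, triggers = 0, []
            for i, s in enumerate(samples):
                if i and i % recalc_every == 0:
                    # Periodic re-measurement of the quiescent level.
                    baseline = np.mean(samples[i - recalc_every:i])
                    threshold = baseline + offset
                run = run + 1 if s > threshold else 0
                if run == qual_width:
                    triggers.append(i)
                    run = 0
            return triggers

    The offset, re-computation interval and qualification width here are placeholders; the patent describes both time-based and counter-based criteria for when to re-compute the thresholds.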

  12. Asthma triggers (image)

    MedlinePlus

    ... asthma triggers are mold, pets, dust, grasses, pollen, cockroaches, odors from chemicals, and smoke from cigarettes.

  13. Asthma triggers (image)

    MedlinePlus

    ... common asthma triggers are mold, pets, dust, grasses, pollen, cockroaches, odors from chemicals, and smoke from cigarettes.

  14. Software for implementing trigger algorithms on the upgraded CMS Global Trigger System

    NASA Astrophysics Data System (ADS)

    Matsushita, Takashi; Arnold, Bernhard

    2015-12-01

    The Global Trigger is the final step of the CMS Level-1 Trigger and implements a trigger menu, a set of selection requirements applied to the final list of trigger objects. The conditions for trigger object selection, with possible topological requirements on multi-object triggers, are combined by simple combinatorial logic to form the algorithms. The LHC has resumed its operation in 2015; the collision energy has been increased to 13 TeV, with the luminosity expected to go up to 2×10^34 cm^-2 s^-1. The CMS Level-1 trigger system will be upgraded to improve its performance for selecting interesting physics events and to operate within the predefined data-acquisition rate in the challenging environment expected at LHC Run 2. The Global Trigger will be re-implemented on modern FPGAs on an Advanced Mezzanine Card in a MicroTCA crate. The upgraded system will benefit from the ability to process complex algorithms with DSP slices and from increased processing resources with optical links running at 10 Gbit/s, enabling more algorithms at a time than previously possible and allowing CMS to be more flexible in how it handles the trigger bandwidth. In order to handle the increased complexity of the trigger menu implemented on the upgraded Global Trigger, a new set of software tools has been developed. The software allows a physicist to define a menu with analysis-like triggers using an intuitive user interface. The menu is then realised on FPGAs with further software processing, instantiating predefined firmware blocks. The design and implementation of the software for preparing a menu for the upgraded CMS Global Trigger system are presented.

  15. Triggered Jovian radio emissions

    NASA Technical Reports Server (NTRS)

    Calvert, W.

    1985-01-01

    Certain Jovian radio emissions seem to be triggered from outside, by much weaker radio waves from the sun. Recently found in the Voyager observations near Jupiter, such triggering occurs at hectometric wavelengths during the arrival of solar radio bursts, with the triggered emissions sometimes lasting more than an hour as they slowly drift toward higher frequencies. Like the previous discovery of similar triggered emissions at the earth, this suggests that Jupiter's emissions might also originate from natural radio lasers.

  16. The high-level trigger of ALICE

    NASA Astrophysics Data System (ADS)

    Tilsner, H.; Alt, T.; Aurbakken, K.; Grastveit, G.; Helstrup, H.; Lindenstruth, V.; Loizides, C.; Nystrand, J.; Roehrich, D.; Skaali, B.; Steinbeck, T.; Ullaland, K.; Vestbo, A.; Vik, T.

    One of the main tracking detectors of the forthcoming ALICE Experiment at the LHC is a cylindrical Time Projection Chamber (TPC) with an expected data volume of about 75 MByte per event. This data volume, in combination with the presumed maximum bandwidth of 1.2 GByte/s to the mass storage system, would limit the maximum event rate to 20 Hz. In order to achieve higher event rates, online data processing has to be applied. This implies either the detection and read-out of only those events which contain interesting physical signatures or an efficient compression of the data by modeling techniques. In order to cope with the anticipated data rate, massive parallel computing power is required. It will be provided in the form of a clustered farm of SMP nodes, based on off-the-shelf PCs, which are connected with a high-bandwidth, low-overhead network. This High-Level Trigger (HLT) will be able to process a data rate of 25 GByte/s online. The front-end electronics of the individual sub-detectors is connected to the HLT via an optical link and a custom PCI card which is mounted in the clustered PCs. The PCI card is equipped with an FPGA necessary for the implementation of the PCI-bus protocol. Therefore, this FPGA can also be used to assist the host processor with first-level processing. The first-level processing done on the FPGA includes conventional cluster-finding for low multiplicity events and local track finding based on the Hough Transformation of the raw data for high multiplicity events. PACS: 07.05.-t Computers in experimental physics - 07.05.Hd Data acquisition: hardware and software - 29.85.+c Computer data analysis

  17. An enhanced multiwavelength ultraviolet biological trigger lidar

    NASA Astrophysics Data System (ADS)

    Achey, Alexander; Bufton, Jack; Dawson, Jeffrey; Huang, Wen; Lee, Sangmin; Mehta, Nikhil; Prasad, Coorg R.

    2004-12-01

    A compact Ultraviolet Biological Trigger Lidar (UBTL) instrument for detection and discrimination of bio-warfare-agent (BWA) simulant aerosol clouds was developed by us [Prasad et al., 2004] using a 5 mW, 375 nm semiconductor UV optical source (SUVOS) laser diode. It underwent successful field tests at Dugway Proving Ground and demonstrated measurement ranges of over 300 m for elastic scattering and >100 m for fluorescence. The UBTL was modified during mid-2004 to enhance its detection and discrimination performance with increased range of operation and sensitivity. The major optical modifications were: (1) an increase in the telescope collection aperture to 200 mm diameter; (2) the addition of 266 nm and 977 nm laser transmitters; and (3) the addition of three detection channels for 266 nm and 977 nm elastic backscatter and for fluorescence centered at 330 nm. Also, the commercial electronics of the original UBTL were replaced with a multi-channel field programmable gate array (FPGA) chip for laser diode modulation and data acquisition that allowed simultaneous and continuous operation of the UBTL sensor on all of its transmitter and receiver wavelengths. A notebook computer was added for data display and storage. Field tests were performed during July 2004 at the Edgewood Chemical and Biological Center in Maryland to establish the enhanced performance of the UBTL subsystems. Results of these tests are presented and discussed.

  18. Computing Competition for Light in the GREENLAB Model of Plant Growth: A Contribution to the Study of the Effects of Density on Resource Acquisition and Architectural Development

    PubMed Central

    Cournède, Paul-Henry; Mathieu, Amélie; Houllier, François; Barthélémy, Daniel; de Reffye, Philippe

    2008-01-01

    Background and Aims The dynamical system of plant growth GREENLAB was originally developed for individual plants, without explicitly taking into account interplant competition for light. Inspired by the competition models developed in the context of forest science for mono-specific stands, we propose to adapt the method of crown projection onto the x–y plane to GREENLAB, in order to study the effects of density on resource acquisition and on architectural development. Methods The empirical production equation of GREENLAB is extrapolated to stands by computing the exposed photosynthetic foliage area of each plant. The computation is based on the combination of Poisson models of leaf distribution for all the neighbouring plants whose crown projection surfaces overlap. To study the effects of density on architectural development, we link the proposed competition model to the model of interaction between functional growth and structural development introduced by Mathieu (2006, PhD Thesis, Ecole Centrale de Paris, France). Key Results and Conclusions The model is applied to mono-specific field crops and forest stands. For high-density crops at full cover, the model is shown to be equivalent to the classical equation of field crop production ( Howell and Musick, 1985, in Les besoins en eau des cultures; Paris: INRA Editions). However, our method is more accurate at the early stages of growth (before cover) or in the case of intermediate densities. It may potentially account for local effects, such as uneven spacing, variation in the time of plant emergence or variation in seed biomass. The application of the model to trees illustrates the expression of plant plasticity in response to competition for light. Density strongly impacts on tree architectural development through interactions with the source–sink balances during growth. The effects of density on tree height and radial growth that are commonly observed in real stands appear as emerging properties of the model

  19. Operational experience with the ALICE High Level Trigger

    NASA Astrophysics Data System (ADS)

    Szostak, Artur

    2012-12-01

    The ALICE HLT is a dedicated real-time system for online event reconstruction and triggering. Its main goal is to reduce the raw data volume read from the detectors by an order of magnitude, to fit within the available data acquisition bandwidth. This is accomplished by a combination of data compression and triggering. When HLT is enabled, data is recorded only for events selected by HLT. The combination of both approaches allows for flexible data reduction strategies. Event reconstruction places a high computational load on HLT. Thus, a large dedicated computing cluster is required, comprising 248 machines, all interconnected with InfiniBand. Running a large system like HLT in production mode proves to be a challenge. During the 2010 pp and Pb-Pb data-taking period, many problems were experienced that led to a sub-optimal operational efficiency. Lessons were learned and certain crucial changes were made to the architecture and software in preparation for the 2011 Pb-Pb run, in which HLT had a vital role performing data compression for ALICE's largest detector, the TPC. An overview of the status of the HLT and experience from the 2010/2011 production runs are presented. Emphasis is given to the overall performance, showing an improved efficiency and stability in 2011 compared to 2010, attributed to the significant improvements made to the system. Further opportunities for improvement are identified and discussed.

  20. The CMS High-Level Trigger

    SciTech Connect

    Covarelli, R.

    2009-12-17

    At the startup of the LHC, the CMS data acquisition is expected to be able to sustain an event readout rate of up to 100 kHz from the Level-1 trigger. These events will be read into a large processor farm which will run the 'High-Level Trigger' (HLT) selection algorithms and will output a rate of about 150 Hz for permanent data storage. In this report HLT performances are shown for selections based on muons, electrons, photons, jets, missing transverse energy, τ leptons and b quarks: expected efficiencies, background rates and CPU time consumption are reported, as well as relaxation criteria foreseen for an LHC startup instantaneous luminosity.

  1. Immersive, Interactive, Web-Enabled Computer Simulation as a Trigger for Learning: The next Generation of Problem-Based Learning in Educational Leadership

    ERIC Educational Resources Information Center

    Mann, Dale; Reardon, R. M.; Becker, J. D.; Shakeshaft, C.; Bacon, Nicholas

    2011-01-01

    This paper describes the use of advanced computer technology in an innovative educational leadership program. This program integrates full-motion video scenarios that simulate the leadership challenges typically faced by principals over the course of a full school year. These scenarios require decisions that are then coupled to consequences and…

  2. The digital trigger system for the RED-100 detector

    NASA Astrophysics Data System (ADS)

    Naumov, P. P.; Akimov, D. Yu.; Belov, V. A.; Bolozdynya, A. I.; Efremenko, Yu. V.; Kaplin, V. A.

    2015-12-01

    A system for forming a trigger for the liquid xenon detector RED-100 has been developed. The trigger can be generated for all types of events that the detector needs for calibration and data acquisition, including events with a single ionization electron. In the system, a mechanism of event detection is implemented according to which a timestamp and an event type are assigned to each event. A trigger system is required in systems searching for rare events in order to select and keep only the necessary information from the ADC array. The specifications and implementation of the trigger unit, which provides a high efficiency of response even to low-energy events, are considered.

  3. The Level 0 Trigger Processor for the NA62 experiment

    NASA Astrophysics Data System (ADS)

    Chiozzi, S.; Gamberini, E.; Gianoli, A.; Mila, G.; Neri, I.; Petrucci, F.; Soldi, D.

    2016-07-01

    In the NA62 experiment at CERN, the intense flux of particles requires a high-performance trigger for the data acquisition system. A Level 0 Trigger Processor (L0TP) was realized, performing the event selection based on trigger primitives coming from sub-detectors and reducing the trigger rate from 10 MHz to 1 MHz. The L0TP is based on a commercial FPGA device and has been implemented in two different solutions. The performance of the two implementations is highlighted and compared.

  4. The digital trigger system for the RED-100 detector

    SciTech Connect

    Naumov, P. P. Akimov, D. Yu.; Belov, V. A.; Bolozdynya, A. I.; Efremenko, Yu. V.; Kaplin, V. A.

    2015-12-15

    A system for forming a trigger for the liquid xenon detector RED-100 has been developed. The trigger can be generated for all types of events that the detector needs for calibration and data acquisition, including events with a single ionization electron. In the system, a mechanism of event detection is implemented according to which a timestamp and an event type are assigned to each event. A trigger system is required in systems searching for rare events in order to select and keep only the necessary information from the ADC array. The specifications and implementation of the trigger unit, which provides a high efficiency of response even to low-energy events, are considered.

  5. Stay away from asthma triggers

    MedlinePlus

    Asthma triggers - stay away from; Asthma triggers - avoiding; Reactive airway disease - triggers; Bronchial asthma - triggers ... to them. Have someone who does not have asthma cut the grass, or wear a facemask if ...

  6. Commissioning of the ALICE data acquisition system

    NASA Astrophysics Data System (ADS)

    Anticic, T.; Barroso, V.; Carena, F.; Carena, W.; Chapeland, S.; Cobanoglu, O.; Dénes, E.; Divià, R.; Fuchs, U.; Kiss, T.; Makhlyueva, I.; Ozok, F.; Roukoutakis, F.; Schossmaier, K.; Soós, C.; Vyvre, P. V.; Vergara, S.

    2008-07-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). A flexible, large-bandwidth Data Acquisition System (DAQ) has been designed and deployed to collect sufficient statistics in the short running time foreseen per year for heavy ions and to accommodate very different requirements originating from the 18 sub-detectors. The Data Acquisition and Test Environment (DATE) is the software framework handling the data from the detector electronics up to the mass storage. This paper reviews the DAQ software and hardware architecture, including the latest features of the final design, such as the handling of the numerous calibration procedures in a common framework. We also discuss the large-scale tests conducted on the real hardware to assess the standalone DAQ performance, its interfaces with the other online systems and the extensive commissioning performed in order to be ready for cosmics data taking scheduled to start in November 2007. The test protocols followed to integrate and validate each sub-detector with DAQ and Trigger hardware synchronized by the Experiment Control System are described. Finally, we give an overview of the experiment logbook, and some operational aspects of the deployment of our computing facilities. The implementation of a Transient Data Storage able to cope with the 1.25 GB/s recorded by the event-building machines and the data quality monitoring framework are covered in separate papers.

  7. Undergraduate Laboratory Data Acquisition With a Microcomputer.

    ERIC Educational Resources Information Center

    Weston, Kenneth C.

    1981-01-01

    Describes a flexible, multichannel, highly accurate, digital data acquisition system for use with a microcomputer. Includes a description of instrumentation for computer data acquisition, data acquisition systems, software, uses in the curriculum, library interaction and action, and use in a mechanical engineering laboratory. (DS)

  8. Acquisition strategies

    SciTech Connect

    Zimmer, M.J.; Lynch, P.W. )

    1993-11-01

    Acquiring projects takes careful planning, research and consideration. Picking the right opportunities and avoiding the pitfalls will lead to a more valuable portfolio. This article describes the steps to take in evaluating an acquisition and what items need to be considered in an evaluation.

  9. BTeV trigger/DAQ innovations

    SciTech Connect

    Votava, Margaret; /Fermilab

    2005-05-01

    BTeV was a proposed high-energy physics (HEP) collider experiment designed for the study of B-physics and CP Violation at the Tevatron at Fermilab. BTeV included a large-scale, high-speed trigger and data acquisition (DAQ) system, reading data from the detector at 500 Gbytes/sec and writing data to mass storage at a rate of 200 Mbytes/sec. The design of the trigger/DAQ system was innovative while remaining realistic in terms of technical feasibility, schedule and cost. This paper will give an overview of the BTeV trigger/DAQ architecture, highlight some of the technical challenges, and describe the approach that was used to solve these challenges.

  10. Predicting the image noise level of prospective ECG-triggered coronary computed tomography angiography: quantitative measurement of thoracic component versus body mass index.

    PubMed

    Kim, Hyeongmin; Park, Chul Hwan; Han, Kyung Hwa; Kim, Tae Hoon

    2015-12-01

    We evaluated the feasibility of using quantitatively measured thoracic components, as compared to body mass index (BMI), for predicting the image noise of coronary computed tomography angiography (CCTA). One hundred subjects (M:F = 64:36; mean age, 55 ± 8.8 years) who underwent prospective electrocardiography-gated CCTA and low-dose chest computed tomography (CT) were analyzed retrospectively. The image noise of the CCTA was determined by the standard deviation of the attenuation value in a region of interest on the aortic root level. On the low-dose chest CT, the areas of the thoracic components were measured at the aortic root level. An auto-segmentation technique with the following threshold levels was used: quantitatively measured area of total thorax [QMAtotal: -910 to 1000 Hounsfield units (HU)], lung (QMAlung: -910 to -200 HU), fat (QMAfat: -200 to 0 HU), muscle (QMAmuscle: 0-300 HU), soft tissue (fat + muscle, QMAsoft tissue: -200 to 300 HU), bone (QMAbone: 300-1000 HU) and solid tissue (fat + muscle + bone, QMAsolid tissue: -200 to 1000 HU). The relationship between image noise and variable biometric parameters including QMA was analyzed, and the linear correlation coefficients were used as indicators of the strength of association. Among the variable biometric parameters, including BMI, QMAsolid tissue showed the highest correlation coefficient with image noise in all subjects (r = 0.804), males (r = 0.716), females (r = 0.889), the overweight (r = 0.556), and the non-overweight subgroups (r = 0.783). QMAsolid tissue can be used as a potential surrogate predictor of the image noise level in low tube voltage CCTA. PMID:26507324
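
    The HU-window segmentation behind the QMA values, and the noise measurement itself, can be illustrated with a short sketch (hypothetical Python; the windows are taken from the abstract, while the function names and inputs are invented):

        import numpy as np

        # HU windows quoted above as (lower, upper) bounds, in Hounsfield units.
        HU_WINDOWS = {
            "total":        (-910, 1000),
            "lung":         (-910, -200),
            "fat":          (-200, 0),
            "muscle":       (0, 300),
            "soft_tissue":  (-200, 300),
            "bone":         (300, 1000),
            "solid_tissue": (-200, 1000),
        }

        def quantitative_areas(slice_hu, pixel_area_cm2):
            """Area of each thoracic component on one axial slice, in cm^2."""
            return {name: np.count_nonzero((slice_hu >= lo) & (slice_hu <= hi)) * pixel_area_cm2
                    for name, (lo, hi) in HU_WINDOWS.items()}

        def image_noise(slice_hu, roi_mask):
            """Image noise as the standard deviation of HU inside an ROI at the aortic root."""
            return float(np.std(slice_hu[roi_mask]))

    Correlating the per-patient solid-tissue area against this noise figure is then an ordinary linear-correlation exercise.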

  11. Causality and headache triggers

    PubMed Central

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods A synthesis and application of Rubin’s Causal Model is applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  12. AMY trigger system

    SciTech Connect

    Sakai, Yoshihide

    1989-04-01

    A trigger system of the AMY detector at the TRISTAN e+e- collider is described briefly. The system uses a simple track-segment and shower-cluster counting scheme to classify events to be triggered. It has been operating successfully since 1987.

  13. Data Acquisition Systems

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Technology developed during a joint research program with Langley and Kinetic Systems Corporation led to Kinetic Systems' production of a high speed Computer Automated Measurement and Control (CAMAC) data acquisition system. The study, which involved the use of CAMAC equipment applied to flight simulation, significantly improved the company's technical capability and produced new applications. With Digital Equipment Corporation, Kinetic Systems is marketing the system to government and private companies for flight simulation, fusion research, turbine testing, steelmaking, etc.

  14. Effective rainfall: a significant parameter to improve understanding of deep-seated rainfall triggering landslide - a simple computation temperature based method applied to Séchilienne unstable slope (French Alps)

    NASA Astrophysics Data System (ADS)

    Vallet, A.; Bertrand, C.; Mudry, J.

    2013-07-01

    Pore water pressure, built up by recharge of hydrosystems, is one of the main triggering factors of deep-seated landslides. Effective rainfall, the part of the rainfall which recharges the aquifer, is therefore a significant parameter. A soil-water balance is an accurate way to estimate effective rainfall. Nevertheless, this approach requires characterization of evapotranspiration, soil water storage and runoff. Available soil storage and runoff were deduced from field observations, whereas evapotranspiration computation is a demanding method requiring significant input of meteorological data. Most landslide sites operate weather stations with limited datasets. A workflow method was developed to compute effective rainfall requiring only temperature and rainfall as inputs. Two solar radiation and five commonly used evapotranspiration equations were tested at Séchilienne. The method was developed to be as general as possible in order to be applicable to other landslides. This study demonstrated that, for the Séchilienne unstable slope, the correlation with displacement data (coefficient of determination) is significantly enhanced with effective rainfall (0.633) compared to results obtained with raw rainfall data (0.436). The proposed method for estimation of effective rainfall was developed to be sufficiently simple to be used by any non-hydro specialist who intends to characterize the relationship of rainfall to landslide displacements.
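
    The soil-water-balance idea can be reduced to a minimal daily bucket model (an illustrative sketch only; the paper's workflow additionally derives evapotranspiration from temperature-based equations and calibrates the recharge-area parameters from field observations):

        def effective_rainfall(rain, pet, awc=100.0, runoff_coeff=0.1):
            """Daily bucket model: a fixed fraction of rain is lost to runoff,
            evapotranspiration (pet, mm/day) drains the soil store, and recharge
            is the surplus once the available water capacity (awc, mm) is full."""
            store, recharge = 0.0, []
            for p, et in zip(rain, pet):
                infiltration = p * (1.0 - runoff_coeff)
                store = max(store + infiltration - et, 0.0)
                surplus = max(store - awc, 0.0)          # surplus becomes recharge
                store = min(store, awc)
                recharge.append(surplus)
            return recharge

    The awc and runoff_coeff values above are placeholders; in the study they are site-specific quantities deduced from observations of the Séchilienne slope.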

  15. Feature Acquisition with Imbalanced Training Data

    NASA Technical Reports Server (NTRS)

    Thompson, David R.; Wagstaff, Kiri L.; Majid, Walid A.; Jones, Dayton L.

    2011-01-01

    This work considers cost-sensitive feature acquisition that attempts to classify a candidate datapoint from incomplete information. In this task, an agent acquires features of the datapoint using one or more costly diagnostic tests, and eventually ascribes a classification label. A cost function describes both the penalties for feature acquisition, as well as misclassification errors. A common solution is a Cost Sensitive Decision Tree (CSDT), a branching sequence of tests with features acquired at interior decision points and class assignment at the leaves. CSDTs can incorporate a wide range of diagnostic tests and can reflect arbitrary cost structures. They are particularly useful for online applications due to their low computational overhead. In this innovation, CSDTs are applied to cost-sensitive feature acquisition where the goal is to recognize very rare or unique phenomena in real time. Example applications from this domain include four areas. In stream processing, one seeks unique events in a real-time data stream that is too large to store. In fault protection, a system must adapt quickly to react to anticipated errors by triggering repair activities or follow-up diagnostics. With real-time sensor networks, one seeks to classify unique, new events as they occur. With observational sciences, a new generation of instrumentation seeks unique events through online analysis of large observational datasets. This work presents a solution based on transfer learning principles that permits principled CSDT learning while exploiting any prior knowledge of the designer to correct both between-class and within-class imbalance. Training examples are adaptively reweighted based on a decomposition of the data attributes. The result is a new, nonparametric representation that matches the anticipated attribute distribution for the target events.
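
    A generic stand-in for the reweighting idea (assuming scikit-learn is available; this is not the CSDT implementation described above, and the helper name is invented) weights each training example inversely to its class frequency before fitting a tree:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        def fit_reweighted_tree(X, y, max_depth=4):
            """Counter between-class imbalance by reweighting examples inversely
            to class frequency, then fit an ordinary decision tree."""
            classes, counts = np.unique(y, return_counts=True)
            weight_of = {c: len(y) / (len(classes) * n) for c, n in zip(classes, counts)}
            sample_weight = np.array([weight_of[label] for label in y])
            clf = DecisionTreeClassifier(max_depth=max_depth)
            clf.fit(X, y, sample_weight=sample_weight)
            return clf

    The innovation described above goes further, decomposing the data attributes so that within-class imbalance and anticipated attribute distributions can also be corrected.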

  16. Data Acquisition Systems

    NASA Technical Reports Server (NTRS)

    1994-01-01

    In the mid-1980s, Kinetic Systems and Langley Research Center determined that high speed CAMAC (Computer Automated Measurement and Control) data acquisition systems could significantly improve Langley's ARTS (Advanced Real Time Simulation) system. The ARTS system supports flight simulation R&D, and the CAMAC equipment allowed 32 high performance simulators to be controlled by centrally located host computers. This technology broadened Kinetic Systems' capabilities and led to several commercial applications. One of them is General Atomics' fusion research program. Kinetic Systems equipment allows tokamak data to be acquired four to 15 times more rapidly. Ford Motor company uses the same technology to control and monitor transmission testing facilities.

  17. DSPS in data acquisition

    SciTech Connect

    Kirsch, M.; Haeupke, T.; Oelschlaeger, R.; Struck, B.

    1997-12-31

    Off-the-shelf and customized DSP boards in different bus architectures are perfectly suited to act as building blocks for flexible and high-performance data acquisition systems. Due to their architecture they can be used to enhance the performance of existing equipment as add-ons, as state-of-the-art readout controllers, event builders, on-the-fly data formatters and higher-level trigger processors. Applications covering the above-mentioned fields with Motorola's 96002 HARC DSP in the DESY HERMES and H1 experiments, at the focal plane polarimeter at KVI and the NIST high-flux neutron backscattering spectrometer will be presented. Future possibilities with VME, PCI and PMC boards based on the Analog Devices SHARC DSP will be discussed. Systems based on the Texas Instruments TMS320C6X promise to provide unprecedented performance.

  18. LHCb Topological Trigger Reoptimization

    NASA Astrophysics Data System (ADS)

    Likhomanenko, Tatiana; Ilten, Philip; Khairullin, Egor; Rogozhnikov, Alex; Ustyuzhanin, Andrey; Williams, Michael

    2015-12-01

    The main b-physics trigger algorithm used by the LHCb experiment is the so-called topological trigger. The topological trigger selects vertices which are (a) detached from the primary proton-proton collision and (b) compatible with coming from the decay of a b-hadron. In LHC Run 1, this trigger, which utilized a custom boosted decision tree algorithm, selected a nearly 100% pure sample of b-hadrons with a typical efficiency of 60-70%; its output was used in about 60% of LHCb papers. This talk presents studies carried out to optimize the topological trigger for LHC Run 2. In particular, we have carried out a detailed comparison of various machine learning classifier algorithms, e.g., AdaBoost, MatrixNet and neural networks. The topological trigger algorithm is designed to select all 'interesting' decays of b-hadrons, but cannot be trained on every such decay. Studies have therefore been performed to determine how to optimize the performance of the classification algorithm on decays not used in the training. Methods studied include cascading, ensembling and blending techniques. Furthermore, novel boosting techniques have been implemented that will help reduce systematic uncertainties in Run 2 measurements. We demonstrate that the reoptimized topological trigger is expected to significantly improve on the Run 1 performance for a wide range of b-hadron decays.

  19. HERA-B higher-level triggers: architecture and software

    NASA Astrophysics Data System (ADS)

    Gellrich, Andreas; Medinnis, Mike

    1998-02-01

    HERA-B will be studying CP-violation in the B-system in a high-rate hadronic environment. To accomplish this goal, HERA-B needs a sophisticated data acquisition and trigger system. Except for the first level, all trigger levels are implemented as PC-farms, running the Unix-like operating system, Linux, thus blurring the sharp border between online and offline application software. The hardware architecture and software environments are discussed.

  20. Technical evaluation of different respiratory monitoring systems used for 4D CT acquisition under free breathing.

    PubMed

    Heinz, Christian; Reiner, Michael; Belka, Claus; Walter, Franziska; Söhn, Matthias

    2015-01-01

    Respiratory monitoring systems are required to supply CT scanners with information on the patient's breathing during the acquisition of a respiration-correlated computed tomography (RCCT), also referred to as 4D CT. The information a respiratory monitoring system has to provide to the CT scanner depends on the specific scanner. The purpose of this study is to compare two different respiratory monitoring systems (Anzai Respiratory Gating System; C-RAD Sentinel) with respect to their applicability in combination with an Aquilion Large Bore CT scanner from Toshiba. The scanner used in our clinic does not make use of the full time-dependent breathing signal, but only single trigger pulses indicating the beginning of a new breathing cycle. Hence the attached respiratory monitoring system is expected to deliver accurate online trigger pulses for each breathing cycle. The accuracy of the trigger pulses sent to the CT scanner has to be ensured by the selected respiratory monitoring system. Since a trigger pulse (output signal) of a respiratory monitoring system is a function of the measured breathing signal (input signal), the typical clinical range of the input signal is estimated for both examined respiratory monitoring systems. Both systems are analyzed based on the following parameters: time resolution, signal amplitude, noise, signal-to-noise ratio (SNR), signal linearity, trigger compatibility, and clinical examples. The Anzai system shows a better SNR (≥ 28 dB) than the Sentinel system (≥ 14.6 dB). In terms of compatibility with the cycle-based image sorting algorithm of the Toshiba CT scanner, the Anzai system benefits from the possibility to generate cycle-based triggers, whereas the Sentinel system is only able to generate amplitude-based triggers. In clinical practice, the combination of a Toshiba CT scanner and the Anzai system will provide better results due to the compatibility of the image sorting and trigger release methods. PMID:26103168
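
    The SNR figures quoted above in decibels follow the usual ratio of signal to noise amplitude; a minimal sketch of one common estimator (the paper's exact estimator is not specified here) is:

        import numpy as np

        def snr_db(signal, noise):
            """Signal-to-noise ratio in dB from the RMS of a breathing signal
            and the RMS of its noise component (generic definition)."""
            rms = lambda x: np.sqrt(np.mean(np.square(x)))
            return 20.0 * np.log10(rms(signal) / rms(noise))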

  1. RECENT DEVELOPMENTS IN ALTERNATIVES TO CAMAC FOR DATA ACQUISITION AT DIII-D

    SciTech Connect

    KELLMAN,D.H; CAMPBELL,G.L; FERRON,J.R; PIGLOWSKI,D.A; AUSTIN,M.E; MCKEE,G.R

    2003-10-01

    For over twenty years, data acquisition hardware at DIII-D has been based on the CAMAC platform. These rugged and reliable systems, however, are gradually becoming obsolete due to end-of-life issues, ever-decreasing industry support of older hardware, and the availability of modern alternative hardware with superior performance. Efforts are underway at DIII-D to adopt new data acquisition solutions which exploit modern technologies and surpass the limitations of the CAMAC standard. These efforts have involved the procurement and development of data acquisition systems based on the PCI and CompactPCI platform standards. These systems are comprised of rack-mount computers containing data acquisition boards (digitizers), Ethernet connectivity, and the drivers and software necessary for control. Each digitizer contains analog-to-digital converters, control circuitry, firmware and memory to collect, store, and transfer waveform data acquired using internal or external triggers and clocks. Software has been developed which allows DIII-D computers to program the operational parameters of the digitizers, as well as to upload acquired data into the DIII-D acquisition database. All communication between host computers and the new acquisition systems occurs via standard Ethernet connections, a vast improvement over the slower, serial loop highways used for control and data transfer with CAMAC systems. In addition, the capabilities available in modern integrated and printed circuit manufacture result in digitizers with high channel count and memory density. Cost savings are also realized by utilizing a platform based on standards of the personal computer industry. Details of the new systems at DIII-D are presented, along with initial experience with their use, and plans for future expansion and improvement.

  2. Common Asthma Triggers

    MedlinePlus

    ... your bedding on the hottest water setting. Outdoor Air Pollution Outdoor air pollution can trigger an asthma attack. This pollution can ... your newspaper to plan your activities for when air pollution levels will be low. Cockroach Allergen Cockroaches and ...

  3. Dealing with Asthma Triggers

    MedlinePlus

    ... smell given off by paint or gas, and air pollution. If you notice that an irritant triggers your ... or other tobacco products around you. If outdoor air pollution is a problem, running the air conditioner or ...

  4. ELECTRONIC TRIGGER CIRCUIT

    DOEpatents

    Russell, J.A.G.

    1958-01-01

    An electronic trigger circuit is described of the type where an output pulse is obtained only after an input voltage has equaled or exceeded a selected reference voltage. In general, the invention comprises a source of direct current reference voltage in series with an impedance and a diode rectifying element. An input pulse of preselected amplitude causes the diode to conduct and develop a signal across the impedance. The signal is delivered to an amplifier where an output pulse is produced and part of the output is fed back in a positive manner to the diode so that the amplifier produces a steep wave-front trigger pulse at the output. The trigger point of the described circuit is not subject to variation due to the aging, etc., of multi-electrode tubes, since the diode circuit essentially determines the trigger point.

  5. Dynamic Triggering Stress Modeling

    NASA Astrophysics Data System (ADS)

    Gonzalez-Huizar, H.; Velasco, A. A.

    2008-12-01

    It has been well established that static (permanent) stress changes can trigger nearby earthquakes, within a few fault lengths from the causative event, whereas triggering by dynamic (transient) stresses carried by seismic waves, both nearby and at remote distances, has not been as well documented or understood. An analysis of the change in the local stress caused by the passing of surface waves is important for the understanding of this phenomenon. In this study, we modeled the change in stress that the passing of Rayleigh and Love waves causes on a fault plane of arbitrary orientation, and applied a Coulomb failure criterion to calculate the potential of these stress changes to trigger reverse, normal or strike-slip failure. We preliminarily test these model results with data from dynamically triggered earthquakes in the Australian Bowen Basin. In the Bowen region, the modeling predicts a maximum triggering potential for Rayleigh waves arriving perpendicularly to the strike of the reverse faults present in the region. The modeled potentials agree with our observations, and give us an understanding of the dynamic stress orientation needed to trigger different types of earthquakes.
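
    The Coulomb failure criterion used in such modeling can be written down compactly; the following is a generic sketch (assumed friction coefficient and sign convention, not the authors' code):

        def coulomb_stress_change(d_shear, d_normal, mu=0.4):
            """Change in Coulomb failure stress on a fault plane (MPa).
            d_shear : change in shear stress in the slip direction (positive promotes slip)
            d_normal: change in normal stress (positive = unclamping in this convention)
            mu      : effective friction coefficient (assumed value)."""
            return d_shear + mu * d_normal

        # Example: a passing surface wave adding 0.02 MPa of shear stress and
        # unclamping the fault by 0.01 MPa raises the Coulomb stress by 0.024 MPa.
        print(coulomb_stress_change(0.02, 0.01))

    Failure is promoted where the change is positive and exceeds some small threshold, which is why the triggering potential depends so strongly on the wave's arrival direction relative to the fault strike.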

  6. Data acquisition for the CDF SVX II upgrade

    SciTech Connect

    Gold, M.; CDF SVX II Group

    1994-09-01

    CDF is developing a second generation silicon vertex detector for the Fermilab Tevatron Run II. In order to exploit the high luminosity of the Fermilab Main Injector, the data acquisition system is being designed to accept high trigger rates and accommodate a secondary vertex trigger.

  7. Data acquisition instruments: Psychopharmacology

    SciTech Connect

    Hartley, D.S. III

    1998-01-01

    This report contains the results of a Direct Assistance Project performed by Lockheed Martin Energy Systems, Inc., for Dr. K. O. Jobson. The purpose of the project was to perform preliminary analysis of the data acquisition instruments used in the field of psychiatry, with the goal of identifying commonalities of data and strategies for handling and using the data in the most advantageous fashion. Data acquisition instruments from 12 sources were provided by Dr. Jobson. Several commonalities were identified and a potentially useful data strategy is reported here. Analysis of the information collected for utility in performing diagnoses is recommended. In addition, further work is recommended to refine the commonalities into a directly useful computer systems structure.

  8. An efficient workflow to accurately compute groundwater recharge for the study of rainfall-triggered deep-seated landslides, application to the Séchilienne unstable slope (western Alps)

    NASA Astrophysics Data System (ADS)

    Vallet, A.; Bertrand, C.; Fabbri, O.; Mudry, J.

    2015-01-01

    Pore water pressure build-up by recharge of underground hydrosystems is one of the main triggering factors of deep-seated landslides. In most deep-seated landslides, pore water pressure data are not available since piezometers, if any, have a very short lifespan because of slope movements. As a consequence, indirect parameters, such as the calculated recharge, are the only data which enable understanding landslide hydrodynamic behaviour. However, in landslide studies, methods and recharge-area parameters used to determine the groundwater recharge are rarely detailed. In this study, the groundwater recharge is estimated with a soil-water balance based on characterisation of evapotranspiration and parameters characterising the recharge area (soil available water capacity, runoff and vegetation coefficient). A workflow to compute daily groundwater recharge is developed. This workflow requires the records of precipitation, air temperature, relative humidity, solar radiation and wind speed within or close to the landslide area. The determination of the parameters of the recharge area is based on a spatial analysis requiring field observations and spatial data sets (digital elevation models, aerial photographs and geological maps). This study demonstrates that the performance of the correlation with landslide displacement velocity data is significantly improved using the recharge estimated with the proposed workflow. The coefficient of determination obtained with the recharge estimated with the proposed workflow is 78% higher on average than that obtained with precipitation, and is 38% higher on average than that obtained with recharge computed with a commonly used simplification in landslide studies (recharge = precipitation minus non-calibrated evapotranspiration method).

  9. Trigger mechanism for engines

    SciTech Connect

    Clark, L.R.

    1989-02-28

    A trigger mechanism is described for a blower-vacuum apparatus having a trigger mounted within a handle and a small engine, comprising: a throttle; an ''L''-shaped lever having first and second legs, mounted for rotation about an intermediate pivot within the handle when the trigger is depressed, interconnecting the trigger and the throttle, the second leg having first teeth defined therein, the lever further having idle, full-throttle and stop positions; and a normally raised latch means adapted to be rotated and axially depressed, the latch means having second teeth situated on a cam to engage the first teeth for holding the lever in an intermediate position between the idle and full-throttle positions when the latch means is rotated. The latch means further cams its teeth into potential engagement with the lever teeth when the trigger is depressed; the lever is biased toward the stop position; and an idle adjusting means intercepts the second leg, preventing the second leg from reaching the stop position when the latch means is raised.

  10. Cygnus Trigger System

    SciTech Connect

    G. Corrow, M. Hansen, D. Henderson, C. Mitton

    2008-02-01

    The Cygnus Dual Beam Radiographic Facility consists of two radiographic sources (Cygnus 1, Cygnus 2), each with a dose rating of 4 rads at 1 m and a 1-mm diameter spot size. The electrical specifications are: 2.25 MV, 60 kA, 60 ns. This facility is located in an underground environment at the Nevada Test Site (NTS). These sources were developed as a primary diagnostic for subcritical tests, which are single-shot, high-value events. In such an application there is an emphasis on reliability and reproducibility. A robust, low-jitter trigger system is a key element for meeting these goals. The trigger system was developed with both commercial and project-specific equipment. In addition to the traditional functions of a trigger system there are novel features added to protect the investment of a high-value shot. Details of the trigger system, including elements designed specifically for a subcritical test application, will be presented. Each electronic component has a nominal throughput delay, and the assembled system exhibits a measured range of shot-to-shot jitter; this jitter is assessed both for the individual components and for the system in combination. Trigger reliability and reproducibility results will be presented for a substantial number of shots executed at the NTS.

  11. A high-speed transputer-based data acquisition system

    NASA Astrophysics Data System (ADS)

    Loureiro, C. F. M.; Santos, J. M. G. B.; Simões, J. B.; Correia, C. M. B. A.; Zilker, M.

    1996-01-01

    A 250 MHz 8-bit transputer-based data acquisition VME bus module is described. This module has been designed as the acquisition node of a transputer-based real-time processing and data reduction system for the reflectometry diagnostic in the ASDEX Upgrade tokamak experiment. The architecture of the board is detailed, emphasizing the advantages of using recently delivered devices, like fast synchronous FIFOs, in a mixed ECL/TTL data acquisition architecture. It is shown that the implemented architecture leads naturally to the implementation of hardware triggers that allow the acquisition channels to operate as stand-alone modules in a self-triggered, self-timed, data acquisition mode. The advantages of using transputers as local control and processing units are discussed. The use of the board in the reflectometry diagnostic and the general processing goals of the system are presented together with data characterizing the performance of the acquisition channels.

  12. Operation and modeling of the FORTE trigger box

    SciTech Connect

    Murphy, T.

    1996-06-01

    The Fast On-orbit Recording of Transient Events (FORTE) satellite will carry a multiple-narrow-band trigger designed to detect impulsive VHF signals embedded in a high-noise background. The FORTE trigger boxes consist of eight VHF channels spaced across twenty MHz of bandwidth. A trigger is generated when a sufficiently bright signal is seen in a user-defined number of these channels within a specified coincidence window. In addition, the trigger circuitry incorporates a feature to reject events caused by the actuation of narrow-band carriers. This report describes the trigger's operating principles and their implementation in the satellite hardware. We then discuss a computer model which can be used to simulate the performance of the trigger circuit.
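
    The N-of-8 coincidence logic can be pictured with a small sketch (hypothetical Python with assumed parameter values, not the flight hardware or the report's model):

        def coincidence_trigger(hit_times, n_required=5, window=10e-6):
            """Return times at which at least n_required channels fired within
            a coincidence window. hit_times is a list of per-channel lists of
            hit times in seconds."""
            # Merge all hits, remembering which channel produced each one.
            hits = sorted((t, ch) for ch, times in enumerate(hit_times) for t in times)
            triggers = []
            for i, (t0, _) in enumerate(hits):
                chans = {ch for t, ch in hits[i:] if t - t0 <= window}
                if len(chans) >= n_required:
                    triggers.append(t0)
            return triggers

    The actual circuit also rejects events caused by the actuation of narrow-band carriers, a veto this sketch does not attempt to model.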

  13. Microfabricated triggered vacuum switch

    DOEpatents

    Roesler, Alexander W.; Schare, Joshua M.; Bunch, Kyle

    2010-05-11

    A microfabricated vacuum switch is disclosed which includes a substrate upon which an anode, cathode and trigger electrode are located. A cover is sealed over the substrate under vacuum to complete the vacuum switch. In some embodiments of the present invention, a metal cover can be used in place of the trigger electrode on the substrate. Materials used for the vacuum switch are compatible with high vacuum, relatively high temperature processing. These materials include molybdenum, niobium, copper, tungsten, aluminum and alloys thereof for the anode and cathode. Carbon in the form of graphitic carbon, a diamond-like material, or carbon nanotubes can be used in the trigger electrode. Channels can be optionally formed in the substrate to mitigate against surface breakdown.

  14. Video Event Trigger

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.; Lichter, Michael J.

    1994-01-01

    The video event trigger (VET) processes video image data to generate a trigger signal when an image shows a significant change such as motion, appearance, disappearance, change in color, change in brightness, or dilation of an object. The system aids in efficient utilization of image-data-storage and image-data-processing equipment in applications in which many video frames show no changes, where it would be wasteful to record and analyze all frames when only relatively few frames show changes of interest. Applications include video recording of automobile crash tests and automated video monitoring of entrances, exits, parking lots, and secure areas.
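
    A minimal frame-differencing version of such a trigger (an illustrative sketch with assumed thresholds, not the VET design itself) can be written as:

        import numpy as np

        def video_event_trigger(prev_frame, frame, pixel_thresh=15, count_thresh=500):
            """Return True when enough pixels change between consecutive 8-bit
            frames to indicate motion, appearance/disappearance, or a change
            in brightness."""
            diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
            changed = np.count_nonzero(diff > pixel_thresh)
            return changed > count_thresh

    Frames for which the function returns False would simply not be recorded, which is the storage saving described above.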

  15. Upgrade of the trigger system of CMS

    NASA Astrophysics Data System (ADS)

    Jeitler, Manfred; CMS Collaboration

    2013-08-01

    Various parts of the CMS trigger and in particular the Level-1 hardware trigger will be upgraded to cope with increasing luminosity, using more selective trigger conditions at Level 1 and improving the reliability of the system. Many trigger subsystems use FPGAs (Field Programmable Gate Arrays) in the electronics and will benefit from developments in this technology, allowing us to place much more logic into a single FPGA chip, thus reducing the number of chips, electronic boards and interconnections and in this way improving reliability. A number of subsystems plan to switch from the old VME bus to the new microTCA crate standard. Using similar approaches, identical modules and common software wherever possible will reduce costs and manpower requirements and improve the serviceability of the whole trigger system. The computer-farm based High-Level Trigger will not only be extended by using increasing numbers of more powerful PCs but there are also concepts for making it more robust and the software easier to maintain, which will result in better efficiency of the whole system.

  16. Early object rule acquisition.

    PubMed

    Pierce, D E

    1991-05-01

    The purpose of this study was to generate a grounded theory of early object rule acquisition. The grounded theory approach and computer coding were used to analyze videotaped samples of an infant's and a toddler's independent object play, which produced categories descriptive of three primary types of object rules: rules of object properties, rules of object action, and rules of object affect. This occupational science theory offers potential for understanding the role of objects in human occupations, for the development of instruments, and for applications in occupational therapy early intervention. PMID:2048625

  17. New methods for trigger electronics development

    SciTech Connect

    Cleland, W.E.; Stern, E.G.

    1991-12-31

    The large and complex nature of RHIC experiments and the tight time schedule for their construction require that new techniques for designing the electronics be employed. This is particularly true of the trigger and data acquisition electronics, which have to be ready for turn-on of the experiment. We describe the use of the Workview package from VIEWlogic Inc. for design, simulation, and verification of a flash ADC readout system. We also show how field-programmable gate arrays such as the Xilinx 4000 might be employed to construct or prototype circuits with a large number of gates while preserving flexibility.

  18. Data acquisition system for SLD

    SciTech Connect

    Sherden, D.J.

    1985-05-01

    This paper describes the data acquisition system planned for the SLD detector, which is being constructed for use with the SLAC Linear Collider (SLC). An exclusively FASTBUS front-end system is used together with a VAX-based host system. While the volume of data transferred does not challenge the bandwidth capabilities of FASTBUS, extensive use is made of the parallel processing capabilities allowed by FASTBUS to reduce the data to a size which can be handled by the host system. The low repetition rate of the SLC allows a relatively simple software-based trigger. The principal components and overall architecture of the hardware and software are described.

  19. Disambiguating Syntactic Triggers

    ERIC Educational Resources Information Center

    Sakas, William Gregory; Fodor, Janet Dean

    2012-01-01

    We present data from an artificial language domain that suggest new contributions to the theory of syntactic triggers. Whether a learning algorithm is capable of matching the achievements of child learners depends in part on how much parametric ambiguity there is in the input. For practical reasons this cannot be established for the domain of all…

  20. Triggered plasma opening switch

    SciTech Connect

    Mendel, C W

    1988-02-23

    A triggerable opening switch for a very high voltage and current pulse includes a transmission line extending from a source to a load and having an intermediate switch section including a plasma for conducting electrons between transmission line conductors and a magnetic field for breaking the plasma conduction path and magnetically insulating the electrons when it is desired to open the switch.

  1. Triggered plasma opening switch

    DOEpatents

    Mendel, Clifford W.

    1988-01-01

    A triggerable opening switch for a very high voltage and current pulse includes a transmission line extending from a source to a load and having an intermediate switch section including a plasma for conducting electrons between transmission line conductors and a magnetic field for breaking the plasma conduction path and magnetically insulating the electrons when it is desired to open the switch.

  2. Acquisition and Applications of Three Microcomputers by the Department of Secondary Education. The Illinois Series on Educational Application of Computers. Number 29.

    ERIC Educational Resources Information Center

    Case, Jeff; And Others

    This report describes the experience of the Department of Secondary Education in acquiring and applying three microcomputers: MSEA, (Micro-Computer System for Educational Applications), North Star HORIZON, and VECTOR. MSEA was designed as a central facility that would be capable of providing all types of educational computing research and service,…

  3. Microcomputer data acquisition and control.

    PubMed

    East, T D

    1986-01-01

    In medicine and biology there are many tasks that involve routine well defined procedures. These tasks are ideal candidates for computerized data acquisition and control. As the performance of microcomputers rapidly increases and cost continues to go down the temptation to automate the laboratory becomes great. To the novice computer user the choices of hardware and software are overwhelming and sadly most of the computer sales persons are not at all familiar with real-time applications. If you want to bill your patients you have hundreds of packaged systems to choose from; however, if you want to do real-time data acquisition the choices are very limited and confusing. The purpose of this chapter is to provide the novice computer user with the basics needed to set up a real-time data acquisition system with the common microcomputers. This chapter will cover the following issues necessary to establish a real time data acquisition and control system: Analysis of the research problem: Definition of the problem; Description of data and sampling requirements; Cost/benefit analysis. Choice of Microcomputer hardware and software: Choice of microprocessor and bus structure; Choice of operating system; Choice of layered software. Digital Data Acquisition: Parallel Data Transmission; Serial Data Transmission; Hardware and software available. Analog Data Acquisition: Description of amplitude and frequency characteristics of the input signals; Sampling theorem; Specification of the analog to digital converter; Hardware and software available; Interface to the microcomputer. Microcomputer Control: Analog output; Digital output; Closed-Loop Control. Microcomputer data acquisition and control in the 21st Century--What is in the future? High speed digital medical equipment networks; Medical decision making and artificial intelligence. PMID:3805859
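
    The sampling and sizing questions raised in this chapter lend themselves to a quick back-of-the-envelope check. The sketch below is a minimal illustration of that reasoning; the signal bandwidth, sampling rate, ADC resolution and channel count are hypothetical values chosen for the example, not figures from the chapter.

    ```python
    # Minimal sizing sketch for an analog data acquisition setup.
    # All numbers below are illustrative assumptions, not values from the chapter.

    signal_bandwidth_hz = 100.0   # highest frequency of interest in the input signal
    sampling_rate_hz = 500.0      # proposed per-channel ADC sampling rate
    adc_bits = 12                 # ADC resolution
    n_channels = 8                # number of analog inputs

    # Sampling theorem: the sampling rate must exceed twice the signal bandwidth.
    print("Nyquist criterion satisfied:", sampling_rate_hz > 2.0 * signal_bandwidth_hz)

    # Amplitude resolution for an assumed +/-10 V input range.
    full_scale_v = 20.0
    print("Smallest resolvable step: %.2f mV" % (full_scale_v / 2**adc_bits * 1e3))

    # Raw data rate, assuming each sample is stored as 2 bytes.
    bytes_per_second = sampling_rate_hz * n_channels * 2
    print("Storage throughput: %.1f KiB/s" % (bytes_per_second / 1024))
    ```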

  4. A programmable systolic trigger processor for FERA bus data

    NASA Astrophysics Data System (ADS)

    Appelquist, G.; Hovander, B.; Selldén, B.; Bohm, C.

    1993-04-01

    A generic CAMAC based trigger processor module for fast processing of large amounts of ADC data has been designed. This module has been realised using complex programmable gate arrays (LCAs from XILINX). The gate arrays have been connected to memories and multipliers in such a way that different gate array configurations can cover a wide range of module applications. Using this module, it is possible to construct complex trigger processors. The module uses both the fast ECL FERA bus and the CAMAC bus for inputs and outputs. The latter, however, is primarily used for setup and control but may also be used for data output. Large numbers of ADCs can be served by a hierarchical arrangement of trigger processor modules, processing ADC data with pipeline arithmetic, producing the final result at the apex of the pyramid. The trigger decision will be transmitted to the data acquisition system via a logic signal while numeric results may be extracted by the CAMAC controller. The trigger processor was originally developed for the proposed neutral particle search experiment at CERN, NUMASS. There it was designed to serve as a second-level trigger processor. It was required to correct all ADC raw data for efficiency and pedestal, calculate the total calorimeter energy, obtain the optimal time of flight data and calculate the particle mass. A suitable mass cut would then deliver the trigger decision. More complex triggers were also considered.
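
    The second-level decision described here (pedestal and gain correction of the raw ADC data, a calorimeter energy sum, a time-of-flight velocity, and a mass cut) maps naturally onto a short routine. The sketch below is only an illustration of that processing chain; the calibration constants, flight path and mass window are invented placeholders, not NUMASS parameters.

    ```python
    import math

    def second_level_trigger(adc_counts, pedestals, gains, tof_ns,
                             flight_path_m=10.0, mass_window_gev=(0.8, 1.2)):
        """Sketch of the second-level decision described above: correct raw ADC
        data, sum the calorimeter energy, combine it with time of flight into a
        mass estimate, and apply a mass cut. All constants are illustrative."""
        # Pedestal subtraction and per-channel gain (efficiency) correction.
        energies = [(c - p) * g for c, p, g in zip(adc_counts, pedestals, gains)]
        total_energy_gev = sum(max(e, 0.0) for e in energies)

        # Velocity from time of flight over the assumed flight path.
        beta = (flight_path_m / (tof_ns * 1e-9)) / 2.998e8
        if not 0.0 < beta < 1.0:
            return False, None
        gamma = 1.0 / math.sqrt(1.0 - beta * beta)

        # Relativistic mass estimate from total energy: E = gamma * m (natural units).
        mass_gev = total_energy_gev / gamma
        accept = mass_window_gev[0] <= mass_gev <= mass_window_gev[1]
        return accept, mass_gev

    # Example: one event with four calorimeter channels (all values invented).
    accept, mass = second_level_trigger(
        adc_counts=[512, 480, 30, 25], pedestals=[20, 18, 22, 19],
        gains=[0.004, 0.004, 0.004, 0.004], tof_ns=40.0)
    print(accept, mass)
    ```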

  5. Use of GPUs in Trigger Systems

    NASA Astrophysics Data System (ADS)

    Lamanna, Gianluca

    In recent years the interest in using graphics processors (GPUs) in general-purpose high-performance computing has been constantly rising. In this paper we discuss the possible use of GPUs to construct a fast and effective real-time trigger system, at both the software and hardware levels. In particular, we study the integration of such a system in the NA62 trigger. The first application of GPUs to ring pattern recognition in the RICH is presented. The results obtained show that there are no showstoppers to using GPUs in trigger systems with relatively low latency. Thanks to the use of off-the-shelf technology, in continuous development for the video game and image processing markets, the architecture described could easily be exported to other experiments to build a versatile and fully customizable online selection.

  6. Optically triggered infrared photodetector.

    PubMed

    Ramiro, Íñigo; Martí, Antonio; Antolín, Elisa; López, Esther; Datas, Alejandro; Luque, Antonio; Ripalda, José M; González, Yolanda

    2015-01-14

    We demonstrate a new class of semiconductor device: the optically triggered infrared photodetector (OTIP). This photodetector is based on a new physical principle that allows the detection of infrared light to be switched ON and OFF by means of an external light. Our experimental device, fabricated using InAs/AlGaAs quantum-dot technology, demonstrates normal incidence infrared detection in the 2-6 μm range. The detection is optically triggered by a 590 nm light-emitting diode. Furthermore, the detection gain is achieved in our device without an increase of the noise level. The novel characteristics of OTIPs open up new possibilities for third generation infrared imaging systems ( Rogalski, A.; Antoszewski, J.; Faraone, L. J. Appl. Phys. 2009, 105 (9), 091101). PMID:25490236

  7. Syntax acquisition.

    PubMed

    Crain, Stephen; Thornton, Rosalind

    2012-03-01

    Every normal child acquires a language in just a few years. By 3- or 4-years-old, children have effectively become adults in their abilities to produce and understand endlessly many sentences in a variety of conversational contexts. There are two alternative accounts of the course of children's language development. These different perspectives can be traced back to the nature versus nurture debate about how knowledge is acquired in any cognitive domain. One perspective dates back to Plato's dialog 'The Meno'. In this dialog, the protagonist, Socrates, demonstrates to Meno, an aristocrat in Ancient Greece, that a young slave knows more about geometry than he could have learned from experience. By extension, Plato's Problem refers to any gap between experience and knowledge. How children fill in the gap in the case of language continues to be the subject of much controversy in cognitive science. Any model of language acquisition must address three factors, inter alia: 1. The knowledge children accrue; 2. The input children receive (often called the primary linguistic data); 3. The nonlinguistic capacities of children to form and test generalizations based on the input. According to the famous linguist Noam Chomsky, the main task of linguistics is to explain how children bridge the gap-Chomsky calls it a 'chasm'-between what they come to know about language, and what they could have learned from experience, even given optimistic assumptions about their cognitive abilities. Proponents of the alternative 'nurture' approach accuse nativists like Chomsky of overestimating the complexity of what children learn, underestimating the data children have to work with, and manifesting undue pessimism about children's abilities to extract information based on the input. The modern 'nurture' approach is often referred to as the usage-based account. We discuss the usage-based account first, and then the nativist account. After that, we report and discuss the findings of several

  8. GLAST's GBM Burst Trigger

    NASA Technical Reports Server (NTRS)

    Band, D.; Briggs, M.; Connaughton, V.; Kippen, M.; Preece, R.

    2003-01-01

    The GLAST Burst Monitor (GBM) will detect and localize bursts for the GLAST mission, and provide the spectral and temporal context in the traditional 10 keV to 25 MeV band for the high energy observations by the Large Area Telescope (LAT). The GBM will use traditional rate triggers in up to three energy bands, and on a variety of timescales between 16 ms and 16 s.
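
    A rate trigger of the kind described above compares the counts accumulated on a given timescale with the expectation from the background rate and fires when the excess is significant. The sketch below is a generic illustration of such a multi-timescale rate trigger; the timescales, the 4.5 sigma threshold and the Gaussian significance approximation are assumptions for the example, not the flight algorithm.

    ```python
    import math

    def rate_trigger(counts_per_bin, bin_width_s, background_rate_hz,
                     timescales_s=(0.016, 0.064, 0.256, 1.024, 4.096, 16.384),
                     threshold_sigma=4.5):
        """Slide a window of each timescale over the binned light curve and test
        the counts against the Poisson expectation from the background rate.
        Threshold and timescales are illustrative assumptions."""
        triggers = []
        for window_s in timescales_s:
            nbins = max(1, int(round(window_s / bin_width_s)))
            expected = background_rate_hz * nbins * bin_width_s
            for start in range(0, len(counts_per_bin) - nbins + 1):
                observed = sum(counts_per_bin[start:start + nbins])
                # Gaussian approximation to the Poisson significance of the excess.
                sigma = (observed - expected) / math.sqrt(expected)
                if sigma >= threshold_sigma:
                    triggers.append((start * bin_width_s, window_s, sigma))
                    break  # one trigger per timescale is enough for the sketch
        return triggers

    # Example: a flat ~300 Hz background with a burst injected near t = 1 s.
    bin_width = 0.016
    lightcurve = [5] * 200
    for i in range(62, 70):
        lightcurve[i] += 40          # injected burst counts
    print(rate_trigger(lightcurve, bin_width, background_rate_hz=300.0))
    ```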

  9. GLAST's GBM Burst Trigger

    SciTech Connect

    Band, D.; Kippen, M.

    2004-09-28

    The GLAST Burst Monitor (GBM) will detect and localize bursts for the GLAST mission, and provide the spectral and temporal context in the traditional 10 keV to 25 MeV band for the high energy observations by the Large Area Telescope (LAT). The GBM will use traditional rate triggers in up to three energy bands, and on a variety of timescales between 16 ms and 16 s.

  10. Neural networks for triggering

    SciTech Connect

    Denby, B. ); Campbell, M. ); Bedeschi, F. ); Chriss, N.; Bowers, C. ); Nesti, F. )

    1990-01-01

    Two types of neural network beauty trigger architectures, based on identification of electrons in jets and recognition of secondary vertices, have been simulated in the environment of the Fermilab CDF experiment. The efficiencies for B's and rejection of background obtained are encouraging. If hardware tests are successful, the electron identification architecture will be tested in the 1991 run of CDF. 10 refs., 5 figs., 1 tab.

  11. Isolating Triggered Star Formation

    SciTech Connect

    Barton, Elizabeth J.; Arnold, Jacob A.; Zentner, Andrew R.; Bullock, James S.; Wechsler, Risa H.; /KIPAC, Menlo Park /SLAC

    2007-09-12

    Galaxy pairs provide a potentially powerful means of studying triggered star formation from galaxy interactions. We use a large cosmological N-body simulation coupled with a well-tested semi-analytic substructure model to demonstrate that the majority of galaxies in close pairs reside within cluster or group-size halos and therefore represent a biased population, poorly suited for direct comparison to 'field' galaxies. Thus, the frequent observation that some types of galaxies in pairs have redder colors than 'field' galaxies is primarily a selection effect. We use our simulations to devise a means to select galaxy pairs that are isolated in their dark matter halos with respect to other massive subhalos (N = 2 halos) and to select a control sample of isolated galaxies (N = 1 halos) for comparison. We then apply these selection criteria to a volume-limited subset of the 2dF Galaxy Redshift Survey with M_B,j ≤ -19 and obtain the first clean measure of the typical fraction of galaxies affected by triggered star formation and the average elevation in the star formation rate. We find that 24% (30.5%) of these L* and sub-L* galaxies in isolated 50 (30) h^-1 kpc pairs exhibit star formation that is boosted by a factor of ≳ 5 above their average past value, while only 10% of isolated galaxies in the control sample show this level of enhancement. Thus, 14% (20%) of the galaxies in these close pairs show clear triggered star formation. Our orbit models suggest that 12% (16%) of 50 (30) h^-1 kpc close pairs that are isolated according to our definition have had a close (≤ 30 h^-1 kpc) pass within the last Gyr. Thus, the data are broadly consistent with a scenario in which most or all close passes of isolated pairs result in triggered star formation. The isolation criteria we develop provide a means to constrain star formation and feedback prescriptions in hydrodynamic simulations and a very general method of understanding the importance of

  12. The central trigger control system of the CMS experiment at CERN

    NASA Astrophysics Data System (ADS)

    Jeitler, M.; Taurok, A.; Bergauer, H.; Kastner, K.; Mikulec, I.; Neuherz, B.; Padrta, M.; Sakulin, H.; Strauss, J.; Wulz, C.-E.

    2010-05-01

    The Level-1 (L1) Trigger of the CMS experiment uses custom-made, fast electronics, while the experiment's high-level trigger is implemented in computer farms. The Central Trigger Control System described in this poster receives physics triggers from the Global Trigger Logic unit, collects information from the various subdetector systems to check if they are ready to accept triggers, reduces excessive trigger rates according to preset rules and finally distributes the trigger ("Level-1 Accept") together with timing signals to the subdetectors over the so-called "Timing, Trigger and Control" (TTC) network of the experiment. The complete functionality of the Central Trigger Control System is implemented in one 9U-VME module and several ancillary boards for input and output functions. The system has been used successfully during CMS test runs with cosmics and beam.
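
    The rate reduction "according to preset rules" mentioned above can be illustrated with a simple throttling sketch: a new Level-1 Accept is suppressed if too many accepts have already been issued within a recent window of bunch crossings, and it is only issued when all subdetectors report ready. The rule parameters below are illustrative placeholders, not the actual CMS rule set.

    ```python
    from collections import deque

    class TriggerRule:
        """Suppress a new accept if max_accepts have already been issued in the
        last window_bx bunch crossings. The numbers used below are illustrative."""
        def __init__(self, max_accepts, window_bx):
            self.max_accepts = max_accepts
            self.window_bx = window_bx
            self.recent = deque()        # bunch-crossing numbers of recent accepts

        def allows(self, bx):
            while self.recent and bx - self.recent[0] >= self.window_bx:
                self.recent.popleft()
            return len(self.recent) < self.max_accepts

        def record(self, bx):
            self.recent.append(bx)

    def central_trigger(physics_triggers, subdetectors_ready, rules):
        """Issue a Level-1 Accept only if all subdetectors are ready and every
        rate rule allows it."""
        accepts = []
        for bx in physics_triggers:
            if subdetectors_ready(bx) and all(rule.allows(bx) for rule in rules):
                for rule in rules:
                    rule.record(bx)
                accepts.append(bx)
        return accepts

    # Example: dense physics triggers throttled by an illustrative rule set.
    rules = [TriggerRule(max_accepts=1, window_bx=3), TriggerRule(max_accepts=2, window_bx=25)]
    print(central_trigger(range(0, 30), lambda bx: True, rules))
    ```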

  13. Are Educational Computer Micro-Games Engaging and Effective for Knowledge Acquisition at High-Schools? A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    Brom, Cyril; Preuss, Michal; Klement, Daniel

    2011-01-01

    Curricular schooling can benefit from the usage of educational computer games, but it is difficult to integrate them in the formal schooling system. Here, we investigate one possible approach to this integration, which capitalizes on using a micro-game that can be played with a teacher's guidance as a supplement after a traditional expository…

  14. Effects of Computer-Based Video Instruction on the Acquisition and Generalization of Grocery Purchasing Skills for Students with Intellectual Disability

    ERIC Educational Resources Information Center

    Goo, Minkowan; Therrien, William J.; Hua, Youjia

    2016-01-01

    The purpose of this study was to evaluate the effects of computer-based video instruction (CBVI) on teaching grocery purchasing skills to students with moderate intellectual disability (ID). Four high school students with mild to moderate ID participated in the study. A multiple-probe design across students was used to examine the effects. Results…

  15. Triggered Codeswitching between Cognate Languages

    ERIC Educational Resources Information Center

    Broersma, Mirjam

    2009-01-01

    This study shows further evidence for triggered codeswitching. In natural speech from a Dutch-English bilingual, codeswitches occurred more often directly next to a cognate (or "trigger word") than elsewhere. This evidence from typologically related, cognate languages extends previous evidence for triggering between typologically unrelated…

  16. The ALICE high-level trigger read-out upgrade for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Engel, H.; Alt, T.; Breitner, T.; Gomez Ramirez, A.; Kollegger, T.; Krzewicki, M.; Lehrbach, J.; Rohr, D.; Kebschull, U.

    2016-01-01

    The ALICE experiment uses an optical read-out protocol called Detector Data Link (DDL) to connect the detectors with the computing clusters of Data Acquisition (DAQ) and High-Level Trigger (HLT). The interfaces of the clusters to these optical links are realized with FPGA-based PCI-Express boards. The High-Level Trigger is a computing cluster dedicated to the online reconstruction and compression of experimental data. It uses a combination of CPU, GPU and FPGA processing. For Run 2, the HLT has replaced all of its previous interface boards with the Common Read-Out Receiver Card (C-RORC) to enable read-out of detectors at high link rates and to extend the pre-processing capabilities of the cluster. The new hardware also comes with an increased link density that reduces the number of boards required. A modular firmware approach allows different processing and transport tasks to be built from the same source tree. A hardware pre-processing core includes cluster finding already in the C-RORC firmware. State of the art interfaces and memory allocation schemes enable a transparent integration of the C-RORC into the existing HLT software infrastructure. Common cluster management and monitoring frameworks are used to also handle C-RORC metrics. The C-RORC is in use in the clusters of ALICE DAQ and HLT since the start of LHC Run 2.

  17. The data acquisition system for SLD

    SciTech Connect

    Sherden, D.J.

    1986-10-01

    This paper describes the data acquisition system planned for the SLD detector, which is being constructed for use with the SLAC Linear Collider (SLC). Analog electronics, heavily incorporating hybrid and custom VLSI circuitry, is mounted on the detector itself. Extensive use is made of multiplexing through optical fibers to a FASTBUS readout system. The low repetition rate of the SLC allows a relatively simple software-based trigger. Hardware and software processors within the acquisition modules are used to reduce the large volume of data per event and to calibrate the electronics. A farm of microprocessors is used for full reconstruction of a sample of events prior to transmission to the host.

  18. The data acquisition system for SLD

    SciTech Connect

    Sherden, D.J.

    1987-04-01

    This paper describes the data acquisition system planned for the SLD detector, which is being constructed for use with the SLAC Linear Collider (SLC). Analog electronics, heavily incorporating hybrid and custom VLSI circuitry, is mounted on the detector itself. Extensive use is made of multiplexing through optical fibers to a FASTBUS readout system. The low repetition rate of the SLC allows a relatively simple software-based trigger. Hardware and software processors within the acquisition modules are used to reduce the large volume of data per event and to calibrate the electronics. A farm of microprocessors is used for full reconstruction of a sample of events prior to transmission to the host.

  19. Investigating Second Language Acquisition.

    ERIC Educational Resources Information Center

    Jordens, Peter, Ed.; Lalleman, Josine, Ed.

    Essays in second language acquisition include: "The State of the Art in Second Language Acquisition Research" (Josine Lalleman); "Crosslinguistic Influence with Special Reference to the Acquisition of Grammar" (Michael Sharwood Smith); "Second Language Acquisition by Adult Immigrants: A Multiple Case Study of Turkish and Moroccan Learners of…

  20. Preliminary on-orbit results of trigger system for DAMPE

    NASA Astrophysics Data System (ADS)

    Zhang, Yongqiang; Chang, Jin; Guo, Jian hua; Dong, TieKuang; Liu, Yang

    2016-07-01

    The Dark Matter Particle Explorer (DAMPE), China's first high-energy cosmic ray explorer in space, has been successfully launched from the Jiuquan Satellite Launch Center with the mission of searching for dark matter particles. A large energy range for electrons/gammas, good energy resolution, and excellent particle identification ability make DAMPE the most promising detector so far for finding a dark matter signal. DAMPE consists of four sub-detectors: a plastic scintillation detector, a silicon-tungsten tracker, a BGO calorimeter and a neutron detector. The hit signals generated by the BGO calorimeter and the trigger board (in the DAQ) constitute the trigger system of DAMPE, which generates trigger signals for the four sub-detectors to start data acquisition. The trigger system reduces the on-orbit trigger rate from about 1 kHz to 70-100 Hz, which relieves the load on the DAQ when transmitting data to ground. In this paper we introduce the trigger system of DAMPE and present some preliminary on-orbit results, e.g. trigger efficiency, together with the CERN beam test results and simulation results for comparison.

  1. A programmable systolic trigger processor for FERA-bus data

    NASA Astrophysics Data System (ADS)

    Appelquist, G.; Hovander, B.; Sellden, B.; Bohm, C.

    1992-09-01

    A generic CAMAC based trigger processor module for fast processing of large amounts of Analog to Digital Converter (ADC) data was designed. This module was realized using complex programmable gate arrays. The gate arrays were connected to memories and multipliers in such a way that different gate array configurations can cover a wide range of module applications. Using this module, it is possible to construct complex trigger processors. The module uses both the fast ECL FERA bus and the CAMAC bus for inputs and outputs. The latter is used for setup and control but may also be used for data output. Large numbers of ADCs can be served by a hierarchical arrangement of trigger processor modules which process ADC data with pipeline arithmetic and produce the final result at the apex of the pyramid. The trigger decision is transmitted to the data acquisition system via a logic signal while numeric results may be extracted by the CAMAC controller. The trigger processor was developed for the proposed neutral particle search. It was designed to serve as a second-level trigger processor. It was required to correct all ADC raw data for efficiency and pedestal, calculate the total calorimeter energy, obtain the optimal time of flight data, and calculate the particle mass. A suitable mass cut would then deliver the trigger decision.

  2. Subnanosecond trigger system for ETA

    SciTech Connect

    Cook, E.G.; Lauer, E.J.; Reginato, L.L.; Rogers D.; Schmidt, J.A.

    1980-05-30

    A high-voltage trigger system capable of triggering thirty 250 kV spark gaps, each with less than ±1 ns jitter, has been constructed. In addition to low jitter, the trigger system must be capable of delivering the high-voltage pulses to the spark gaps either simultaneously or sequentially, as determined by other system requirements. The trigger system consists of several stages of pulse amplification culminating in 160 kV pulses with a 30 ns risetime. The trigger system is described and test data are provided.

  3. Data-acquisition systems

    SciTech Connect

    Cyborski, D.R.; Teh, K.M.

    1995-08-01

    Up to now, DAPHNE, the data-acquisition system developed for ATLAS, was used routinely for experiments at ATLAS and the Dynamitron. More recently, the Division implemented 2 MSU/DAPHNE systems. The MSU/DAPHNE system is a hybrid data-acquisition system which combines the front-end of the Michigan State University (MSU) DA system with the traditional DAPHNE back-end. The MSU front-end is based on commercially available modules. This alleviates the problems encountered with the DAPHNE front-end which is based on custom designed electronics. The first MSU system was obtained for the APEX experiment and was used there successfully. A second MSU front-end, purchased as a backup for the APEX experiment, was installed as a fully-independent second MSU/DAPHNE system with the procurement of a DEC 3000 Alpha host computer, and was used successfully for data-taking in an experiment at ATLAS. Additional hardware for a third system was bought and will be installed. With the availability of 2 MSU/DAPHNE systems in addition to the existing APEX setup, it is planned that the existing DAPHNE front-end will be decommissioned.

  4. Portable data acquisition system

    SciTech Connect

    Bowers, J; Rogers, H

    1999-05-03

    Lawrence Livermore National Laboratory (LLNL) has developed a Portable Data Acquisition (DAQ) System that is essentially a laboratory-scale programmable logic controller (PLC). This DAQ system can obtain signals from numerous sensors (e.g., pH, level, pressure, flow meters), open and close valves, and turn pumps on and off. The data can then be saved in a spreadsheet or displayed as a graph/indicator in real time on a computer screen. The whole DAQ system was designed to be portable so that it could sit on a bench top during laboratory-scale treatability studies or be moved out into the field during larger studies. This DAQ system is also fairly simple to use. All that is required is some working knowledge of LabVIEW 4.1 and of how to properly wire the process equipment. The DAQ system has been used during treatability studies on cesium precipitation, controlled hydrolysis of water-reactive wastes, and other waste treatment studies that enable LLNL to comply with the Federal Facility Compliance Act (FFCAct). Improved data acquisition allows the study to be better monitored, and therefore better controlled, and can be used to determine the results of the treatment study more effectively. This also contributes to the design of larger treatment processes.

  5. Design of the Trigger Interface and Distribution Board for CEBAF 12 GeV Upgrade

    SciTech Connect

    Gu, Jianhui; Dong, Hai; Cuevas, R; Gyurjyan, Vardan; Heyes, William; Jastrzembski, Edward; Kaneta, Scott; Nganga, Nicholas; Moffit, Bryan; Raydo, Benjamin; Timmer, Carl

    2012-10-01

    The design of the Trigger Interface and Distribution (TID) board for the 12 GeV Upgrade at the Continuous Electron Beam Accelerator Facility (CEBAF) at TJNAF is described. The TID board distributes a low-jitter system clock, a synchronized trigger, and a synchronized multi-purpose SYNC signal. The TID also initiates data acquisition for the crate. With the TID boards, a multi-crate system can be set up for experiment testing and commissioning. The TID board can be selectively populated as a Trigger Interface (TI) board or a Trigger Distribution (TD) board for the 12 GeV upgrade experiments. When the TID is populated as a TI, it can be located in the VXS crate and distribute the CLOCK/TRIGGER/SYNC through the VXS P0 connector; it can also be located in a standard VME64 crate and distribute the CLOCK/TRIGGER/SYNC through the VME P2 connector or front panel. It initiates data acquisition for the front-end crate in which the TI is positioned. When the TID is populated as a TD, it fans out the CLOCK/TRIGGER/SYNC from the trigger supervisor to the front-end crates through optical fibres. The TD monitors the trigger processing on the TIs and gives feedback to the TS for trigger flow control. A Field Programmable Gate Array (FPGA) is utilised on the TID board to provide programmability. The TID boards were intensively tested on the bench and in various setups.

  6. Protons Trigger Mitochondrial Flashes.

    PubMed

    Wang, Xianhua; Zhang, Xing; Huang, Zhanglong; Wu, Di; Liu, Beibei; Zhang, Rufeng; Yin, Rongkang; Hou, Tingting; Jian, Chongshu; Xu, Jiejia; Zhao, Yan; Wang, Yanru; Gao, Feng; Cheng, Heping

    2016-07-26

    Emerging evidence indicates that mitochondrial flashes (mitoflashes) are highly conserved elemental mitochondrial signaling events. However, which signal controls their ignition and how they are integrated with other mitochondrial signals and functions remain elusive. In this study, we aimed to further delineate the signal components of the mitoflash and determine the mitoflash trigger mechanism. Using multiple biosensors and chemical probes as well as label-free autofluorescence, we found that the mitoflash reflects chemical and electrical excitation at the single-organelle level, comprising bursting superoxide production, oxidative redox shift, and matrix alkalinization as well as transient membrane depolarization. Both electroneutral H(+)/K(+) or H(+)/Na(+) antiport and matrix proton uncaging elicited immediate and robust mitoflash responses over a broad dynamic range in cardiomyocytes and HeLa cells. However, charge-uncompensated proton transport, which depolarizes mitochondria, caused the opposite effect, and steady matrix acidification mildly inhibited mitoflashes. Based on a numerical simulation, we estimated a mean proton lifetime of 1.42 ns and diffusion distance of 2.06 nm in the matrix. We conclude that nanodomain protons act as a novel, to our knowledge, trigger of mitoflashes in energized mitochondria. This finding suggests that mitoflash genesis is functionally and mechanistically integrated with mitochondrial energy metabolism. PMID:27463140

  7. Triggering of Aftershocks by Free Oscillations

    NASA Astrophysics Data System (ADS)

    Bufe, C. G.; Varnes, D. J.

    2001-12-01

    Periodicities observed in aftershock sequences may result from earthquake triggering by free oscillations of the Earth produced by the main shock. Using an algorithm we developed to compute spectra of inter-event times, we examine inter-event intervals of teleseismically recorded aftershock sequences from large (M>7.5) main shocks that occurred during 1980-2001. Observed periodicities may result from triggering at intervals that are multiples of normal mode periods. We have focussed our analysis of inter-event times on identification of triggering by free oscillations at periods in the range 6-60 minutes. In this paper we describe our most commonly observed aftershock inter-event times and the free oscillation modes most likely to be the triggers. Because of their separation, the longer period modes are easiest to identify in the aftershock data (0S2 at 53.9 minutes, 0S3 at 35.6 minutes, 0S4 at 25.8 minutes, and 0T2 at 43.9 minutes). Evidence of triggering by 0S2 and 0T2 was also found in the aftershocks of the 1989 Loma Prieta, CA (M 7) earthquake (Kamal and Mansinha, 1996). Because of the plethora of higher modes, shorter inter-event periods are more difficult to identify with a particular mode. Preliminary analysis of the 2001 Bhuj, India (M 7.7) earthquake sequence tentatively identifies a contribution to triggering of the first four large aftershocks by multiples of 0S12 (8.37 minutes).
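
    One simple way to search for such periodicities is to bin the aftershock occurrence times into a regularly sampled counting series and inspect its power spectrum for peaks near the normal-mode periods. The sketch below applies this idea to synthetic occurrence times with an injected periodicity near the 0S2 period; all event times, rates and the 6-60 minute search band are illustrative assumptions, and this is not the authors' algorithm.

    ```python
    import numpy as np

    # Synthetic aftershock occurrence times (seconds over three days) with a weak
    # periodicity near the 0S2 period (~53.9 min) injected. All values illustrative.
    rng = np.random.default_rng(0)
    period_s = 53.9 * 60.0
    span_s = 3 * 86400.0
    background = rng.uniform(0.0, span_s, size=100)
    periodic = np.arange(0.0, span_s, period_s)
    periodic = periodic + rng.normal(0.0, 60.0, size=periodic.size)
    times = np.sort(np.concatenate([background, periodic]))

    # Bin the occurrence times into a regularly sampled counting series.
    bin_s = 60.0
    counts, _ = np.histogram(times, bins=np.arange(0.0, span_s + bin_s, bin_s))

    # Power spectrum of the mean-subtracted counting series.
    spectrum = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
    freqs = np.fft.rfftfreq(len(counts), d=bin_s)

    # Report the strongest peak at periods between 6 and 60 minutes.
    mask = (freqs > 1.0 / 3600.0) & (freqs < 1.0 / 360.0)
    peak_freq = freqs[mask][np.argmax(spectrum[mask])]
    print("Strongest periodicity: %.1f min" % (1.0 / peak_freq / 60.0))
    ```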

  8. The trigger and DAQ system for the NA62 experiment

    NASA Astrophysics Data System (ADS)

    Avanzini, C.; Collazuol, G.; Galeotti, S.; Imbergamo, E.; Lamanna, G.; Magazzù, G.; Ruggiero, G.; Sozzi, M.; Venditti, S.; NA62 Collaboration

    2010-11-01

    The main goal of the NA62 experiment is to measure the branching ratio of the K+→π+νν¯ decay, collecting O(100) events in two years of data taking. Efficient online selection of interesting events and loss-less readout at high rate will be key issues for such an experiment. An integrated trigger and data acquisition system has been designed. Only the very first trigger stage will be implemented in hardware, in order to reduce the total rate for the software levels running on PC farms. Readout uniformity among different subdetectors and scalability were taken into account in the architecture design.

  9. Two Demonstrations with a New Data-Acquisition System

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2014-01-01

    Nowadays, the use of data-acquisition systems in undergraduate laboratories is routine. Many computer-assisted experiments became possible with the PASCO scientific data-acquisition system based on the 750 Interface and DataStudio software. A new data-acquisition system developed by PASCO includes the 850 Universal Interface and Capstone software.…

  10. Digital trigger system for the RED-100 detector based on the unit in VME standard

    NASA Astrophysics Data System (ADS)

    Akimov, D. Yu; Belov, V. A.; Bolozdynya, A. I.; Efremenko, Yu V.; Kaplin, V. A.; Naumov, P. P.

    2016-02-01

    A system for forming a trigger for the RED-100 liquid xenon detector has been developed. The trigger can be generated for all types of events required for detector calibration and data acquisition, including events with a single ionization electron. The system has an event detection mechanism in which each event is assigned a timestamp and an event type. A trigger system is required in systems searching for rare events in order to keep only the necessary information from the ADC array. The characteristics and implementation of the trigger system, which provides high-efficiency operation even for low-energy events, are described.

  11. Effector triggered immunity

    PubMed Central

    Rajamuthiah, Rajmohan; Mylonakis, Eleftherios

    2014-01-01

    Pathogenic bacteria produce virulence factors called effectors, which are important components of the infection process. Effectors aid in pathogenesis by facilitating bacterial attachment, pathogen entry into or exit from the host cell, immunoevasion, and immunosuppression. Effectors also have the ability to subvert host cellular processes, such as hijacking cytoskeletal machinery or blocking protein translation. However, host cells possess an evolutionarily conserved innate immune response that can sense the pathogen through the activity of its effectors and mount a robust immune response. This “effector triggered immunity” (ETI) was first discovered in plants, but recent evidence suggests that the process is also well conserved in metazoans. We will discuss salient points of the mechanism of ETI in metazoans from recent studies done in mammalian cells and invertebrate model hosts. PMID:25513770

  12. Language Acquisition without an Acquisition Device

    ERIC Educational Resources Information Center

    O'Grady, William

    2012-01-01

    Most explanatory work on first and second language learning assumes the primacy of the acquisition phenomenon itself, and a good deal of work has been devoted to the search for an "acquisition device" that is specific to humans, and perhaps even to language. I will consider the possibility that this strategy is misguided and that language…

  13. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  14. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  15. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  16. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  17. 48 CFR 12.212 - Computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Computer software. 12.212... ACQUISITION OF COMMERCIAL ITEMS Special Requirements for the Acquisition of Commercial Items 12.212 Computer software. (a) Commercial computer software or commercial computer software documentation shall be...

  18. Grammatical Acquisition: Inductive Bias and Coevolution of Language and the Language Acquisition Device.

    ERIC Educational Resources Information Center

    Briscoe, Ted

    2000-01-01

    An account of grammatical acquisition is developed within the parameter setting framework applied to a generalized categorical grammar (GCG). Computational simulation shows that several resulting acquisition procedures are effective on a parameter set expressing major typological distinctions based on constituent order, and defining 70 distinct…

  19. Performance of the CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Perrotta, Andrea

    2015-12-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. The increase in the number of interactions per bunch crossing, on average 25 in 2012, and expected to be around 40 in Run II, will be an additional complication. We present here the expected performance of the main triggers that will be used during the 2015 data taking campaign, paying particular attention to the new approaches that have been developed to cope with the challenges of the new run. This includes improvements in HLT electron and photon reconstruction as well as better performing muon triggers. We will also present the performance of the improved tracking and vertexing algorithms, discussing their impact on the b-tagging performance as well as on the jet and missing energy reconstruction.

  20. Event Reconstruction Algorithms for the ATLAS Trigger

    SciTech Connect

    Fonseca-Martin, T.; Abolins, M.; Adragna, P.; Aleksandrov, E.; Aleksandrov, I.; Amorim, A.; Anderson, K.; Anduaga, X.; Aracena, I.; Asquith, L.; Avolio, G.; Backlund, S.; Badescu, E.; Baines, J.; Barria, P.; Bartoldus, R.; Batreanu, S.; Beck, H.P.; Bee, C.; Bell, P.; Bell, W.H.; /more authors..

    2011-11-09

    The ATLAS experiment under construction at CERN is due to begin operation at the end of 2007. The detector will record the results of proton-proton collisions at a center-of-mass energy of 14 TeV. The trigger is a three-tier system designed to identify in real-time potentially interesting events that are then saved for detailed offline analysis. The trigger system will select approximately 200 Hz of potentially interesting events out of the 40 MHz bunch-crossing rate (with 10^9 interactions per second at the nominal luminosity). Algorithms used in the trigger system to identify different event features of interest will be described, as well as their expected performance in terms of selection efficiency, background rejection and computation time per event. The talk will concentrate on recent improvements and on performance studies, using a very detailed simulation of the ATLAS detector and electronics chain that emulates the raw data as it will appear at the input to the trigger system.

  1. A novel time stamping technique for distributed data acquisition systems.

    PubMed

    Subramaniam, E T

    2012-12-01

    In this paper, we discuss the design and implementation of a synchronizing technique for data acquisition systems, which can effectively use normal, standard local area network cables to provide a time stamp with a range of up to 32 days, a resolution of 10 ns, and synchronization within ±5 ns. This system may be used to synchronize data collected by heterogeneous data acquisition modules that acquire events independently. Such distributed systems are generally designed with a tree-like structure or as independent self-triggered acquisition boxes. The leaf nodes are connected through branches to the root node via non-bus-based interconnecting links. The present system has been tested with a set of self-triggered, digital-signal-processing-based data acquisition engines having a 100 MHz analog-to-digital converter front end. PMID:23277988
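
    The quoted range and resolution are consistent with a wide binary counter clocked every 10 ns: assuming a 48-bit counter, the rollover time comes out near 32 days. The sketch below checks that arithmetic and decodes a raw counter value; the 48-bit width is an assumption for illustration, not a stated design parameter.

    ```python
    # Consistency check for the timestamp range and resolution quoted above.
    # The 48-bit counter width is an assumption used only for illustration.

    tick_ns = 10                 # 10 ns resolution, i.e. a 100 MHz clock
    counter_bits = 48            # assumed width of the timestamp counter

    rollover_s = (2 ** counter_bits) * tick_ns * 1e-9
    print("Rollover after %.1f days" % (rollover_s / 86400))   # ~32.6 days

    def decode_timestamp(raw_counts):
        """Convert a raw counter value into (days, seconds, nanoseconds)."""
        total_ns = raw_counts * tick_ns
        days, rem_ns = divmod(total_ns, 86400 * 10**9)
        seconds, ns = divmod(rem_ns, 10**9)
        return days, seconds, ns

    print(decode_timestamp(123_456_789_012))
    ```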

  2. Controls and data acquisition on Atlas

    SciTech Connect

    Scudder, D.W.; Hosack, K.W.; Parsons, W.M.; Reass, W.A.; Thompson, M.C.; Wysocki, F.J.; Creager, J.

    1997-09-01

    The control and data acquisition systems for Atlas will use a large degree of decentralization. By distributing control points close to the systems being controlled, the authors expect to simplify the task of isolating electronic systems from the large expected EMI pulses, allow connection of the various parts of the system by high-level fiber-optic networks, allow a simple configuration of the control and data acquisition screen rooms, and simplify the software efforts through the resulting modularization. The Atlas control system must control capacitor charging, machine and diagnostic timing and triggering, Marx module diagnostics, vacuum systems, gas handling for railgaps, safety interlocks, and oil handling. Many of these tasks will be performed by industrial-style programmable logic controllers (PLCs). Each of 38 Marx bank maintenance units will have a control and diagnostic package which will monitor both charging and discharging current and railgap trigger timing. An unusual feature is the use of digitizers to record each Marx module's output waveform, plus nanosecond-resolution time interval meters to record the firing time of each railgap. The machine data acquisition system for Atlas will be built around an SQL database, use National Instruments LabVIEW software to control data acquisition instruments, and provide links for a variety of experimentalists' data analysis packages. World Wide Web access will provide an interface through which users can monitor experimental data and machine status.

  3. Triggering with the LHCb calorimeters

    NASA Astrophysics Data System (ADS)

    Lefevre, Regis; LHCb Collaboration

    2009-04-01

    The LHCb experiment at the LHC has been conceived to pursue high-precision studies of CP violation and rare phenomena in b hadron decays. The online selection is crucial in LHCb and relies on the calorimeters to trigger on high transverse energy electrons, photons, π0 and hadrons. For this purpose dedicated electronics have been realized. The calorimeter trigger system has been commissioned and is used to trigger on cosmic muons before beams start circulating in the LHC. When the LHC starts, it will also provide a very useful interaction trigger.

  4. Comparison of 180° and 360° Arc Data Acquisition to Measure Scintigraphic Parameters from Gated Single Photon Emission Computed Tomography Myocardial Perfusion Imaging: Is There Any Difference?

    PubMed Central

    Javadi, Hamid; Mahmoud-Pashazadeh, Ali; Mogharrabi, Mehdi; Iranpour, Darioush; Amini, Abdollatif; Pourbehi, Mohammadreza; Akbarzadeh, Mehdi; Nabipour, Iraj; Assadi, Majid

    2016-01-01

    Objective: The aim of the current study was to compare 180° and 360° data collection modes to measure end diastolic volume (EDV), end systolic volume (ESV) and ejection fraction (EF) values of the cardiac system by gated myocardial perfusion tomography. Methods: Thirty-three patients underwent gated myocardial perfusion tomography. Single photon emission computed tomography data of patients’ hearts were acquired by 180°, 45° left posterior oblique to 45° right anterior oblique, and 360° to obtain EDV, ESV, EF and cardiac volume changes (V1, V2, V3, V4, V5, V6, V7 and V8) throughout each cardiac cycle. Results: Results of the current study indicated that there were no significant differences between 180° and 360° angular sampling in terms of measuring EDV, ESV and EF in myocardial perfusion imaging. Cardiac volume change patterns during a cardiac cycle were also similar in 360° and 180° scans. We also observed that there was no difference in EDV, ESV and EF values between the group with stress induced by exercise and the group with stress imposed by dipyridamole. Conclusion: As there is no difference between 180° and 360° cardiac scanning in terms of EDV, ESV and EF, a half-orbit scan is recommended to study these cardiac system parameters because it offers more comfort to patients and a shorter scanning time. PMID:27299285
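
    For reference, the ejection fraction reported by gated SPECT software follows directly from the end-diastolic and end-systolic volumes. The sketch below shows that relation with illustrative volumes; the numbers are not taken from the study.

    ```python
    def ejection_fraction(edv_ml, esv_ml):
        """Ejection fraction (%) from end-diastolic and end-systolic volumes."""
        stroke_volume = edv_ml - esv_ml
        return 100.0 * stroke_volume / edv_ml

    # Illustrative example: EDV = 120 ml, ESV = 50 ml -> EF ~ 58 %.
    print("EF = %.1f %%" % ejection_fraction(120.0, 50.0))
    ```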

  5. EARLY SYNTACTIC ACQUISITION.

    ERIC Educational Resources Information Center

    KELLEY, K.L.

    THIS PAPER IS A STUDY OF A CHILD'S EARLIEST PRETRANSFORMATIONAL LANGUAGE ACQUISITION PROCESSES. A MODEL IS CONSTRUCTED BASED ON THE ASSUMPTIONS (1) THAT SYNTACTIC ACQUISITION OCCURS THROUGH THE TESTING OF HYPOTHESES REFLECTING THE INITIAL STRUCTURE OF THE ACQUISITION MECHANISM AND THE LANGUAGE DATA TO WHICH THE CHILD IS EXPOSED, AND (2) THAT…

  6. Angiographic imaging using an 18.9 MHz swept-wavelength laser that is phase-locked to the data acquisition clock and resonant scanners (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Tozburun, Serhat; Blatter, Cedric; Siddiqui, Meena; Nam, Ahhyun S.; Vakoc, Benjamin J.

    2016-03-01

    In this study, we present an angiographic system comprising a novel 18.9 MHz swept-wavelength source integrated with a MEMS-based 23.7 kHz fast-axis scanner. The system provides rapid acquisition of frames and volumes on which a range of Doppler and intensity-based angiographic analyses can be performed. Interestingly, the source and data acquisition computer can be directly phase-locked to provide an intrinsically phase-stable imaging system supporting Doppler measurements without the need for individual A-line triggers or post-processing phase calibration algorithms. The system is integrated with a 1.8 Gigasample (GS) per second acquisition card supporting continuous acquisition to computer RAM for 10 seconds. Using this system, we demonstrate phase-stable acquisitions across volumes acquired at 60 Hz. We also highlight the ability to perform c-mode angiography providing volume perfusion measurements with 30 Hz temporal resolution. Ultimately, the speed and phase stability of this laser and MEMS scanner platform can be leveraged to accelerate OCT-based angiography and both phase-sensitive and phase-insensitive extraction of blood flow velocity.

  7. Triggering of repeated earthquakes

    NASA Astrophysics Data System (ADS)

    Sobolev, G. A.; Zakrzhevskaya, N. A.; Sobolev, D. G.

    2016-03-01

    Based on the analysis of the world's earthquakes with magnitudes M ≥ 6.5 for 1960-2013, it is shown that they cause global-scale coherent seismic oscillations which most distinctly manifest themselves in the period interval of 4-6 min during 1-3 days after the event. After these earthquakes, a repeated shock has an increased probability of occurring in different seismically active regions located as far away as a few thousand km from the previous event, i.e., a remote interaction of seismic events takes place. The number of repeated shocks N(t) decreases with time, which characterizes the memory of the lithosphere about the impact that has occurred. The time decay N(t) can be approximated by linear, exponential, and power-law dependences. No distinct correlation between the spatial locations of the initial and repeated earthquakes is revealed. The probable triggering mechanisms of the remote interaction between the earthquakes are discussed. Surface seismic waves traveling several times around the Earth, coherent oscillations, and a global source are the most likely candidates. This may lead to the accumulation and coalescence of ruptures in the highly stressed or weakened domains of a seismically active region, which increases the probability of a repeated earthquake.
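
    The linear, exponential and power-law approximations for the decay of the repeated-shock count N(t) can be compared with simple least-squares fits. The sketch below does this on a synthetic, illustrative decay curve; the data and the log-linear fitting shortcut are assumptions for the example, not the authors' procedure.

    ```python
    import numpy as np

    # Synthetic, illustrative decay of the repeated-shock count N(t) (t in days).
    t = np.arange(1, 11, dtype=float)
    n = np.array([40, 22, 15, 11, 9, 7, 6, 5, 5, 4], dtype=float)

    # Linear: N(t) ~ a + b*t (b negative).
    b_lin, a_lin = np.polyfit(t, n, 1)
    # Exponential: log N(t) ~ log a + b*t.
    b_exp, loga_exp = np.polyfit(t, np.log(n), 1)
    # Power law: log N(t) ~ log a + b*log t.
    b_pow, loga_pow = np.polyfit(np.log(t), np.log(n), 1)

    fits = {
        "linear":      a_lin + b_lin * t,
        "exponential": np.exp(loga_exp) * np.exp(b_exp * t),
        "power law":   np.exp(loga_pow) * t ** b_pow,
    }
    for name, predicted in fits.items():
        rss = float(np.sum((n - predicted) ** 2))
        print("%-12s residual sum of squares = %.1f" % (name, rss))
    ```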

  8. VLSI-based Video Event Triggering for Image Data Compression

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    1994-01-01

    Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.
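
    The pre-trigger and post-trigger storage technique described here is commonly realized with a rolling buffer: frames are overwritten continuously, and when the event trigger fires, a fixed number of frames before and after the trigger are kept. The sketch below is a generic software illustration of that idea, not the VLSI state machine itself; the frame values, trigger condition and buffer sizes are invented for the example.

    ```python
    from collections import deque

    def capture_event(frame_source, detect_event, pre_frames=8, post_frames=8):
        """Keep a rolling window of the most recent frames; when detect_event(frame)
        fires, return the pre-trigger history plus the following post-trigger frames.
        Frame counts are illustrative."""
        history = deque(maxlen=pre_frames)
        frames = iter(frame_source)
        for frame in frames:
            if detect_event(frame):
                captured = list(history) + [frame]
                for _ in range(post_frames):
                    try:
                        captured.append(next(frames))
                    except StopIteration:
                        break
                return captured
            history.append(frame)
        return []

    # Example: frames reduced to mean brightness values; trigger on a sudden change.
    stream = [10, 10, 11, 10, 10, 55, 54, 53, 12, 10, 10, 10]
    event = capture_event(stream, detect_event=lambda f: f > 40, pre_frames=3, post_frames=2)
    print(event)   # [11, 10, 10, 55, 54, 53]
    ```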

  9. VLSI-based video event triggering for image data compression

    NASA Astrophysics Data System (ADS)

    Williams, Glenn L.

    1994-02-01

    Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.

  10. Integration and reuse in cognitive skill acquisition.

    PubMed

    Salvucci, Dario D

    2013-07-01

    Previous accounts of cognitive skill acquisition have demonstrated how procedural knowledge can be obtained and transformed over time into skilled task performance. This article focuses on a complementary aspect of skill acquisition, namely the integration and reuse of previously known component skills. The article posits that, in addition to mechanisms that proceduralize knowledge into more efficient forms, skill acquisition requires tight integration of newly acquired knowledge and previously learned knowledge. Skill acquisition also benefits from reuse of existing knowledge across disparate task domains, relying on indexicals to reference and share necessary information across knowledge components. To demonstrate these ideas, the article proposes a computational model of skill acquisition from instructions focused on integration and reuse, and applies this model to account for behavior across seven task domains. PMID:23551386

  11. Fermi GBM Early Trigger Characteristics

    SciTech Connect

    Connaughton, Valerie; Briggs, Michael; Paciesas, Bill; Meegan, Charles

    2009-05-25

    Since the launch of the Fermi observatory on June 11, 2008, the Gamma-ray Burst Monitor (GBM) has seen approximately 250 triggers, of which about 150 were cosmic gamma-ray bursts (GRBs). GBM operates dozens of trigger algorithms covering various energy bands and timescales and is therefore sensitive to a wide variety of phenomena, both astrophysical and not.

  12. Peripheral electrical stimulation triggered by self-paced detection of motor intention enhances motor evoked potentials.

    PubMed

    Niazi, Imran Khan; Mrachacz-Kersting, Natalie; Jiang, Ning; Dremstrup, Kim; Farina, Dario

    2012-07-01

    This paper proposes the development and experimental tests of a self-paced asynchronous brain-computer interfacing (BCI) system that detects movement-related cortical potentials (MRCPs) produced during motor imagination of ankle dorsiflexion and triggers peripheral electrical stimulations timed with the occurrence of MRCPs to induce corticospinal plasticity. MRCPs were detected online from EEG signals in eight healthy subjects with a true positive rate (TPR) of 67.15 ± 7.87% and false positive rate (FPR) of 22.05 ± 9.07%. The excitability of the cortical projection to the target muscle (tibialis anterior) was assessed before and after the intervention through motor evoked potentials (MEP) using transcranial magnetic stimulation (TMS). The peak of the evoked potential significantly (P=0.02) increased after the BCI intervention by 53 ± 43% (relative to preintervention measure), although the spinal excitability (tested by stretch reflexes) did not change. These results demonstrate for the first time that it is possible to alter the corticospinal projections to the tibialis anterior muscle by using an asynchronous BCI system based on online motor imagination that triggered peripheral stimulation. This type of repetitive proprioceptive feedback training based on self-generated brain signal decoding may be a requirement for purposeful skill acquisition in intact humans and in the rehabilitation of persons with brain damage. PMID:22547461

  13. Triggering requirements for SSC physics

    SciTech Connect

    Gilchriese, M.G.D.

    1989-04-01

    Some aspects of triggering requirements for high P_T physics processes at the Superconducting Super Collider (SSC) are described. A very wide range of trigger types will be required to enable detection of the large number of potential physics signatures possible at the SSC. Although in many cases trigger rates are not now well understood, it is possible to conclude that the ability to trigger on transverse energy, number and energy of jets, number and energy of leptons (electrons and muons), missing energy and combinations of these will be required. An SSC trigger system must be both highly flexible and redundant to ensure reliable detection of many new physics processes at the SSC.
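
    The combinations listed here (transverse energy, jets, leptons, missing energy) are typically organized as a trigger menu in which each item is a set of thresholds and an event is kept if any item is satisfied. The sketch below illustrates such a menu; every threshold is a placeholder, not an SSC design value.

    ```python
    # Illustrative trigger menu: each entry is a predicate on an event summary.
    # All thresholds are placeholders, not SSC design values.
    trigger_menu = {
        "sum_et":          lambda ev: ev["sum_et_gev"] > 500.0,
        "single_jet":      lambda ev: any(j > 200.0 for j in ev["jet_et_gev"]),
        "multi_jet":       lambda ev: sum(1 for j in ev["jet_et_gev"] if j > 80.0) >= 3,
        "single_lepton":   lambda ev: any(l > 40.0 for l in ev["lepton_pt_gev"]),
        "missing_et":      lambda ev: ev["missing_et_gev"] > 100.0,
        "lepton_plus_met": lambda ev: (any(l > 20.0 for l in ev["lepton_pt_gev"])
                                       and ev["missing_et_gev"] > 50.0),
    }

    def passes_trigger(event):
        """Return the list of menu items the event satisfies (empty = rejected)."""
        return [name for name, condition in trigger_menu.items() if condition(event)]

    event = {"sum_et_gev": 320.0, "jet_et_gev": [150.0, 90.0, 85.0],
             "lepton_pt_gev": [25.0], "missing_et_gev": 60.0}
    print(passes_trigger(event))   # ['multi_jet', 'lepton_plus_met']
    ```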

  14. TFTR diagnostic control and data acquisition system

    NASA Astrophysics Data System (ADS)

    Sauthoff, N. R.; Daniels, R. E.

    1985-05-01

    General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man-machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on "groups" of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development.

  15. TFTR diagnostic control and data acquisition system

    SciTech Connect

    Sauthoff, N.R.; Daniels, R.E.; PPL Computer Division

    1985-05-01

    General computerized control and data-handling support for TFTR diagnostics is presented within the context of the Central Instrumentation, Control and Data Acquisition (CICADA) System. Procedures, hardware, the interactive man-machine interface, event-driven task scheduling, system-wide arming and data acquisition, and a hierarchical data base of raw data and results are described. Similarities in data structures involved in control, monitoring, and data acquisition afford a simplification of the system functions, based on "groups" of devices. Emphases and optimizations appropriate for fusion diagnostic system designs are provided. An off-line data reduction computer system is under development.

  16. Pulsed thyristor trigger control circuit

    NASA Technical Reports Server (NTRS)

    Nola, F. J. (Inventor)

    1984-01-01

    A trigger control circuit is provided for producing firing pulses for the thyristor of a thyristor control system such as a power factor controller. The control circuit overcomes thyristor triggering problems caused by the current lag associated with controlling inductive loads. It utilizes a phase difference signal, already present in the power factor controller, to derive a signal that inhibits generation of a firing pulse until no load current is flowing from the preceding half cycle, thereby ensuring that the thyristor is triggered on during each half cycle.

  17. Triggered Release from Polymer Capsules

    SciTech Connect

    Esser-Kahn, Aaron P.; Odom, Susan A.; Sottos, Nancy R.; White, Scott R.; Moore, Jeffrey S.

    2011-07-06

    Stimuli-responsive capsules are of interest in drug delivery, fragrance release, food preservation, and self-healing materials. Many methods are used to trigger the release of encapsulated contents. Here we highlight mechanisms for the controlled release of encapsulated cargo that utilize chemical reactions occurring in solid polymeric shell walls. Triggering mechanisms responsible for covalent bond cleavage that result in the release of capsule contents include chemical, biological, light, thermal, magnetic, and electrical stimuli. We present methods for encapsulation and release, triggering methods, and mechanisms and conclude with our opinions on interesting obstacles for chemically induced activation with relevance for controlled release.

  18. Seismology: dynamic triggering of earthquakes.

    PubMed

    Gomberg, Joan; Johnson, Paul

    2005-10-01

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar. PMID:16208360
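
    The several-microstrain threshold quoted above can be related to ground shaking with the standard plane-wave rule of thumb that peak dynamic strain is roughly peak ground velocity divided by the seismic phase velocity. The short sketch below applies that approximation with assumed, purely illustrative numbers (a 3.5 km/s phase velocity and a few trial velocities); it is a generic seismological estimate, not a calculation taken from the paper.

      # Back-of-the-envelope dynamic strain estimate (illustrative values only).
      # Plane-wave approximation: peak strain ~ peak ground velocity / phase velocity.

      def peak_dynamic_strain(pgv_m_per_s, phase_velocity_m_per_s=3500.0):
          """Approximate peak dynamic strain carried by a passing seismic wave."""
          return pgv_m_per_s / phase_velocity_m_per_s

      if __name__ == "__main__":
          for pgv_cm_s in (0.1, 1.0, 10.0):  # trial peak ground velocities in cm/s
              strain = peak_dynamic_strain(pgv_cm_s / 100.0)
              print(f"PGV = {pgv_cm_s:5.1f} cm/s  ->  ~{strain * 1e6:6.2f} microstrain")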

  19. The Sandia transportable triggered lightning instrumentation facility

    NASA Technical Reports Server (NTRS)

    Schnetzer, George H.; Fisher, Richard J.

    1991-01-01

    Development of the Sandia Transportable Triggered Lightning Instrumentation Facility (SATTLIF) was motivated by a requirement for the in situ testing of a munitions storage bunker. Transfer functions relating the incident flash currents to voltages, currents, and electromagnetic field values throughout the structure will be obtained for use in refining and validating a lightning response computer model of this type of structure. A preliminary shakedown trial of the facility under actual operational conditions was performed during the summer of 1990 at the Kennedy Space Center's (KSC) rocket-triggered lightning test site. A description is given of the SATTLIF, which is readily transportable on a single flatbed truck or by aircraft, and its instrumentation for measuring incident lightning channel currents and the responses of the systems under test. Measurements of return-stroke current peaks obtained with the SATTLIF are presented. Agreement with data acquired on the same flashes with existing KSC instrumentation is, on average, to within approximately 7 percent. Continuing currents were measured with a resolution of approximately 2.5 A. This field trial demonstrated the practicality of using a transportable triggered lightning facility for specialized test applications.

  20. The Sandia Transportable Triggered Lightning Instrumentation Facility

    SciTech Connect

    Schnetzer, G.H.; Fisher, R.J.

    1991-01-01

    Development of the Sandia Transportable Triggered Lightning Instrumentation Facility (SATTLIF) was motivated by a requirement for the in situ testing of a munitions storage bunker. Transfer functions relating the incident flash currents to voltages, currents, and electromagnetic field values throughout the structure will be obtained for use in refining and validating a lightning response computer model of this type of structure. A preliminary shakedown trial of the facility under actual operational conditions was performed during the summer of 1990 at the Kennedy Space Center's (KSC) rocket-triggered lightning test site in Florida. A description is given of the SATTLIF, which is readily transportable on a single flatbed truck or by aircraft, and its instrumentation for measuring incident lightning channel currents and the responses of systems under test. Measurements of return-stroke current peaks obtained with the SATTLIF are presented. Agreement with data acquired on the same flashes with existing KSC instrumentation is, on average, to within {approximately}7 percent. Continuing currents were measured with a resolution of {approximately}2.5 A. This field trial demonstrated the practicality of using a transportable triggered lightning facility for specialized test applications. 5 refs., 12 figs., 1 tab.

  1. Excessive acquisition in hoarding.

    PubMed

    Frost, Randy O; Tolin, David F; Steketee, Gail; Fitch, Kristin E; Selbo-Bruns, Alexandra

    2009-06-01

    Compulsive hoarding (the acquisition of and failure to discard large numbers of possessions) is associated with substantial health risk, impairment, and economic burden. However, little research has examined separate components of this definition, particularly excessive acquisition. The present study examined acquisition in hoarding. Participants, 878 self-identified with hoarding and 665 family informants (not matched to hoarding participants), completed an Internet survey. Among hoarding participants who met criteria for clinically significant hoarding, 61% met criteria for a diagnosis of compulsive buying and approximately 85% reported excessive acquisition. Family informants indicated that nearly 95% exhibited excessive acquisition. Those who acquired excessively had more severe hoarding; their hoarding had an earlier onset and resulted in more psychiatric work impairment days; and they experienced more symptoms of obsessive-compulsive disorder, depression, and anxiety. Two forms of excessive acquisition (buying and free things) each contributed independent variance in the prediction of hoarding severity and related symptoms. PMID:19261435

  2. Excessive Acquisition in Hoarding

    PubMed Central

    Frost, Randy O.; Tolin, David F.; Steketee, Gail; Fitch, Kristin E.; Selbo-Bruns, Alexandra

    2009-01-01

    Compulsive hoarding (the acquisition of and failure to discard large numbers of possessions) is associated with substantial health risk, impairment, and economic burden. However, little research has examined separate components of this definition, particularly excessive acquisition. The present study examined acquisition in hoarding. Participants, 878 self-identified with hoarding and 665 family informants (not matched to hoarding participants), completed an internet survey. Among hoarding participants who met criteria for clinically significant hoarding, 61% met criteria for a diagnosis of compulsive buying and approximately 85% reported excessive acquisition. Family informants indicated that nearly 95% exhibited excessive acquisition. Those who acquired excessively had more severe hoarding; their hoarding had an earlier onset and resulted in more psychiatric work impairment days; and they experienced more symptoms of obsessive-compulsive disorder, depression, and anxiety. Two forms of excessive acquisition (buying and free things) each contributed independent variance in the prediction of hoarding severity and related symptoms. PMID:19261435

  3. Streamlined acquisition handbook

    NASA Technical Reports Server (NTRS)

    1990-01-01

    NASA has always placed great emphasis on the acquisition process, recognizing it as among its most important activities. This handbook is intended to facilitate the application of streamlined acquisition procedures. The development of these procedures reflects the efforts of an action group composed of NASA Headquarters and center acquisition professionals. It is the intent to accomplish the real change in the acquisition process as a result of this effort. An important part of streamlining the acquisition process is a commitment by the people involved in the process to accomplishing acquisition activities quickly and with high quality. Too often we continue to accomplish work in 'the same old way' without considering available alternatives which would require no changes to regulations, approvals from Headquarters, or waivers of required practice. Similarly, we must be sensitive to schedule opportunities throughout the acquisition cycle, not just once the purchase request arrives at the procurement office. Techniques that have been identified as ways of reducing acquisition lead time while maintaining high quality in our acquisition process are presented.

  4. The D0 upgrade trigger

    SciTech Connect

    Eno, S.

    1994-09-01

    The current trigger system for the D0 detector at Fermilab's Tevatron will need to be upgraded when the Main Injector is installed and the Tevatron can operate at luminosities exceeding 10{sup 32} cm{sup {minus}2}s{sup {minus}1} and with a crossing time of 132 ns. We report on preliminary designs for upgrades to the trigger system for the Main Injector era.

  5. Data acquisition electronics for gamma ray emission tomography using width-modulated leading-edge discriminators.

    PubMed

    Lage, E; Tapias, G; Villena, J; Desco, M; Vaquero, J J

    2010-08-01

    We present a new high-performance and low-cost approach for implementing radiation detection acquisition systems. The basic elements used are charge-integrating ADCs and a set of components encapsulated in an HDL (hardware description language) library which makes it possible to implement several acquisition tasks such as time pickoff and coincidence detection using a new and simple trigger technique that we name WMLET (width-modulated leading-edge timing). As proof of concept, a 32-channel hybrid PET/SPECT acquisition system based on these elements was developed and tested. This demonstrator consists of a master module responsible for the generation and distribution of trigger signals, 2 x 16-channel ADC cards (12-bit resolution) for data digitization and a 32-bit digital I/O PCI card for handling data transmission to a personal computer. System characteristics such as linearity, maximum transmission rates or timing resolution in coincidence mode were evaluated with test and real detector signals. Imaging capabilities of the prototype were also evaluated using different detector configurations. The performance tests showed that this implementation is able to handle data rates in excess of 600k events s(-1) when acquiring simultaneously 32 channels (96-byte events). ADC channel linearity is >98.5% in energy quantification. Time resolution in PET mode for the tested configurations ranges from 3.64 ns FWHM to 7.88 ns FWHM when signals from LYSO-based detectors are used. The measured energy resolution matched the expected values for the detectors evaluated and single elements of crystal matrices can be neatly separated in the acquired flood histograms. PMID:20647602
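
    The time pickoff and coincidence steps described above can be illustrated in software. The sketch below is only a minimal illustration and does not reproduce the width-modulation aspect of WMLET: it applies a plain leading-edge threshold to two sampled pulses and tests them against a fixed coincidence window. The threshold, sampling step, and window width are assumed values.

      import numpy as np

      def leading_edge_time(samples, threshold, dt_ns):
          """Time (ns) of the first sample at or above the threshold, or None if never crossed."""
          above = np.nonzero(samples >= threshold)[0]
          return float(above[0]) * dt_ns if above.size else None

      def in_coincidence(t1_ns, t2_ns, window_ns=10.0):
          """Two-channel coincidence test within a fixed time window."""
          return t1_ns is not None and t2_ns is not None and abs(t1_ns - t2_ns) <= window_ns

      if __name__ == "__main__":
          t = np.arange(0.0, 200.0, 2.0)                 # 2 ns sampling step (assumed)
          pulse_a = np.exp(-((t - 60.0) ** 2) / 50.0)    # synthetic pulse peaking near 60 ns
          pulse_b = np.exp(-((t - 64.0) ** 2) / 50.0)    # synthetic pulse peaking near 64 ns
          ta = leading_edge_time(pulse_a, 0.5, 2.0)
          tb = leading_edge_time(pulse_b, 0.5, 2.0)
          print(ta, tb, in_coincidence(ta, tb))          # e.g. 56.0 60.0 True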

  6. Data acquisition electronics for gamma ray emission tomography using width-modulated leading-edge discriminators

    NASA Astrophysics Data System (ADS)

    Lage, E.; Tapias, G.; Villena, J.; Desco, M.; Vaquero, J. J.

    2010-08-01

    We present a new high-performance and low-cost approach for implementing radiation detection acquisition systems. The basic elements used are charge-integrating ADCs and a set of components encapsulated in an HDL (hardware description language) library which makes it possible to implement several acquisition tasks such as time pickoff and coincidence detection using a new and simple trigger technique that we name WMLET (width-modulated leading-edge timing). As proof of concept, a 32-channel hybrid PET/SPECT acquisition system based on these elements was developed and tested. This demonstrator consists of a master module responsible for the generation and distribution of trigger signals, 2 × 16-channel ADC cards (12-bit resolution) for data digitization and a 32-bit digital I/O PCI card for handling data transmission to a personal computer. System characteristics such as linearity, maximum transmission rates or timing resolution in coincidence mode were evaluated with test and real detector signals. Imaging capabilities of the prototype were also evaluated using different detector configurations. The performance tests showed that this implementation is able to handle data rates in excess of 600k events s-1 when acquiring simultaneously 32 channels (96-byte events). ADC channel linearity is >98.5% in energy quantification. Time resolution in PET mode for the tested configurations ranges from 3.64 ns FWHM to 7.88 ns FWHM when signals from LYSO-based detectors are used. The measured energy resolution matched the expected values for the detectors evaluated and single elements of crystal matrices can be neatly separated in the acquired flood histograms.

  7. The CMS Level-1 Trigger Barrel Track Finder

    NASA Astrophysics Data System (ADS)

    Ero, J.; Evangelou, I.; Flouris, G.; Foudas, C.; Guiducci, L.; Loukas, N.; Manthos, N.; Papadopoulos, I.; Paradas, E.; Sotiropoulos, S.; Sphicas, P.; Triossi, A.; Wulz, C.

    2016-03-01

    The design and performance of the upgraded CMS Level-1 Trigger Barrel Muon Track Finder (BMTF) is presented. Monte Carlo simulation data as well as cosmic ray data from a CMS muon detector slice test have been used to study in detail the performance of the new track finder. The design architecture is based on twelve MP7 cards each of which uses a Xilinx Virtex-7 FPGA and can receive and transmit data at 10 Gbps from 72 input and 72 output fibers. According to the CMS Trigger Upgrade TDR the BMTF receives trigger primitive data which are computed using both RPC and DT data and transmits data from a number of muon candidates to the upgraded Global Muon Trigger. Results from detailed studies of comparisons between the BMTF algorithm results and the results of a C++ emulator are also presented. The new BMTF will be commissioned for data taking in 2016.

  8. An experimental comparison of triggered and random pulse train uncertainties

    SciTech Connect

    Henzlova, Daniela; Menlove, Howard O; Swinhoe, Martyn T

    2010-01-01

    response) have used only one of the two analysis methods for the nuclear material assay. The aim of this study is to provide a systematic comparison of the precision of the measured S, D, T rates and {sup 240}Pu effective mass obtained using the above-mentioned pulse train sampling techniques. In order to perform this task, a LANL-developed list-mode-based data acquisition system is used, where the entire pulse train is recorded and subsequently analyzed. The list mode acquisition brings an essential advantage for this type of comparison, since the very same pulse train can be analyzed using signal-triggered as well as randomly triggered counting gates. The aim of this study is not only to compare the precision of signal-triggered versus randomly triggered sampling techniques, but also to investigate the influence of fast accidental sampling on the precision of signal-triggered results. In addition, the different random sampling techniques used in safeguards are investigated. For this purpose we implement two types of random sampling - non-overlapping gates (Feynman approach) and periodic overlapping gates (fast accidentals). In the following sections the equations utilized in the pulse train analysis are described, the experimental setup and measurement techniques are discussed, and finally the results are summarized and discussed.
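
    As a minimal illustration of the two sampling schemes compared above, the sketch below analyzes a toy list-mode pulse train with (a) signal-triggered gates opened after every pulse and (b) consecutive, non-overlapping periodic gates in the Feynman style. The gate width, predelay, and the uniformly generated pulse train are assumptions for illustration; this is not the LANL analysis code.

      import numpy as np

      def signal_triggered_counts(times, gate, predelay=0.0):
          """For each pulse at time t, count other pulses in (t + predelay, t + predelay + gate]."""
          times = np.sort(np.asarray(times))
          counts = []
          for t in times:
              lo, hi = t + predelay, t + predelay + gate
              counts.append(np.count_nonzero((times > lo) & (times <= hi)))
          return np.array(counts)

      def periodic_gate_counts(times, gate, t_max):
          """Count pulses in consecutive non-overlapping gates (Feynman-style random sampling)."""
          edges = np.arange(0.0, t_max + gate, gate)
          counts, _ = np.histogram(times, bins=edges)
          return counts

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          train = np.sort(rng.uniform(0.0, 1.0, 5000))   # toy pulse train: 5000 pulses in 1 s
          print(signal_triggered_counts(train, gate=64e-6).mean())
          print(periodic_gate_counts(train, gate=64e-6, t_max=1.0).mean())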

  9. Simulation of rockfalls triggered by earthquakes

    USGS Publications Warehouse

    Kobayashi, Y.; Harp, E.L.; Kagawa, T.

    1990-01-01

    A computer program to simulate the downslope movement of boulders in rolling or bouncing modes has been developed and applied to actual rockfalls triggered by the Mammoth Lakes, California, earthquake sequence in 1980 and the Central Idaho earthquake in 1983. In order to reproduce a movement mode where bouncing predominated, we introduced an artificial unevenness to the slope surface by adding a small random number to the interpolated value of the mid-points between the adjacent surveyed points. Three hundred simulations were computed for each site by changing the random number series, which determined distances and bouncing intervals. The movement of the boulders was, in general, rather erratic depending on the random numbers employed, and the results could not be seen as deterministic but stochastic. The closest agreement between calculated and actual movements was obtained at the site with the most detailed and accurate topographic measurements. © 1990 Springer-Verlag.
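
    The surface-roughening step described above (perturbing interpolated mid-points and repeating runs with different random number series) can be sketched as follows. The profile, perturbation amplitude, and 300-trial loop are illustrative assumptions, and the boulder dynamics themselves are only indicated by a placeholder comment.

      import numpy as np

      def roughened_profile(x_surveyed, z_surveyed, amplitude, rng):
          """Insert mid-points between surveyed points and perturb them by a small random amount."""
          x_mid = 0.5 * (x_surveyed[:-1] + x_surveyed[1:])
          z_mid = 0.5 * (z_surveyed[:-1] + z_surveyed[1:]) + rng.uniform(-amplitude, amplitude, x_mid.size)
          x = np.empty(x_surveyed.size + x_mid.size)
          z = np.empty_like(x)
          x[0::2], x[1::2] = x_surveyed, x_mid
          z[0::2], z[1::2] = z_surveyed, z_mid
          return x, z

      if __name__ == "__main__":
          x_s = np.linspace(0.0, 100.0, 21)              # surveyed horizontal positions (m), assumed
          z_s = 50.0 - 0.5 * x_s                         # idealized planar slope (m), assumed
          for seed in range(300):                        # one roughened surface per trial
              x, z = roughened_profile(x_s, z_s, amplitude=0.05, rng=np.random.default_rng(seed))
              # ... boulder trajectory integration over (x, z) would go here ...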

  10. Coring Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock core samples. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample in the transfer process.

  11. Acquisition of teleological descriptions

    NASA Astrophysics Data System (ADS)

    Franke, David W.

    1992-03-01

    Teleological descriptions capture the purpose of an entity, mechanism, or activity with which they are associated. These descriptions can be used in explanation, diagnosis, and design reuse. We describe a technique for acquiring teleological descriptions expressed in the teleology language TeD. Acquisition occurs during design by observing design modifications and design verification. We demonstrate the acquisition technique in an electronic circuit design.

  12. Assessment of image quality and radiation dose of prospectively ECG-triggered adaptive dual-source coronary computed tomography angiography (cCTA) with arrhythmia rejection algorithm in systole versus diastole: a retrospective cohort study.

    PubMed

    Lee, Ashley M; Beaudoin, Jonathan; Engel, Leif-Christopher; Sidhu, Manavjot S; Abbara, Suhny; Brady, Thomas J; Hoffmann, Udo; Ghoshhajra, Brian B

    2013-08-01

    In this study, we sought to evaluate the image quality and effective radiation dose of prospectively ECG-triggered adaptive systolic (PTA-systolic) dual-source CTA versus prospectively triggered adaptive diastolic (PTA-diastolic) dual-source CTA in patients of unselected heart rate and rhythm. This retrospective cohort study consisted of 41 PTA-systolic and 41 matched PTA-diastolic CTA patients who underwent clinically indicated 128-slice dual-source CTA between December 2010 and June 2012. Image quality and motion artifact score (both on a Likert scale 1-4 with 4 being the best), effective dose, and CTDIvol were compared. The effect of heart rate (HR) and heart rate variability (HRV) on image motion artifact score and CTDIvol was analyzed with Pearson's correlation coefficient. All 82 exams were considered diagnostic with 0 non-diagnostic segments. PTA-systolic CTA patients had a higher maximum HR, wider HRV, were less likely to be in sinus rhythm, and received less beta-blocker vs. PTA-diastolic CTA patients. No difference in effective dose was observed (PTA-systolic vs. PTA-diastolic CTA: 2.9 vs. 2.2 mSv, p = 0.26). Image quality score (3.3 vs. 3.5, p < 0.05) and motion artifact score (3.5 vs. 3.8, p < 0.05) were lower in PTA-systolic CTAs than in PTA-diastolic CTAs. For PTA-systolic CTAs, an increase in HR was not associated with a negative impact on motion artifact score nor CTDIvol. For PTA-diastolic CTA, an increase in HR was associated with increased motion artifacts and CTDIvol. HRV demonstrated no correlation with motion artifact and CTDIvol for both PTA-systolic and PTA-diastolic CTAs. In conclusion, both PTA-diastolic CTA and PTA-systolic CTA yielded diagnostic examinations at unselected heart rates and rhythms with similar effective radiation, but PTA-systolic CTA resulted in more consistent radiation exposure and image quality across a wide range of rates and rhythms. PMID:23526082

  13. High-definition computed tomography for coronary artery stents: image quality and radiation doses for low voltage (100 kVp) and standard voltage (120 kVp) ECG-triggered scanning.

    PubMed

    Lee, Ji Won; Kim, Chang Won; Lee, Han Cheol; Wu, Ming-Ting; Hwangbo, Lee; Choo, Ki Seok; Kim, June Hong; Lee, Ki-Nam; Kim, Jin You; Jeong, Yeon Joo

    2015-06-01

    The noninvasive assessment of coronary stents by coronary CT angiography (CCTA) is an attractive method. However, the radiation dose associated with CCTA remains a concern for patients. The purpose of this study is to compare the radiation doses and image qualities of CCTA performed using tube voltages of 100 or 120 kVp for the evaluation of coronary stents. After receiving institutional review board approval, 53 consecutive patients with previously implanted stents (101 stents) underwent 64-slice CCTA. Patients were divided into three different protocol groups, namely, prospective ECG triggering at 100 kVp, prospective ECG triggering at 120 kVp, or retrospective gating at 100 kVp. Two reviewers qualitatively scored the quality of the resulting images for coronary stents and determined levels of artificial lumen narrowing (ALN), stent lumen attenuation increase ratio (SAIR), image noise, and radiation dose parameters. No significant differences were found between the three protocol groups concerning qualitative image quality or SAIR. Coronary lumen attenuation and in-stent attenuation of 100 kVp prospective CCTA (P-CCTA) were higher than in the 120 kVp P-CCTA protocol (all Ps < 0.001). Mean ALN was significantly lower for 100 kVp P-CCTA than for 100 kVp retrospective CCTA (R-CCTA, P = 0.007). The mean effective radiation dose was significantly lower (P < 0.001) for 100 kVp P-CCTA (3.3 ± 0.4 mSv) than for the other two protocols (100 kVp R-CCTA 6.7 ± 1.0 mSv, 120 kVp P-CCTA 4.6 ± 1.2 mSv). We conclude that the use of 100 kVp P-CCTA can reduce radiation doses for patients while maintaining the imaging quality of 100 kVp R-CCTA and 120 kVp P-CCTA for the evaluation of coronary stents. PMID:26022439

  14. Digital self-triggered robust control of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Di Benedetto, M. D.; Di Gennaro, S.; D'Innocenzo, A.

    2013-09-01

    In this paper, we develop novel results on self-triggered control of nonlinear systems, subject to perturbations, and sensing/computation/actuation delays. First, considering an unperturbed nonlinear system with bounded delays, we provide conditions that guarantee the existence of a self-triggered control strategy stabilizing the closed-loop system. Then, considering parameter uncertainties, disturbances and bounded delays, we provide conditions guaranteeing the existence of a self-triggered strategy that keeps the state arbitrarily close to the equilibrium point. In both cases, we provide a methodology for the computation of the next execution time. We show on an example the relevant benefits obtained with this approach in terms of energy consumption with respect to control algorithms based on a constant sampling with a sensible reduction of the average sampling time.
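
    As one hedged illustration of how a next execution time can be computed, the sketch below forward-predicts a linear plant under a held control input until a relative state-error threshold would be violated, and uses that horizon as the inter-execution time. The plant matrices, gain K, threshold sigma, and the Euler prediction are assumptions chosen for illustration; they are not the conditions derived in the paper.

      import numpy as np

      # Generic self-triggered state-feedback loop for a linear plant dx/dt = A x + B u.
      A = np.array([[0.0, 1.0], [-2.0, -0.5]])
      B = np.array([[0.0], [1.0]])
      K = np.array([[1.0, 1.5]])          # assumed stabilizing gain (A - B K is Hurwitz here)

      def next_execution_time(x_k, u_k, sigma=0.2, dt=1e-3, tau_min=0.01, tau_max=0.5):
          """Predict the plant forward and return the time until the triggering condition fires."""
          x, tau = x_k.copy(), 0.0
          while tau < tau_max:
              x = x + dt * (A @ x + B @ u_k)            # Euler prediction with held input
              tau += dt
              if np.linalg.norm(x - x_k) > sigma * np.linalg.norm(x_k):
                  break
          return max(tau, tau_min)

      if __name__ == "__main__":
          x, t = np.array([1.0, 0.0]), 0.0
          for _ in range(20):
              u = -(K @ x)                              # control held constant until the next execution
              tau = next_execution_time(x, u)
              for _ in range(int(tau / 1e-3)):          # integrate the "true" plant over [t, t + tau)
                  x = x + 1e-3 * (A @ x + B @ u)
              t += tau
              print(f"t = {t:6.3f} s, inter-execution time = {tau:5.3f} s, ||x|| = {np.linalg.norm(x):.3f}")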

  15. The HyperCP data acquisition system

    SciTech Connect

    Kaplan, D.M.; E871 Collaboration

    1997-06-01

    For the HyperCP experiment at Fermilab, we have assembled a data acquisition system that records on up to 45 Exabyte 8505 tape drives in parallel at up to 17 MB/s. During the beam spill, data are acquired from the front-end digitization systems at {approx} 60 MB/s via five parallel data paths. The front-end systems achieve typical readout deadtime of {approx} 1 {micro}s per event, allowing operation at 75-kHz trigger rate with {approx_lt}30% deadtime. Event building and tapewriting are handled by 15 Motorola MVME167 processors in 5 VME crates.

  16. Multiprocessor data acquisition for NordBall

    NASA Astrophysics Data System (ADS)

    Jerrestam, Dan; Forycki, A.; Holm, A.; Høy-Christensen, P.; Jian Shen, T.

    1989-12-01

    For the NordBall multidetector system a versatile data acquisition system has been developed around the VME bus utilizing 68010 processors. The readout of the instrument is based on a generalized READER concept. READERs are CPU boards reading hardware in parallel for each event. In the FERA bus the final fast logical decision is made before an event is to be considered as being present for readout. Synchronization with the trigger for readout, coming from the FERA bus system, is performed by a special hardware unit. Synchronization on event level between the READERs is done by the same hardware unit monitored by a master CPU.

  17. The CDMS II data acquisition system

    SciTech Connect

    Bauer, D.A.; Burke, S.; Cooley, J.; Crisler, M.; Cushman, P.; DeJongh, F.; Duong, L.; Ferril, R.; Golwala, S.R.; Hall, J.; Holmgren, D.; /Fermilab /Texas A-M

    2011-01-01

    The Data Acquisition System for the CDMS II dark matter experiment was designed and built when the experiment moved to its new underground installation at the Soudan Lab. The combination of remote operation and increased data load necessitated a completely new design. Elements of the original LabView system remained as stand-alone diagnostic programs, but the main data processing moved to a VME-based system with custom electronics for signal conditioning, trigger formation and buffering. The data rate was increased 100-fold and the automated cryogenic system was linked to the data acquisition. A modular server framework with associated user interfaces was implemented in Java to allow control and monitoring of the entire experiment remotely.

  18. Anthropogenic Triggering of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2014-08-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor ``foreshocks'', since the induction may occur with a delay up to several years.
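
    A rough feel for why overpressures below 0.1 MPa matter can be obtained from the textbook Coulomb failure relation with pore pressure, in which a pressure increase ΔP reduces the effective normal stress clamping the fault: ΔCFS = Δτ − μ(Δσn − ΔP). The sketch below evaluates this for a few assumed overpressures; it is a standard approximation used here only for illustration, not the time-dependent Tresca-Von Mises poroelastic solution the abstract refers to.

      # Illustrative Coulomb-type calculation: a small pore-pressure increase on a fault
      # already close to failure reduces the effective normal stress and promotes slip.

      def coulomb_stress_change(d_shear_mpa, d_normal_mpa, d_pore_mpa, mu=0.6):
          """Change in Coulomb failure stress; positive values bring the fault closer to failure."""
          return d_shear_mpa - mu * (d_normal_mpa - d_pore_mpa)

      if __name__ == "__main__":
          # Fluid injection alone (no change in tectonic shear or total normal stress):
          for dp in (0.01, 0.05, 0.1):
              print(f"overpressure {dp:4.2f} MPa -> delta CFS = {coulomb_stress_change(0.0, 0.0, dp):+.3f} MPa")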

  19. Anthropogenic triggering of large earthquakes.

    PubMed

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor "foreshocks", since the induction may occur with a delay up to several years. PMID:25156190

  20. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay up to several years. PMID:25156190

  1. Slow earthquakes triggered by typhoons.

    PubMed

    Liu, ChiChing; Linde, Alan T; Sacks, I Selwyn

    2009-06-11

    The first reports on a slow earthquake were for an event in the Izu peninsula, Japan, on an intraplate, seismically active fault. Since then, many slow earthquakes have been detected. It has been suggested that the slow events may trigger ordinary earthquakes (in a context supported by numerical modelling), but their broader significance in terms of earthquake occurrence remains unclear. Triggering of earthquakes has received much attention: strain diffusion from large regional earthquakes has been shown to influence large earthquake activity, and earthquakes may be triggered during the passage of teleseismic waves, a phenomenon now recognized as being common. Here we show that, in eastern Taiwan, slow earthquakes can be triggered by typhoons. We model the largest of these earthquakes as repeated episodes of slow slip on a reverse fault just under land and dipping to the west; the characteristics of all events are sufficiently similar that they can be modelled with minor variations of the model parameters. Lower pressure results in a very small unclamping of the fault that must be close to the failure condition for the typhoon to act as a trigger. This area experiences very high compressional deformation but has a paucity of large earthquakes; repeating slow events may be segmenting the stressed area and thus inhibiting large earthquakes, which require a long, continuous seismic rupture. PMID:19516339

  2. Modeling event building architecture for the triggerless data acquisition system for PANDA experiment at the HESR facility at FAIR/GSI

    NASA Astrophysics Data System (ADS)

    Korcyl, K.; Konorov, I.; Kühn, W.; Schmitt, L.

    2012-12-01

    A novel architecture is being proposed for the data acquisition and trigger system of the PANDA experiment at the HESR facility at FAIR/GSI. The experiment will run without a hardware trigger signal, using timestamps to correlate detector data from a given time window. The broad physics program in combination with the high rate of 2 × 10⁷ interactions per second requires very selective filtering algorithms accessing information from many detectors. Therefore the effective filtering will happen later than in today's systems, i.e., after the event building. To assess that, the complete architecture will be built of two stages: the data concentrator stage providing event building and the rate reduction stage. For the former stage, which requires a throughput of 100 GB/s to perform event building, we propose two layers of ATCA crates filled with Compute Nodes - modules designed at IHEP and University of Giessen for trigger and data acquisition systems. Currently each board is equipped with 5 Virtex4 FX60 FPGAs and high bandwidth connectivity is provided by 8 front panel RocketIO ports and 12 backplane ports for the inter-module communication. We designed simplified models of the components of the architecture and, using the SystemC library as support for the discrete event simulations, demonstrate the expected throughput of the full-size system. We also show the impact of some architectural choices and key parameters on the architecture's performance.
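
    A minimal sketch of the timestamp-based event building described above is given below: time-ordered hits are grouped into an event whenever the gap to the previous hit exceeds a correlation window. The Hit layout and the 500 ns window are assumptions for illustration, and the real system distributes this work across the ATCA Compute Nodes rather than running in a single process.

      from dataclasses import dataclass

      @dataclass
      class Hit:
          timestamp_ns: float
          channel: int
          amplitude: float

      def build_events(hits, window_ns=500.0):
          """Group time-ordered hits into events: a new event starts whenever the gap
          to the previous hit exceeds the correlation window."""
          hits = sorted(hits, key=lambda h: h.timestamp_ns)
          events, current = [], []
          for hit in hits:
              if current and hit.timestamp_ns - current[-1].timestamp_ns > window_ns:
                  events.append(current)
                  current = []
              current.append(hit)
          if current:
              events.append(current)
          return events

      if __name__ == "__main__":
          hits = [Hit(10.0, 1, 0.7), Hit(120.0, 2, 0.4), Hit(5000.0, 1, 1.1), Hit(5200.0, 3, 0.2)]
          for i, event in enumerate(build_events(hits)):
              print(f"event {i}: channels {[h.channel for h in event]}")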

  3. WRATS Integrated Data Acquisition System

    NASA Technical Reports Server (NTRS)

    Piatak, David J.

    2008-01-01

    The Wing and Rotor Aeroelastic Test System (WRATS) data acquisition system (DAS) is a 64-channel data acquisition display and analysis system specifically designed for use with the WRATS 1/5-scale V-22 tiltrotor model of the Bell Osprey. It is the primary data acquisition system for experimental aeroelastic testing of the WRATS model for the purpose of characterizing the aeromechanical and aeroelastic stability of prototype tiltrotor configurations. The WRATS DAS was also used during aeroelastic testing of Bell Helicopter Textron's Quad-Tiltrotor (QTR) design concept, a test which received international attention. The LabVIEW-based design is portable and capable of powering and conditioning over 64 channels of dynamic data at sampling rates up to 1,000 Hz. The system includes a 60-second circular data archive, an integrated model swashplate excitation system, a moving block damping application for calculation of whirl flutter mode subcritical damping, a loads and safety monitor, a pilot-control console display, data analysis capabilities, and instrumentation calibration functions. Three networked computers running custom-designed LabVIEW software acquire data through National Instruments data acquisition hardware. The aeroelastic model (see figure) was tested with the DAS at two facilities at NASA Langley, the Transonic Dynamics Tunnel (TDT) and the Rotorcraft Hover Test Facility (RHTF). Because of the need for seamless transition between testing at these facilities, DAS is portable. The software is capable of harmonic analysis of periodic time history data, Fast Fourier Transform calculations, power spectral density calculations, and on-line calibration of test instrumentation. DAS has a circular buffer archive to ensure critical data is not lost in the event of a model failure/incident, as well as a sample-and-hold capability for phase-correct time history data.
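
    The 60-second circular data archive mentioned above can be sketched as a simple ring buffer that always retains the most recent samples and can be dumped after an incident. The class below is a minimal Python illustration with assumed sizes matching the figures quoted in the record (64 channels, 1,000 Hz, 60 s); the actual system is implemented in LabVIEW.

      import numpy as np

      class CircularArchive:
          """Fixed-length ring buffer that always holds the most recent samples."""

          def __init__(self, n_channels, sample_rate_hz, seconds):
              self.buffer = np.zeros((int(sample_rate_hz * seconds), n_channels))
              self.index = 0
              self.full = False

          def append(self, sample):
              self.buffer[self.index] = sample
              self.index = (self.index + 1) % len(self.buffer)
              if self.index == 0:
                  self.full = True

          def snapshot(self):
              """Return the archived samples in chronological order (e.g., after an incident)."""
              if not self.full:
                  return self.buffer[: self.index].copy()
              return np.vstack((self.buffer[self.index:], self.buffer[: self.index]))

      # Example: archive 60 s of 64-channel data sampled at 1000 Hz.
      archive = CircularArchive(n_channels=64, sample_rate_hz=1000, seconds=60)
      archive.append(np.random.randn(64))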

  4. Industrial accidents triggered by lightning.

    PubMed

    Renni, Elisabetta; Krausmann, Elisabeth; Cozzani, Valerio

    2010-12-15

    Natural disasters can cause major accidents in chemical facilities where they can lead to the release of hazardous materials which in turn can result in fires, explosions or toxic dispersion. Lightning strikes are the most frequent cause of major accidents triggered by natural events. In order to contribute towards the development of a quantitative approach for assessing lightning risk at industrial facilities, lightning-triggered accident case histories were retrieved from the major industrial accident databases and analysed to extract information on types of vulnerable equipment, failure dynamics and damage states, as well as on the final consequences of the event. The most vulnerable category of equipment is storage tanks. Lightning damage is incurred by immediate ignition, electrical and electronic systems failure or structural damage with subsequent release. Toxic releases and tank fires tend to be the most common scenarios associated with lightning strikes. Oil, diesel and gasoline are the substances most frequently released during lightning-triggered Natech accidents. PMID:20817399

  5. Inexpensive Data Acquisition with a Sound Card

    ERIC Educational Resources Information Center

    Hassan, Umer; Pervaiz, Saad; Anwar, Muhammad Sabieh

    2011-01-01

    Signal generators, oscilloscopes, and data acquisition (DAQ) systems are standard components of the modern experimental physics laboratory. The sound card, a built-in component in the ubiquitous personal computer, can be utilized for all three of these tasks and offers an attractive option for labs in developing countries such as…
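
    A minimal acquisition sketch in the spirit of the record above is shown below using the third-party python-sounddevice package; the package choice, sample rate, and duration are assumptions, since the abstract does not prescribe any particular software.

      # Record a short, single-channel trace from the default sound-card input.
      import numpy as np
      import sounddevice as sd

      SAMPLE_RATE = 44_100        # Hz, typical consumer sound-card rate (assumed)
      DURATION = 2.0              # seconds of data to acquire (assumed)

      def acquire(duration_s=DURATION, fs=SAMPLE_RATE):
          """Record one channel from the default input (line-in/microphone) and return it."""
          data = sd.rec(int(duration_s * fs), samplerate=fs, channels=1, dtype="float32")
          sd.wait()               # block until the acquisition is complete
          return data[:, 0]

      if __name__ == "__main__":
          samples = acquire()
          print(f"acquired {samples.size} samples, peak amplitude {np.max(np.abs(samples)):.3f}")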

  6. Know Your Smoking Triggers | Smokefree.gov

    Cancer.gov

    Triggers are the things that make you want to smoke. Different people have different triggers, like a stressful situation, sipping coffee, going to a party, or smelling cigarette smoke. Most triggers fall into one of these four categories: emotional, pattern, social, or withdrawal. Knowing your triggers and understanding the best way to deal with them is your first line of defense.

  7. Integral magnetic ignition pickup trigger

    SciTech Connect

    King, R.

    1992-10-27

    This patent describes a trigger system for the ignition system of an internal combustion engine having a crankcase with a rotatable crankshaft therein, and a flywheel on one end of the crankcase connected to an end of the crankshaft. It comprises: a nonferromagnetic disk-shaped hub for connection to the crankshaft and rotatable therewith on the end opposite the flywheel; and a stationary sensor mounted adjacent the hub for detecting impulses from the magnetically responsive elements as the hub rotates and utilizing the impulses to trigger the ignition system.

  8. Human target acquisition performance

    NASA Astrophysics Data System (ADS)

    Teaney, Brian P.; Du Bosq, Todd W.; Reynolds, Joseph P.; Thompson, Roger; Aghera, Sameer; Moyer, Steven K.; Flug, Eric; Espinola, Richard; Hixson, Jonathan

    2012-06-01

    The battlefield has shifted from armored vehicles to armed insurgents. Target acquisition (identification, recognition, and detection) range performance involving humans as targets is vital for modern warfare. The acquisition and neutralization of armed insurgents while at the same time minimizing fratricide and civilian casualties is a mounting concern. U.S. Army RDECOM CERDEC NVESD has conducted many experiments involving human targets for infrared and reflective band sensors. The target sets include human activities, hand-held objects, uniforms & armament, and other tactically relevant targets. This paper will define a set of standard task difficulty values for identification and recognition associated with human target acquisition performance.

  9. Interactive knowledge acquisition tools

    NASA Technical Reports Server (NTRS)

    Dudziak, Martin J.; Feinstein, Jerald L.

    1987-01-01

    The problems of designing practical tools to aid the knowledge engineer and general applications used in performing knowledge acquisition tasks are discussed. A particular approach was developed for the class of knowledge acquisition problem characterized by situations where acquisition and transformation of domain expertise are often bottlenecks in systems development. An explanation is given on how the tool and underlying software engineering principles can be extended to provide a flexible set of tools that allow the application specialist to build highly customized knowledge-based applications.

  10. Offline Processing in the Online Computer Farm

    NASA Astrophysics Data System (ADS)

    Cardoso, L. G.; Gaspar, C.; Callot, O.; Closier, J.; Neufeld, N.; Frank, M.; Jost, B.; Charpentier, P.; Liu, G.

    2012-12-01

    LHCb is one of the 4 experiments at the LHC accelerator at CERN. LHCb has approximately 1500 PCs for processing the High Level Trigger (HLT) during physics data acquisition. During periods when data acquisition is not required or the resources needed for data acquisition are reduced most of these PCs are idle or very little used. In these periods it is possible to profit from the unused processing capacity to reprocess earlier datasets with the newest applications (code and calibration constants), thus reducing the CPU capacity needed on the Grid. The offline computing environment is based on LHCbDIRAC (Distributed Infrastructure with Remote Agent Control) to process physics data on the Grid. In DIRAC, agents are started on Worker Nodes, pull available jobs from the DIRAC central WMS (Workload Management System) and process them on the available resources. A Control System was developed which is able to launch, control and monitor the agents for the offline data processing on the HLT Farm. It can do so without overwhelming the offline resources (e.g. DBs) and in case of change of the accelerator planning it can easily return the used resources for online purposes. This control system is based on the existing Online System Control infrastructure, the PVSS SCADA and the FSM toolkit.

  11. The ALICE data acquisition system

    NASA Astrophysics Data System (ADS)

    Carena, F.; Carena, W.; Chapeland, S.; Chibante Barroso, V.; Costa, F.; Dénes, E.; Divià, R.; Fuchs, U.; Grigore, A.; Kiss, T.; Simonetti, G.; Soós, C.; Telesca, A.; Vande Vyvre, P.; von Haller, B.

    2014-03-01

    In this paper we describe the design, the construction, the commissioning and the operation of the Data Acquisition (DAQ) and Experiment Control Systems (ECS) of the ALICE experiment at the CERN Large Hadron Collider (LHC). The DAQ and the ECS are the systems used respectively for the acquisition of all physics data and for the overall control of the experiment. They are two computing systems made of hundreds of PCs and data storage units interconnected via two networks. The collection of experimental data from the detectors is performed by several hundreds of high-speed optical links. We describe in detail the design considerations for these systems handling the extreme data throughput resulting from central lead ions collisions at LHC energy. The implementation of the resulting requirements into hardware (custom optical links and commercial computing equipment), infrastructure (racks, cooling, power distribution, control room), and software led to many innovative solutions which are described together with a presentation of all the major components of the systems, as currently realized. We also report on the performance achieved during the first period of data taking (from 2009 to 2013) often exceeding those specified in the DAQ Technical Design Report.

  12. A triggerless digital data acquisition system for nuclear decay experiments

    SciTech Connect

    Agramunt, J.; Tain, J. L.; Albiol, F.; Algora, A.; Estevez, E.; Giubrone, G.; Jordan, M. D.; Molina, F.; Rubio, B.; Valencia, E.

    2013-06-10

    In nuclear decay experiments an important goal of the Data Acquisition (DAQ) system is to allow the reconstruction of time correlations between signals registered in different detectors. Classically, DAQ systems are based on a trigger that starts the event acquisition, and all data related to the event of that trigger are collected as one compact structure. New technologies and electronics developments offer new possibilities to nuclear experiments with the use of sampling ADCs. This type of ADC is able to provide the pulse shape, height and a time stamp of the signal. This new feature (time stamp) allows new systems to run without an event trigger. Later, the event can be reconstructed using the time stamp information. In this work we present a new DAQ developed for {beta}-delayed neutron emission experiments. Due to the long moderation time of neutrons, we opted for a self-triggered DAQ based on commercial digitizers. With this DAQ a negligible acquisition dead time was achieved while keeping a maximum of event information and flexibility in time correlations.
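
    The offline time-correlation step described above can be sketched as a delayed-coincidence search: each beta timestamp is paired with the neutron timestamps that follow it within a correlation window covering the neutron moderation time. The window length and the microsecond timestamp format are illustrative assumptions.

      import bisect

      def delayed_coincidences(beta_times_us, neutron_times_us, window_us=200.0):
          """For each beta timestamp, collect neutron timestamps within the following window.

          Timestamps are in microseconds; the window length is an assumed value chosen
          to cover the neutron moderation time."""
          neutron_times_us = sorted(neutron_times_us)
          pairs = []
          for t_beta in sorted(beta_times_us):
              lo = bisect.bisect_right(neutron_times_us, t_beta)
              hi = bisect.bisect_right(neutron_times_us, t_beta + window_us)
              pairs.append((t_beta, neutron_times_us[lo:hi]))
          return pairs

      if __name__ == "__main__":
          betas = [10.0, 5000.0]
          neutrons = [55.0, 180.0, 5120.0, 9000.0]
          for t_beta, matched in delayed_coincidences(betas, neutrons):
              print(f"beta at {t_beta:7.1f} us -> neutrons at {matched}")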

  13. Rocket-triggered lightning strikes and forest fire ignition

    NASA Technical Reports Server (NTRS)

    Fenner, James

    1990-01-01

    The following are presented: (1) background information on the rocket-triggered lightning project at Kennedy Space Center (KSC); (2) a summary of the forecasting problem; (3) the facilities and equipment available for undertaking field experiments at KSC; (4) previous research activity performed; (5) a description of the atmospheric science field laboratory near Mosquito Lagoon on the KSC complex; (6) methods of data acquisition; and (7) present results. New sources of data for the 1990 field experiment include measuring the electric field in the lower few thousand feet of the atmosphere by suspending field measuring devices below a tethered balloon, and measuring the electric field intensity in clouds and in the atmosphere with aircraft. The latter program began in July of 1990. Also, future prospects for both triggered lightning and forest fire research at KSC are listed.

  14. Rocket-triggered lightning strikes and forest fire ignition

    NASA Technical Reports Server (NTRS)

    Fenner, James H.

    1989-01-01

    Background information on the rocket-triggered lightning project at Kennedy Space Center (KSC), a summary of the forecasting problem there, the facilities and equipment available for undertaking field experiments at KSC, previous research activity performed, a description of the atmospheric science field laboratory near Mosquito Lagoon on the KSC complex, methods of data acquisition, and present results are discussed. New sources of data for the 1989 field experiment include measuring the electric field in the lower few thousand feet of the atmosphere by suspending field measuring devices below a tethered balloon. Problems encountered during the 1989 field experiment are discussed. Future prospects for both triggered lightning and lightning-kindled forest fire research at KSC are listed.

  15. An 'Anomalous' Triggered Lightning Flash in Florida

    NASA Astrophysics Data System (ADS)

    Gamerota, W. R.; Uman, M. A.; Hill, J. D.; Pilkey, J. T.; Ngin, T.; Jordan, D. M.; Mata, C.; Mata, A.

    2012-12-01

    Classical (grounded wire) rocket-and-wire triggered lightning flashes whose leaders do not traverse the path of the wire remnants are sometimes referred to as 'anomalous'. We present high-speed video images captured at 10 kilo-frames per second (kfps), with supporting data, to characterize an 'anomalous' rocket-triggered lightning flash that occurred on 15 May 2012 at the International Center for Lightning Research and Testing (ICLRT) in north-central Florida. The event begins as a classical rocket-triggered lightning flash with an upward positive leader (UPL) initiating from the tip of the wire at a height of about 280 m above ground level. The top 259 m of the trailing wire explodes 2.7 s after the rocket exits the launch tube, while the bottom 17 m of the wire does not explode (does not become luminous). Approximately 1.4 ms after wire explosion, a stepped leader initiates a few meters above the top of the wire remnants and propagates downward, attaching to the top of a grounded utility pole 2.1 ms after initiation and 117 m southwest of the launching facility. Beginning 600 μs prior to this sustained stepped leader development, attempted stepped leaders (luminous steps emanating from the UPL channel above the wire remnants) are observed in three locations: 20 m and 5 m above the top of the wire remnants and at the top of the wire remnants. Correlated electric field derivative (dE/dt), channel-base current, and high-speed video captured at 300 kfps reveal an electrical discharge of peak current 365 A initiating from about 17 m above the launching facility, apparently the top of the unexploded triggering wire, when the stepped leader is no more than 60 m above ground level. There are significant differences between the 'anomalous' triggered lightning flash described here and those observed in New Mexico and in France in the late 1970s and early 1980s: First, the time duration between explosion of our wire and the sustained stepped leader development a few meters

  16. The central trigger control system of the CMS experiment at CERN

    NASA Astrophysics Data System (ADS)

    Taurok, A.; Arnold, B.; Bergauer, H.; Eichberger, M.; Erö, J.; Hartl, Ch; Jeitler, M.; Kastner, K.; Mikulec, I.; Neuherz, B.; Padrta, M.; Sakulin, H.; Strauss, J.; Wulz, C.-E.; Varela, J.; Smith, W. H.

    2011-03-01

    The Large Hadron Collider will deliver up to 32 million physics collisions per second. This rate is far too high to be processed by present-day computer farms, let alone stored on disk by the experiments for offline analysis. A fast selection of interesting events must therefore be made. In the CMS experiment, this is implemented in two stages: the Level-1 Trigger of the CMS experiment uses custom-made, fast electronics, while the experiment's high-level trigger is implemented in computer farms. The Level-1 Global Trigger electronics has to receive signals from the subdetector systems that enter the trigger (mostly from muon detectors and calorimeters), synchronize them, determine if a pre-set trigger condition is fulfilled, check if the various subsystems are ready to accept triggers based on information from the Trigger Throttling System and on calculations of possible dead-times, and finally distribute the trigger decision (``Level-1 Accept'') together with timing signals to the subdetectors over the so-called ``Trigger, Timing and Control'' distribution tree of the experiment. These functions are fulfilled by several specialized, custom-made VME modules, most of which are housed in one crate. The overall control is exerted by the central ``Trigger Control System'', which is described in this paper. It consists of one main module and several ancillary boards for input and output functions.
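
    The decision logic described above (combining the algorithm trigger bits with subsystem readiness from the Trigger Throttling System and a dead-time budget before issuing a Level-1 Accept) can be caricatured as follows. The function, its inputs, and the 5% dead-time cap are purely illustrative assumptions, not the CMS Trigger Control System's actual rules.

      def level1_accept(algorithm_bits, subsystems_ready, deadtime_fraction, max_deadtime=0.05):
          """Issue a Level-1 Accept only if some trigger algorithm fired, every subsystem
          reports ready (throttling), and the estimated dead-time budget is not exceeded.

          A purely illustrative decision function, not the CMS Global Trigger logic."""
          if deadtime_fraction > max_deadtime:
              return False
          if not all(subsystems_ready.values()):
              return False
          return any(algorithm_bits)

      if __name__ == "__main__":
          bits = [False, True, False]                               # e.g. one algorithm bit fired
          ready = {"tracker": True, "calorimeter": True, "muon": True}
          print(level1_accept(bits, ready, deadtime_fraction=0.01))   # True
          print(level1_accept(bits, ready, deadtime_fraction=0.10))   # False: throttled by dead time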

  17. A New Look at Trigger Point Injections

    PubMed Central

    Wong, Clara S. M.; Wong, Steven H. S.

    2012-01-01

    Trigger point injections are commonly practised pain interventional techniques. However, there is still lack of objective diagnostic criteria for trigger points. The mechanisms of action of trigger point injection remain obscure and its efficacy remains heterogeneous. The advent of ultrasound technology in the noninvasive real-time imaging of soft tissues sheds new light on visualization of trigger points, explaining the effect of trigger point injection by blockade of peripheral nerves, and minimizing the complications of blind injection. PMID:21969825

  18. Acquisition signal transmitter

    NASA Technical Reports Server (NTRS)

    Friedman, Morton L. (Inventor)

    1989-01-01

    An encoded information transmitter which transmits a radio frequency carrier that is amplitude modulated by a constant frequency waveform and thereafter amplitude modulated by a predetermined encoded waveform, the constant frequency waveform modulated carrier constituting an acquisition signal and the encoded waveform modulated carrier constituting an information bearing signal, the acquisition signal providing enhanced signal acquisition and interference rejection favoring the information bearing signal. One specific application for this transmitter is as a distress transmitter where a conventional, legislated audio tone modulated signal is transmitted followed first by the acquisition signal and then the information bearing signal, the information bearing signal being encoded with, among other things, vehicle identification data. The acquisition signal enables a receiver to acquire the information bearing signal where the received signal is low and/or where the received signal has a low signal-to-noise ratio in an environment where there are multiple signals in the same frequency band as the information bearing signal.

  19. High Speed data acquisition

    SciTech Connect

    Cooper, Peter S.

    1998-02-01

    A general introduction to high-speed data acquisition system techniques in modern particle physics experiments is given. Examples are drawn from the SELEX (E781) high-statistics charmed baryon production and decay experiment now taking data at Fermilab.

  20. Documentation and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel; Moseley, Warren

    1990-01-01

    Traditional approaches to knowledge acquisition have focused on interviews. An alternative focuses on the documentation associated with a domain. Adopting a documentation approach provides some advantages during familiarization. A knowledge management tool was constructed to gain these advantages.

  1. Rx for Acquisitions Hangups

    ERIC Educational Resources Information Center

    Huleatt, Richard S.

    1973-01-01

    A system of ordering library materials efficiently, quickly and at low cost is presented. The procedure bypasses purchasing departments and helps reduce acquisitions time by authorizing direct ordering by the library. Forms and procedures used are discussed. (1 reference) (DH)

  2. Development of the new trigger for VANDLE neutron detector

    NASA Astrophysics Data System (ADS)

    Hasse, Adam; Taylor, Steven; Daugherty, Hadyn; Grzywacz, Robert

    2014-09-01

    Beta-delayed neutron emission (βn) is the dominant decay channel for the majority of very neutron-rich nuclei. In order to study these decays, a new detector system called the Versatile Array of Neutron Detectors at Low Energy (VANDLE) was constructed. A critical part of this neutron time-of-flight detector is a trigger unit. This trigger is sensitive to electrons from beta decay down to very low energies, is insensitive to gamma rays, and has good timing performance, better than 1 ns. In order to satisfy these conditions, we have developed a new system which utilizes a plastic scintillator but uses a recently developed light readout technique based on the silicon photomultiplier, manufactured by SensL. The new system has been developed and performance tested using a digital data acquisition system at the University of Tennessee and will be utilized in future experiments involving VANDLE.

  3. Making Effective Use of Computer Technology.

    ERIC Educational Resources Information Center

    Ornstein, Allan C.

    1992-01-01

    Six computer applications in education are word processing, computer-assisted instruction, computer-aided design, computer authoring systems, computer data systems, and computer storage. Computers may assist students with three learning stages: acquisition, transformation, and evaluation of information. Advances in computer programing, software,…

  4. FOS Target Acquisition Test

    NASA Astrophysics Data System (ADS)

    Koratkar, Anuradha

    1994-01-01

    FOS onboard target acquisition software capabilities will be verified by this test -- point source binary, point source firmware, point source peak-up, wfpc2 assisted realtime, point source peak-down, taled assisted binary, taled assisted firmware, and nth star binary modes. The primary modes are tested 3 times to determine repeatability. This test is the only test that will verify mode-to-mode acquisition offsets. This test has to be conducted for both the RED and BLUE detectors.

  5. Environmental Triggers of Autoimmune Thyroiditis

    PubMed Central

    Burek, C. Lynne; Talor, Monica V.

    2009-01-01

    Autoimmune thyroiditis is among the most prevalent of all the autoimmunities. Autoimmune thyroiditis is multifactorial with contributions from genetic and environmental factors. Much information has been published about the genetic predisposition to autoimmune thyroiditis both in experimental animals and humans. There is, in contrast, very little data on environmental agents that can serve as the trigger for autoimmunity in a genetically predisposed host. The best-established environmental factor is excess dietary iodine. Increased iodine consumption is strongly implicated as a trigger for thyroiditis, but only in genetically susceptible individuals. However, excess iodine is not the only environmental agent implicated as a trigger leading to autoimmune thyroiditis. There are a wide variety of other synthetic chemicals that affect the thyroid gland or have the ability to promote immune dysfunction in the host. These chemicals are released into the environment by design, such as in pesticides, or as a by-product of industry. Candidate pollutants include polyaromatic hydrocarbons, polybrominated biphenols, and polychlorinated biphenols, among others. Infections are also reputed to trigger autoimmunity and may act alone or in concert with environmental chemicals. We have utilized a unique animal model, the NOD.H2h4 mouse to explore the influence of iodine and other environmental factors on autoimmune thyroiditis. PMID:19818584

  6. Environmental triggers of autoimmune thyroiditis.

    PubMed

    Burek, C Lynne; Talor, Monica V

    2009-01-01

    Autoimmune thyroiditis is among the most prevalent of all the autoimmunities. Autoimmune thyroiditis is multifactorial with contributions from genetic and environmental factors. Much information has been published about the genetic predisposition to autoimmune thyroiditis both in experimental animals and humans. There is, in contrast, very little data on environmental agents that can serve as the trigger for autoimmunity in a genetically predisposed host. The best-established environmental factor is excess dietary iodine. Increased iodine consumption is strongly implicated as a trigger for thyroiditis, but only in genetically susceptible individuals. However, excess iodine is not the only environmental agent implicated as a trigger leading to autoimmune thyroiditis. There are a wide variety of other synthetic chemicals that affect the thyroid gland or have the ability to promote immune dysfunction in the host. These chemicals are released into the environment by design, such as in pesticides, or as a by-product of industry. Candidate pollutants include polyaromatic hydrocarbons, polybrominated biphenols, and polychlorinated biphenols, among others. Infections are also reputed to trigger autoimmunity and may act alone or in concert with environmental chemicals. We have utilized a unique animal model, the NOD.H2(h4) mouse to explore the influence of iodine and other environmental factors on autoimmune thyroiditis. PMID:19818584

  7. Suicide Triggers Described by Herodotus

    PubMed Central

    Auchincloss, Stephane; Ahmadi, Jamshid

    2016-01-01

    Objective: The aim of this study was to better understand the triggers of suicide, particularly among the ancient Greek and Persian soldiers and commanders. Method: ‘Herodotus: The Histories’ is a history of the rulers and soldiery who participated in the Greco-Persian wars (492-449 BCE). A new translation (2013) of this manuscript was studied. Accounts of suicide were collected and collated, with descriptions of circumstances, methods, and probable triggers. Results: Nine accounts of suicide were identified. Eight of these were named individuals (4 Greeks and 4 Persians), of whom seven were male. Only one (not the female) appeared to act in response to a mental disorder. Other triggers of suicide included guilt, avoidance of dishonour/punishment and altruism. Cutting/stabbing was the most common method; others included hanging, jumping, poison, and burning (the single female). Conclusion: While soldiers at a time of war do not reflect the general community, they are nevertheless members of their society. Thus, this evidence demonstrates that suicide triggered by burdensome circumstances (in addition to mental disorder) was known to the Greek and Persian people more than two millennia ago. PMID:27437010

  8. Host defenses trigger salmonella's arsenal.

    PubMed

    Keestra, A Marijke; Bäumler, Andreas J

    2011-03-17

    Salmonella survives in macrophages by using a molecular syringe to deliver proteins into the host-cell cytosol where they manipulate phagocyte physiology. Arpaia and colleagues (Arpaia et al., 2011) show that deployment of this virulence factor is triggered by the very responses that are intended to confer host resistance. PMID:21402352

  9. Multiple channel data acquisition system

    DOEpatents

    Crawley, H. Bert; Rosenberg, Eli I.; Meyer, W. Thomas; Gorbics, Mark S.; Thomas, William D.; McKay, Roy L.; Homer, Jr., John F.

    1990-05-22

    A multiple channel data acquisition system for the transfer of large amounts of data from a multiplicity of data channels has a plurality of modules which operate in parallel to convert analog signals to digital data and transfer that data to a communications host via a FASTBUS. Each module has a plurality of submodules which include a front end buffer (FEB) connected to input circuitry having an analog to digital converter with cache memory for each of a plurality of channels. The submodules are interfaced with the FASTBUS via a FASTBUS coupler which controls a module bus and a module memory. The system is triggered to effect rapid parallel data samplings which are stored to the cache memories. The cache memories are uploaded to the FEBs during which zero suppression occurs. The data in the FEBs is reformatted and compressed by a local processor during transfer to the module memory. The FASTBUS coupler is used by the communications host to upload the compressed and formatted data from the module memory. The local processor executes programs which are downloaded to the module memory through the FASTBUS coupler.
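
    A minimal sketch (in Python, not from the patent) of the zero-suppression step described above, assuming digitized samples are held per channel and that values at or below a pedestal-plus-threshold level are dropped during the upload from the cache memories to the front end buffer; the pedestal and threshold values are illustrative:

      # Hedged sketch: zero suppression during the cache-to-FEB upload.
      # The pedestal and threshold values are illustrative assumptions.

      def zero_suppress(samples, pedestal, threshold=3):
          """Keep only (index, value) pairs that exceed pedestal + threshold."""
          return [(i, v) for i, v in enumerate(samples) if v > pedestal + threshold]

      # Example: one triggered sampling of a single channel (raw ADC counts).
      cache = [2, 3, 2, 15, 42, 37, 8, 3, 2, 2]
      feb_data = zero_suppress(cache, pedestal=2)
      print(feb_data)   # [(3, 15), (4, 42), (5, 37), (6, 8)]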

  10. Multiple channel data acquisition system

    DOEpatents

    Crawley, H.B.; Rosenberg, E.I.; Meyer, W.T.; Gorbics, M.S.; Thomas, W.D.; McKay, R.L.; Homer, J.F. Jr.

    1990-05-22

    A multiple channel data acquisition system for the transfer of large amounts of data from a multiplicity of data channels has a plurality of modules which operate in parallel to convert analog signals to digital data and transfer that data to a communications host via a FASTBUS. Each module has a plurality of submodules which include a front end buffer (FEB) connected to input circuitry having an analog to digital converter with cache memory for each of a plurality of channels. The submodules are interfaced with the FASTBUS via a FASTBUS coupler which controls a module bus and a module memory. The system is triggered to effect rapid parallel data samplings which are stored to the cache memories. The cache memories are uploaded to the FEBs during which zero suppression occurs. The data in the FEBs is reformatted and compressed by a local processor during transfer to the module memory. The FASTBUS coupler is used by the communications host to upload the compressed and formatted data from the module memory. The local processor executes programs which are downloaded to the module memory through the FASTBUS coupler. 25 figs.

  11. Trigger probe for determining the orientation of the power distribution of an electron beam

    DOEpatents

    Elmer, John W.; Palmer, Todd A.; Teruya, Alan T.

    2007-07-17

    The present invention relates to a probe for determining the orientation of electron beams being profiled. To accurately time the location of an electron beam, the probe is designed to accept electrons from only a narrowly defined area. The signal produced from the probe is then used as a timing or triggering fiducial for an operably coupled data acquisition system. Such an arrangement eliminates changes in slit geometry, an additional signal feedthrough in the wall of a welding chamber and a second timing or triggering channel on a data acquisition system. As a result, the present invention improves the accuracy of the resulting data by minimizing the adverse effects of current slit triggering methods so as to accurately reconstruct electron or ion beams.

  12. [The electrical conductivity of triggered lightning channel].

    PubMed

    Zhang, Hua-ming; Yuan, Ping; Su, Mao-gen; Lü, Shi-hua

    2007-10-01

    Spectra of return strokes of artificially triggered lightning were obtained with an optical multi-channel analyzer (OMA) in the Shandong region. Compared with previous spectra of natural lightning, additional lines of ArI 602.5 nm and ArII 666.5 nm were observed. Under the assumption of local thermodynamic equilibrium, electron temperatures of the lightning channel plasma were obtained from the relative line intensities. Meanwhile, with a semi-empirical method, the electron density was obtained from the Stark broadening of the Hα line. In combination with plasma theory, the electrical conductivity of the lightning channel has been calculated for the first time, and the characteristics of the conductivity of the lightning channel were also discussed. The relation between the electrical conductivity of the channel and the return stroke current was analyzed, providing reference data for further work on computing the return stroke current. Results show that the lightning channel is a good conductor, and electrons are the main carriers of the channel current. The brightness of an artificially triggered lightning channel is usually higher than that of natural lightning, and its current is smaller than that of natural lightning. PMID:18306764
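
    The abstract states the method without formulas; for reference, a standard Boltzmann-plot relation of the kind used to extract an electron temperature from relative line intensities under local thermodynamic equilibrium (a textbook expression, not quoted from the paper) is

      \ln\!\left(\frac{I_{ki}\,\lambda_{ki}}{g_k A_{ki}}\right) = -\frac{E_k}{k_B T_e} + \mathrm{const},

    where I_{ki} is the measured intensity of a line at wavelength λ_{ki}, A_{ki} the transition probability, and g_k and E_k the statistical weight and energy of the upper level; a straight-line fit of the left-hand side against E_k yields T_e from the slope.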

  13. A real-time digital control, data acquisition and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    NASA Astrophysics Data System (ADS)

    Greenfield, C. M.; Campbell, G. L.; Carlstrom, T. N.; Deboo, J. C.; Hsieh, C.-L.; Snider, R. T.; Trost, P. K.

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to 8 Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a 'burst mode', where all available (fully charged) lasers can be fired at 50 to 100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (Fast Encoding and Readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 msec following each laser pulse. The VME-based computer system consists of 2 or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet.

  14. A real-time digital control, data acquisition and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    SciTech Connect

    Greenfield, C.M.; Campbell, G.L.; Carlstrom, T.N.; DeBoo, J.C.; Hsieh, C.-L.; Snider, R.T.; Trost, P.K.

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to 8 Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a "burst mode", where all available (fully charged) lasers can be fired at 50-100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (Fast Encoding and Readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 msec following each laser pulse. The VME-based computer system consists of 2 or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet. 17 refs., 1 fig.

  15. Real-time digital control, data acquisition, and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    NASA Astrophysics Data System (ADS)

    Greenfield, C. M.; Campbell, G. L.; Carlstrom, T. N.; DeBoo, J. C.; Hsieh, C.-L.; Snider, R. T.; Trost, P. K.

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition, and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to eight Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a ``burst mode,'' where all available (fully charged) lasers can be fired at 50-100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (fast encoding and readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 ms following each laser pulse. The VME-based computer system consists of two or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet.
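
    As a rough illustration of the staggered-pulse timing described above (a sketch, not the DIII-D control code; the burst duration used below is an arbitrary choice), eight 20 Hz lasers fired with evenly staggered offsets give one pulse every 6.25 ms:

      # Hedged sketch: evenly staggered firing times for N lasers, each limited
      # to an average rate of laser_rate_hz. All values are illustrative.

      def staggered_schedule(n_lasers=8, laser_rate_hz=20.0, duration_s=0.5):
          period = 1.0 / laser_rate_hz      # period of each individual laser
          offset = period / n_lasers        # stagger between consecutive lasers
          schedule = []
          t = 0.0
          while t < duration_s:
              for laser in range(n_lasers):
                  schedule.append((t + laser * offset, laser))
              t += period
          return sorted(schedule)

      for fire_time, laser in staggered_schedule()[:5]:
          print(f"t = {fire_time * 1e3:7.2f} ms  laser {laser}")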

  16. Real-time digital control, data acquisition, and analysis system for the DIII-D multipulse Thomson scattering diagnostic

    SciTech Connect

    Greenfield, C.M.; Campbell, G.L.; Carlstrom, T.N.; DeBoo, J.C.; Hsieh, C.; Snider, R.T.; Trost, P.K.

    1990-10-01

    A VME-based real-time computer system for laser control, data acquisition, and analysis for the DIII-D multipulse Thomson scattering diagnostic is described. The laser control task requires precise timing of up to eight Nd:YAG lasers, each with an average firing rate of 20 Hz. A cpu module in a real-time multiprocessing computer system will operate the lasers with evenly staggered laser pulses or in a "burst mode," where all available (fully charged) lasers can be fired at 50-100 μs intervals upon receipt of an external event trigger signal. One or more cpu modules, along with a LeCroy FERA (fast encoding and readout ADC) system, will perform real-time data acquisition and analysis. Partial electron temperature and density profiles will be available for plasma feedback control within 1 ms following each laser pulse. The VME-based computer system consists of two or more target processor modules (25 MHz Motorola 68030) running the VMEexec real-time operating system connected to a Unix-based host system (also a 68030). All real-time software is fully interrupt driven to maximize system efficiency. Operator interaction and (non-real-time) data analysis takes place on a MicroVAX 3400 connected via DECnet.

  17. Strategic petroleum reserve data acquisition system

    SciTech Connect

    Merillat, P D; Bauer, A G

    1980-10-01

    The Strategic Petroleum Reserve Data Acquisition System is a general purpose, digital data acquisition system designed for field use in the DOE's Strategic Petroleum Reserve testing and monitoring program. The system is computer driven, under the control of an operator. The system is designed to allow the operator to perform pre-test system configuration; test monitoring and control; and post-test analysis. This document is a system description and an operator's manual. Topics covered include: configuration and running of on-line tests, software documentation, and maintenance programming information.

  18. Synchronization trigger control system for flow visualization

    NASA Technical Reports Server (NTRS)

    Chun, K. S.

    1987-01-01

    The use of cinematography or holographic interferometry for dynamic flow visualization in an internal combustion engine requires a control device that globally synchronizes camera and light source timing at a predefined shaft encoder angle. The device is capable of 0.35 deg resolution for rotational speeds of up to 73 240 rpm. This was achieved by implementing the shaft encoder signal addressed look-up table (LUT) and appropriate latches. The developed digital signal processing technique achieves 25 nsec of high speed triggering angle detection by using direct parallel bit comparison of the shaft encoder digital code with a simulated angle reference code, instead of using angle value comparison which involves more complicated computation steps. In order to establish synchronization to an AC reference signal whose magnitude is variant with the rotating speed, a dynamic peak followup synchronization technique has been devised. This method scrutinizes the reference signal and provides the right timing within 40 nsec. Two application examples are described.
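
    As a rough illustration of the bit-comparison idea described above (a sketch, not the published design; the encoder width, trigger angle, and code values are assumptions, though a 10-bit encoder happens to reproduce the roughly 0.35 deg resolution quoted in the abstract):

      # Hedged sketch: instead of converting the shaft-encoder word to an angle
      # and comparing values, the raw encoder code is compared directly against
      # a precomputed reference code, as a single parallel bit comparison would.

      ENCODER_BITS = 10                     # assumed encoder width
      COUNTS_PER_REV = 1 << ENCODER_BITS    # 1024 codes per revolution

      def angle_to_code(angle_deg):
          """Simulated reference code for a given trigger angle."""
          return int(round(angle_deg / 360.0 * COUNTS_PER_REV)) % COUNTS_PER_REV

      TRIGGER_CODE = angle_to_code(123.75)  # precomputed once, stored in a LUT

      def trigger(encoder_code):
          """Fire when the raw encoder code matches the reference exactly."""
          return encoder_code == TRIGGER_CODE

      for code in (350, 351, 352, 353):
          print(code, trigger(code))        # only code 352 fires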

  19. The CDF silicon vertex trigger

    SciTech Connect

    B. Ashmanskas; A. Barchiesi; A. Bardi

    2003-06-23

    The CDF experiment's Silicon Vertex Trigger is a system of 150 custom 9U VME boards that reconstructs axial tracks in the CDF silicon strip detector in a 15 μs pipeline. SVT's 35 μm impact parameter resolution enables CDF's Level 2 trigger to distinguish primary and secondary particles, and hence to collect large samples of hadronic bottom and charm decays. We review some of SVT's key design features. Speed is achieved with custom VLSI pattern recognition, linearized track fitting, pipelining, and parallel processing. Testing and reliability are aided by built-in logic state analysis and test-data sourcing at each board's input and output, a common inter-board data link, and a universal "Merger" board for data fan-in/fan-out. Speed and adaptability are enhanced by use of modern FPGAs.
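
    A generic illustration of the linearized track fitting named above (a sketch under made-up constants, not the SVT's actual constants): track parameters are obtained from hit coordinates by a single precomputed linear transform, avoiding any iterative fit.

      # Hedged sketch of linearized track fitting: parameters = C @ hits + q,
      # with C and q precomputed offline per detector region. The matrices
      # below are illustrative assumptions.
      import numpy as np

      C = np.array([[ 0.8, -0.1,  0.3],     # assumed constants for one region
                    [-0.2,  0.5,  0.4]])    # (2 track parameters, 3 hits)
      q = np.array([0.05, -0.02])

      def fit_track(hits):
          """Return track parameters as a linear function of the hit coordinates."""
          return C @ np.asarray(hits) + q

      print(fit_track([1.2, 0.9, 1.1]))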

  20. Method for triggering an action

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.; Moon, Justin; Koehler, Roger O.

    2006-10-17

    A method for triggering an action of at least one downhole device on a downhole network integrated into a downhole tool string synchronized to an event comprises determining latency, sending a latency adjusted signal, and performing the action. The latency is determined between a control device and the at least one downhole device. The latency adjusted signal for triggering an action is sent to the downhole device. The action is performed downhole synchronized to the event. A preferred method for determining latency comprises the steps: a control device sends a first signal to the downhole device; after receiving the signal, the downhole device sends a response signal to the control device; and the control device analyzes the time from sending the signal to receiving the response signal.
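
    A minimal sketch of the latency-compensation idea in the claim above (the send/receive callables are placeholders, not the patented downhole network protocol):

      # Hedged sketch: estimate one-way latency from a round trip, then send the
      # trigger early so the downhole action lands on the desired event time.
      import time

      def measure_latency(send, receive):
          """One-way latency estimated as half of the measured round trip."""
          t0 = time.monotonic()
          send("ping")
          receive()                         # wait for the downhole response
          return (time.monotonic() - t0) / 2.0

      def send_latency_adjusted_trigger(send, latency, event_time):
          """Send the trigger 'latency' seconds before the event time."""
          send_time = event_time - latency
          time.sleep(max(0.0, send_time - time.monotonic()))
          send("trigger")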

  1. The L3 energy trigger

    NASA Astrophysics Data System (ADS)

    Bizzarri, R.; Cesaroni, F.; Gentile, S.; Lunadei, G.; Fukushima, M.; Herten, G.; Hebbeker, T.

    1989-11-01

    The L3 first-level energy trigger is based on energy measurements in electromagnetic and hadronic calorimeters and in luminosity monitors. The information from these detectors is evaluated and a decision is taken in about 20 μs (the time between two bunch crossings in LEP is 22 μs). This trigger makes use of 300 CAMAC modules: an arithmetic logic unit (ALU), a BUS multiplexer (BS), a memory lookup table (MLU), a data stack (DS) and a fast encoding and readout ADC (FERA), each of them performing dedicated functions. The data are transmitted via front-panel ECL buses. The CAMAC data-way is used only for initialization and checking purposes. The system operates synchronously with a period of 60 ns.

  2. A high speed buffer for LV data acquisition

    NASA Technical Reports Server (NTRS)

    Cavone, Angelo A.; Sterlina, Patrick S.; Clemmons, James I., Jr.; Meyers, James F.

    1987-01-01

    The laser velocimeter (autocovariance) buffer interface is a data acquisition subsystem designed specifically for the acquisition of data from a laser velocimeter. The subsystem acquires data from up to six laser velocimeter components in parallel, measures the times between successive data points for each of the components, establishes and maintains a coincident condition between any two or three components, and acquires data from other instrumentation systems simultaneously with the laser velocimeter data points. The subsystem is designed to control the entire data acquisition process based on initial setup parameters obtained from a host computer and to be independent of the computer during the acquisition. On completion of the acquisition cycle, the interface transfers the contents of its memory to the host under direction of the host via a single 16-bit parallel DMA channel.
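
    A minimal sketch of a two-component coincidence check in the spirit of the condition described above (the window width is an illustrative assumption, not a value from the paper):

      # Hedged sketch: pair up arrival times from two components that fall
      # within a coincidence window; both input lists are assumed sorted.

      COINCIDENCE_WINDOW = 5e-6   # seconds, assumed

      def coincident_pairs(times_a, times_b, window=COINCIDENCE_WINDOW):
          """Return (ta, tb) pairs whose arrival times differ by less than window."""
          pairs, j = [], 0
          for ta in times_a:
              while j < len(times_b) and times_b[j] < ta - window:
                  j += 1
              if j < len(times_b) and abs(times_b[j] - ta) < window:
                  pairs.append((ta, times_b[j]))
          return pairs

      print(coincident_pairs([0.001000, 0.002000], [0.001002, 0.002010]))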

  3. Fault tolerant issues in the BTeV trigger

    SciTech Connect

    Jeffrey A. Appel et al.

    2002-12-03

    The BTeV trigger performs sophisticated computations using large ensembles of FPGAs, DSPs, and conventional microprocessors. This system will have between 5,000 and 10,000 computing elements and many networks and data switches. While much attention has been devoted to developing efficient algorithms, the need for fault-tolerant, fault-adaptive, and flexible techniques and software to manage this huge computing platform has been identified as one of the most challenging aspects of this project. They describe the problem and offer an approach to solving it based on a distributed, hierarchical fault management system.

  4. Optical Spectra of Triggered Lightning

    NASA Astrophysics Data System (ADS)

    Walker, T. D.; Biagi, C. J.; Hill, J. D.; Jordan, D. M.; Uman, M. A.; Christian, H. J., Jr.

    2009-12-01

    In August 2009, the first optical spectra of triggered lightning flashes were acquired. Data from two triggered lightning flashes were obtained at the International Center for Lightning Research and Testing in north-central Florida. The spectrometer that was used has an average dispersion of 260 Å/mm resulting in an average resolution of 5 Å when mated to a Photron (SA1.1) high-speed camera. The spectra captured with this system had a free spectral range of 3800-8000 Å. The spectra were captured at 300,000 frames per second. The spectrometer's vertical field of view was 3 m at an altitude 50 m above the launch tower, intended to view the middle of the triggering wire. Preliminary results show that the copper spectrum dominated the earliest part of the flash and copper lines persisted during the total lifetime of the detectable spectrum. Animations over the lifetime of the stroke from the initial wire illumination to multiple return strokes show the evolution of the spectrum. In addition, coordinated high speed channel base current, electric field and imagery measurements of the exploding wire, downward leaders, and return strokes were recorded. Quantitative analysis of the spectral evolution will be discussed in the context of the overall flash development.

  5. Spectrum acquisition of detonation based on CMOS

    NASA Astrophysics Data System (ADS)

    Li, Yan; Bai, Yonglin; Wang, Bo; Liu, Baiyu; Xue, Yingdong; Zhang, Wei; Gou, Yongsheng; Bai, Xiaohong; Qin, Junjun; Xian, Ouyang

    2010-10-01

    The detection of high-speed dynamic spectra is the main method to acquire transient information. In order to obtain the large amount of spectral data in real time during the process of detonation, a CMOS-based system for high-speed spectrum data acquisition is designed. The hardware platform of the system is based on an FPGA, and the unique characteristic of CMOS image sensors in the rolling shutter mode is exploited. Using the FPGA as the master control chip of the system not only provides the timing sequence for the CIS, but also controls the storage and transmission of the spectral data. In the spectral data acquisition experiment, the acquired information is transmitted to the host computer through the CameraLink bus. The dynamic spectral curve is obtained after subsequent processing. The experimental results demonstrate that this system is feasible for the acquisition and storage of high-speed dynamic spectrum information during the process of detonation.

  6. Implicit learning and acquisition of music.

    PubMed

    Rohrmeier, Martin; Rebuschat, Patrick

    2012-10-01

    Implicit learning is a core process for the acquisition of a complex, rule-based environment from mere interaction, such as motor action, skill acquisition, or language. A body of evidence suggests that implicit knowledge governs music acquisition and perception in nonmusicians and musicians, and that both expert and nonexpert participants acquire complex melodic, harmonic, and other features from mere exposure. While current findings and computational modeling largely support the learning of chunks, some results indicate learning of more complex structures. Despite the body of evidence, more research is required to support the cross-cultural validity of implicit learning and to show that core and more complex music theoretical features are acquired implicitly. PMID:23060126

  7. The trigger system for the external target experiment in the HIRFL cooling storage ring

    NASA Astrophysics Data System (ADS)

    Li, Min; Zhao, Lei; Liu, Jin-Xin; Lu, Yi-Ming; Liu, Shu-Bin; An, Qi

    2016-08-01

    A trigger system was designed for the external target experiment in the Cooling Storage Ring (CSR) of the Heavy Ion Research Facility in Lanzhou (HIRFL). Considering that different detectors are scattered over a large area, the trigger system is designed based on a master-slave structure and fiber-based serial data transmission technique. The trigger logic is organized in hierarchies, and flexible reconfiguration of the trigger function is achieved based on command register access or overall field-programmable gate array (FPGA) logic on-line reconfiguration controlled by remote computers. We also conducted tests to confirm the function of the trigger electronics, and the results indicate that this trigger system works well. Supported by the National Natural Science Foundation of China (11079003), the Knowledge Innovation Program of the Chinese Academy of Sciences (KJCX2-YW-N27), and the CAS Center for Excellence in Particle Physics (CCEPP).

  8. Comparison of existing and proposed HEP (High Energy Physics) data acquisition systems and their suitability for RHIC

    SciTech Connect

    Sunier, J.W.

    1987-01-01

    In this note, a summary of data acquisition systems is presented for the High Energy Physics collider facilities. Particular emphasis is made on the data acquisition stages and trigger rates. The suitability of these systems for a relativistic heavy ion collider calorimeter detector with ports is then discussed. 6 refs., 8 figs., 1 tab. (LSP)

  9. Design of a hardware track finder (Fast Tracker) for the ATLAS trigger

    NASA Astrophysics Data System (ADS)

    Cavaliere, V.; Adelman, J.; Albicocco, P.; Alison, J.; Ancu, L. S.; Anderson, J.; Andari, N.; Andreani, A.; Andreazza, A.; Annovi, A.; Antonelli, M.; Asbah, N.; Atkinson, M.; Baines, J.; Barberio, E.; Beccherle, R.; Beretta, M.; Bertolucci, F.; Biesuz, N. V.; Blair, R.; Bogdan, M.; Boveia, A.; Britzger, D.; Bryant, P.; Burghgrave, B.; Calderini, G.; Camplani, A.; Cavasinni, V.; Chakraborty, D.; Chang, P.; Cheng, Y.; Citraro, S.; Citterio, M.; Crescioli, F.; Dawe, N.; Dell'Orso, M.; Donati, S.; Dondero, P.; Drake, G.; Gadomski, S.; Gatta, M.; Gentsos, C.; Giannetti, P.; Gkaitatzis, S.; Gramling, J.; Howarth, J. W.; Iizawa, T.; Ilic, N.; Jiang, Z.; Kaji, T.; Kasten, M.; Kawaguchi, Y.; Kim, Y. K.; Kimura, N.; Klimkovich, T.; Kolb, M.; Kordas, K.; Krizka, K.; Kubota, T.; Lanza, A.; Li, H. L.; Liberali, V.; Lisovyi, M.; Liu, L.; Love, J.; Luciano, P.; Luongo, C.; Magalotti, D.; Maznas, I.; Meroni, C.; Mitani, T.; Nasimi, H.; Negri, A.; Neroutsos, P.; Neubauer, M.; Nikolaidis, S.; Okumura, Y.; Pandini, C.; Petridou, C.; Piendibene, M.; Proudfoot, J.; Rados, P.; Roda, C.; Rossi, E.; Sakurai, Y.; Sampsonidis, D.; Saxon, J.; Schmitt, S.; Schoening, A.; Shochet, M.; Shojaii, S.; Soltveit, H.; Sotiropoulou, C. L.; Stabile, A.; Swiatlowski, M.; Tang, F.; Taylor, P. T.; Testa, M.; Tompkins, L.; Vercesi, V.; Volpi, G.; Wang, R.; Watari, R.; Webster, J.; Wu, X.; Yorita, K.; Yurkewicz, A.; Zeng, J. C.; Zhang, J.; Zou, R.

    2016-02-01

    The use of tracking information at the trigger level in the LHC Run II period is crucial for the trigger and data acquisition system, and will be even more so as the number of simultaneous collisions occurring at every bunch crossing increases in Run III. The Fast TracKer (FTK) is part of the ATLAS trigger upgrade project; it is a hardware processor that will provide, for every Level-1 accepted event (at 100 kHz) and within 100 μs, full tracking information for tracks with momentum as low as 1 GeV. By providing fast, extensive access to tracking information, with resolution comparable to the offline reconstruction, FTK will help in precise detection of the primary and secondary vertices to ensure robust selections and improve the trigger performance.

  10. NIR daylight acquisition sensor improves mission capabilities

    NASA Astrophysics Data System (ADS)

    Chesser, Douglas E.; Vunck, Darius; Born, Terry; Axelson, Wayne; Rehder, Karl; Medrano, Robert S.

    2003-08-01

    The US Air Force Maui Space Surveillance System includes a 1.6 meter telescope located at the summit of Haleakala. This telescope has long played a key role in Space Object Identification (SOI) and other scientific research projects. The unique configuration of the 1.6m telescope and its suite of instruments make it ideally suited for high speed, extra-atmospheric satellite and missile tracking. However, because of the uniquely designed narrow field of the 1.6m telescope, acquisition of daytime objects presents a challenge. In the past, the 1.6 meter system relied primarily on offsite radar handoffs to provide FOV object placement. This reliance on radar-based handoffs increased system operational complexity and decreased system reliability. Recognizing the value of improving mission operational availability and success, the US Air Force Research Laboratory and contractor Boeing worked together to design a low cost system to improve the wide-field acquisition of daylight objects. This instrument, known as the Daylight Acquisition Sensor (DAS), was developed using a COTS NIR camera with custom NIR optics assemblies controlled through an integrated COTS embedded computer interface. The implemented design is a modification of the existing 0.56 meter nighttime-only acquisition telescope, which, because of the new NIR imaging sensor, is now capable of both daytime and nighttime acquisition. The system has been in operation for over 1 year and has significantly improved the acquisition capabilities of the 1.6m telescope while at the same time greatly reducing dependency on radar handoff. This paper discusses the design of the NIR Daylight Acquisition Sensor and some of the results from missions it has supported.

  11. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Commercial computer software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  12. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Noncommercial computer software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  13. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Commercial computer software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  14. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Noncommercial computer software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  15. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Noncommercial computer software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  16. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Commercial computer software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  17. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Commercial computer software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  18. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Noncommercial computer software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  19. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Commercial computer software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  20. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Noncommercial computer software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation...

  1. Acquisition of Comparison Constructions

    ERIC Educational Resources Information Center

    Hohaus, Vera; Tiemann, Sonja; Beck, Sigrid

    2014-01-01

    This article presents a study on the time course of the acquisition of comparison constructions. The order in which comparison constructions (comparatives, measure phrases, superlatives, degree questions, etc.) show up in English- and German-learning children's spontaneous speech is quite fixed. It is shown to be insufficiently determined by…

  2. High Speed data acquisition

    SciTech Connect

    Cooper, P.S.

    1998-02-01

    A general introduction to high-speed data acquisition system techniques in modern particle physics experiments is given. Examples are drawn from the SELEX (E781) high statistics charmed baryon production and decay experiment now taking data at Fermilab. © 1998 American Institute of Physics.

  3. Image Acquisition Context

    PubMed Central

    Bidgood, W. Dean; Bray, Bruce; Brown, Nicolas; Mori, Angelo Rossi; Spackman, Kent A.; Golichowski, Alan; Jones, Robert H.; Korman, Louis; Dove, Brent; Hildebrand, Lloyd; Berg, Michael

    1999-01-01

    Objective: To support clinically relevant indexing of biomedical images and image-related information based on the attributes of image acquisition procedures and the judgments (observations) expressed by observers in the process of image interpretation. Design: The authors introduce the notion of “image acquisition context,” the set of attributes that describe image acquisition procedures, and present a standards-based strategy for utilizing the attributes of image acquisition context as indexing and retrieval keys for digital image libraries. Methods: The authors' indexing strategy is based on an interdependent message/terminology architecture that combines the Digital Imaging and Communication in Medicine (DICOM) standard, the SNOMED (Systematized Nomenclature of Human and Veterinary Medicine) vocabulary, and the SNOMED DICOM microglossary. The SNOMED DICOM microglossary provides context-dependent mapping of terminology to DICOM data elements. Results: The capability of embedding standard coded descriptors in DICOM image headers and image-interpretation reports improves the potential for selective retrieval of image-related information. This favorably affects information management in digital libraries. PMID:9925229

  4. Data Acquisition Backend

    SciTech Connect

    Britton Jr., Charles L.; Ezell, N. Dianne Bull; Roberts, Michael

    2013-10-01

    This document is intended to summarize the development and testing of the data acquisition module portion of the Johnson Noise Thermometry (JNT) system developed at ORNL. The proposed system has been presented in an earlier report [1]. A more extensive project background including the project rationale is available in the initial project report [2].

  5. [Acquisition of arithmetic knowledge].

    PubMed

    Fayol, Michel

    2008-01-01

    The focus of this paper is on contemporary research on the number, counting, and arithmetical competencies that emerge during infancy, the preschool years, and the elementary school years. I provide a brief overview of the evolution of children's conceptual knowledge of arithmetic, the acquisition and use of counting, and how they solve simple arithmetic problems (e.g. 4 + 3). PMID:18198117

  6. Acquisitions List No. 42.

    ERIC Educational Resources Information Center

    Planned Parenthood--World Population, New York, NY. Katherine Dexter McCormick Library.

    The "Acquisitions List" of demographic books and articles is issued every two months by the Katharine Dexter McCormick Library. Divided into two parts, the first contains a list of books most recently acquired by the Library, each one annotated and also marked with the Library call number. The second part consists of a list of annotated articles,…

  7. Acquisitions List No. 43.

    ERIC Educational Resources Information Center

    Planned Parenthood--World Population, New York, NY. Katherine Dexter McCormick Library.

    The "Acquisitions List" of demographic books and articles is issued every two months by the Katharine Dexter McCormick Library. Divided into two parts, the first contains a list of books most recently acquired by the Library, each one annotated and also marked with the Library call number. The second part consists of a list of annotated articles,…

  8. Following Native Language Acquisition.

    ERIC Educational Resources Information Center

    Neiburg, Michael S.

    Native language acquisition is a natural and non-natural stage-by-stage process. The natural first stage is development of speech and listening skills. In this stage, competency is gained in the home environment. The next, non-natural stage is development of literacy, a cultural skill taught in school. Since oral-aural native language development…

  9. Telecommunications and data acquisition

    NASA Technical Reports Server (NTRS)

    Renzetti, N. A. (Editor)

    1981-01-01

    Deep Space Network progress in flight project support, tracking and data acquisition research and technology, network engineering, hardware and software implementation, and operations is reported. In addition, developments in Earth based radio technology as applied to geodynamics, astrophysics, and the radio search for extraterrestrial intelligence are reported.

  10. A 16 channel discriminator VME board with enhanced triggering capabilities

    NASA Astrophysics Data System (ADS)

    Borsato, E.; Garfagnini, A.; Menon, G.

    2012-08-01

    Electronics and data acquisition systems used in small and large scale laboratories often have to handle analog signals with varying polarity, amplitude and duration which have to be digitized to be used as trigger signals to validate the acquired data. In the specific case of experiments dealing with ionizing radiation, ancillary particle detectors (for instance plastic scintillators or Resistive Plate Chambers) are used to trigger and select the impinging particles for the experiment. A novel approach using commercial LVDS line receivers as discriminator devices is presented. Such devices, with a proper calibration, can handle positive and negative analog signals in a wide dynamic range (from 20 mV to 800 mV signal amplitude). The clear advantages, with respect to conventional discriminator devices, are reduced costs, the high reliability of a mature technology and the possibility of a high integration scale. Moreover, commercial discriminator boards with positive input signals and a wide threshold swing are not available on the market. The present paper describes the design and characterization of a VME board capable of handling 16 differential or single-ended input channels. The output digital signals, available independently for each input, can be combined in the board into three independent trigger logic units which provide additional outputs for the end user.

  11. A versatile digital camera trigger for telescopes in the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Schwanke, U.; Shayduk, M.; Sulanke, K.-H.; Vorobiov, S.; Wischnewski, R.

    2015-05-01

    This paper describes the concept of an FPGA-based digital camera trigger for imaging atmospheric Cherenkov telescopes, developed for the future Cherenkov Telescope Array (CTA). The proposed camera trigger is designed to select images initiated by the Cherenkov emission of extended air showers from very-high energy (VHE, E > 20 GeV) photons and charged particles while suppressing signatures from background light. The trigger comprises three stages. A first stage employs programmable discriminators to digitize the signals arriving from the camera channels (pixels). At the second stage, a grid of low-cost FPGAs is used to process the digitized signals for camera regions with 37 pixels. At the third stage, trigger conditions found independently in any of the overlapping 37-pixel regions are combined into a global camera trigger by few central FPGAs. Trigger prototype boards based on Xilinx FPGAs have been designed, built and tested and were shown to function properly. Using these components a full camera trigger with a power consumption and price per channel of about 0.5 W and 19 €, respectively, can be built. With the described design the camera trigger algorithm can take advantage of pixel information in both the space and the time domain allowing, for example, the creation of triggers sensitive to the time-gradient of a shower image; the time information could also be exploited to online adjust the time window of the acquisition system for pixel data. Combining the results of the parallel execution of different trigger algorithms (optimized, for example, for the lowest and highest energies, respectively) on each FPGA can result in a better response over all photons energies (as demonstrated by Monte Carlo simulation in this work).
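
    A minimal sketch of the three-stage logic described above (the multiplicity threshold is an illustrative assumption; the real trigger also exploits timing information and runs in FPGAs): discriminated pixel signals are examined per 37-pixel region, and any region satisfying the condition raises the global camera trigger.

      # Hedged sketch: stage 2 = per-region multiplicity condition on
      # discriminated (binary) pixels, stage 3 = global OR over all regions.

      def regional_trigger(region_bits, multiplicity=3):
          """Fire if enough discriminated pixels in the region are above threshold."""
          return sum(region_bits) >= multiplicity

      def camera_trigger(regions):
          """Global camera trigger: OR over all overlapping 37-pixel regions."""
          return any(regional_trigger(r) for r in regions)

      regions = [[0] * 37, [0] * 34 + [1, 1, 1]]   # one region with 3 hit pixels
      print(camera_trigger(regions))               # True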

  12. The first implementation of respiratory triggered 4DCBCT on a linear accelerator

    NASA Astrophysics Data System (ADS)

    O’Brien, Ricky T.; Cooper, Benjamin J.; Shieh, Chun-Chien; Stankovic, Uros; Keall, Paul J.; Sonke, Jan-Jakob

    2016-05-01

    Four dimensional cone beam computed tomography (4DCBCT) is an image guidance strategy used for patient positioning in radiotherapy. In conventional implementations of 4DCBCT, a constant gantry speed and a constant projection pulse rate are used. Unfortunately, this leads to higher imaging doses than are necessary because a large number of redundant projections are acquired. In theoretical studies, we have previously demonstrated that by suppressing redundant projections the imaging dose can be reduced by 40–50% for a majority of patients with little reduction in image quality. The aim of this study was to experimentally realise the projection suppression technique, which we have called Respiratory Triggered 4DCBCT (RT-4DCBCT). A real-time control system was developed that takes the respiratory signal as input and computes whether to acquire, or suppress, the next projection trigger during 4DCBCT acquisition. The CIRS dynamic thorax phantom was programmed with a 2 cm peak-to-peak motion and periods ranging from 2 to 8 s. Image quality was assessed by computing the edge response width of a 3 cm imaging insert placed in the phantom as well as the signal to noise ratio of the phantom's tissue and the contrast to noise ratio between the phantom's lung and tissue. The standard deviation in the superior–inferior direction of the 3 cm imaging insert was used to assess intra-phase bin displacement variations with a higher standard deviation implying more motion blur. The 4DCBCT imaging dose was reduced by 8.6%, 41%, 54%, 70% and 77% for patients with 2, 3, 4, 6 and 8 s breathing periods respectively when compared to conventional 4DCBCT. The standard deviation of the intra-phase bin displacement variation of the 3 cm imaging insert was reduced by between 13% and 43% indicating a more consistent position for the projections within respiratory phases. For the 4 s breathing period, the edge response width was reduced by 39% (0.8 mm) with only a 6–7% decrease in
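
    A minimal sketch of the acquire/suppress decision described above (not the authors' control code; the number of phase bins and the per-bin target are illustrative assumptions): the respiratory signal is binned into phases, and the next projection trigger is allowed only while that phase bin still needs projections.

      # Hedged sketch: suppress the projection trigger once a phase bin has
      # collected its target number of projections. All values are illustrative.

      N_BINS = 10
      TARGET_PER_BIN = 120
      counts = [0] * N_BINS

      def phase_bin(respiratory_phase):
          """Map a respiratory phase in [0, 1) to one of N_BINS bins."""
          return min(int(respiratory_phase * N_BINS), N_BINS - 1)

      def allow_projection(respiratory_phase):
          """Return True to trigger the next projection, False to suppress it."""
          b = phase_bin(respiratory_phase)
          if counts[b] >= TARGET_PER_BIN:
              return False        # this phase already has enough projections
          counts[b] += 1
          return True

      print(allow_projection(0.23))   # True until bin 2 reaches its target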

  13. The first implementation of respiratory triggered 4DCBCT on a linear accelerator.

    PubMed

    O'Brien, Ricky T; Cooper, Benjamin J; Shieh, Chun-Chien; Stankovic, Uros; Keall, Paul J; Sonke, Jan-Jakob

    2016-05-01

    Four dimensional cone beam computed tomography (4DCBCT) is an image guidance strategy used for patient positioning in radiotherapy. In conventional implementations of 4DCBCT, a constant gantry speed and a constant projection pulse rate are used. Unfortunately, this leads to higher imaging doses than are necessary because a large number of redundant projections are acquired. In theoretical studies, we have previously demonstrated that by suppressing redundant projections the imaging dose can be reduced by 40-50% for a majority of patients with little reduction in image quality. The aim of this study was to experimentally realise the projection suppression technique, which we have called Respiratory Triggered 4DCBCT (RT-4DCBCT). A real-time control system was developed that takes the respiratory signal as input and computes whether to acquire, or suppress, the next projection trigger during 4DCBCT acquisition. The CIRS dynamic thorax phantom was programmed with a 2 cm peak-to-peak motion and periods ranging from 2 to 8 s. Image quality was assessed by computing the edge response width of a 3 cm imaging insert placed in the phantom as well as the signal to noise ratio of the phantom's tissue and the contrast to noise ratio between the phantom's lung and tissue. The standard deviation in the superior-inferior direction of the 3 cm imaging insert was used to assess intra-phase bin displacement variations with a higher standard deviation implying more motion blur. The 4DCBCT imaging dose was reduced by 8.6%, 41%, 54%, 70% and 77% for patients with 2, 3, 4, 6 and 8 s breathing periods respectively when compared to conventional 4DCBCT. The standard deviation of the intra-phase bin displacement variation of the 3 cm imaging insert was reduced by between 13% and 43% indicating a more consistent position for the projections within respiratory phases. For the 4 s breathing period, the edge response width was reduced by 39% (0.8 mm) with only a 6-7% decrease in the

  14. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally acquisitions initiatives of the NASA STI program were presented.

  15. Laser-triggered vacuum switch

    DOEpatents

    Brannon, Paul J.; Cowgill, Donald F.

    1990-01-01

    A laser-triggered vacuum switch has a material such as an alkali metal halide on the cathode electrode for thermally activated field emission of electrons and ions upon interaction with a laser beam, the material being in contact with the cathode with a surface facing the discharge gap. The material is preferably a mixture of KCl and Ti powders. The laser may either shine directly on the material, preferably through a hole in the anode, or be directed to the material over a fiber optic cable.

  16. Star formation and its triggers

    NASA Astrophysics Data System (ADS)

    Combes, F.

    2016-06-01

    The relation between star formation and gas density appears linear for galaxies on the main sequence, and when the molecular gas is considered. However, the star formation efficiency (SFE) defined as the ratio of SFR to gas surface densities, can be much higher when SF is triggered by a dynamical process such as galaxy interaction or mergers, or even secular evolution and cold gas accretion. I review recent work showing how the SFE can vary as a function of morphological type, environment, or redshift. Physical processes able to explain positive and negative feedback from supernovae or AGN are discussed.

  17. Using the CMS High Level Trigger as a Cloud Resource

    NASA Astrophysics Data System (ADS)

    Colling, David; Huffman, Adam; McCrae, Alison; Lahiff, Andrew; Grandi, Claudio; Cinquilli, Mattia; Gowdy, Stephen; Coarasa, Jose Antonio; Tiradani, Anthony; Ozga, Wojciech; Chaze, Olivier; Sgaravatto, Massimo; Bauer, Daniela

    2014-06-01

    The CMS High Level Trigger is a compute farm of more than 10,000 cores. During data taking this resource is heavily used and is an integral part of the experiment's triggering system. However, outside of data taking periods this resource is largely unused. We describe why CMS wants to use the HLT as a cloud resource (outside of data taking periods) and how this has been achieved. In doing this we have turned a single-use cluster into an agile resource for CMS production computing. While we are able to use the HLT as a production cloud resource, there is still considerable further work that CMS needs to carry out before this resource can be used with the desired agility. This report, therefore, represents a snapshot of this activity at the time of CHEP 2013.

  18. Triggering for charm, beauty, and truth

    SciTech Connect

    Appel, J.A.

    1982-02-01

    As the search for more and more rare processes accelerates, the need for more and more effective event triggers also accelerates. In the earliest experiments, a simple coincidence often sufficed not only as the event trigger, but as the complete record of an event of interest. In today's experiments, not only has the fast trigger become more sophisticated, but one or more additional levels of trigger processing precede writing event data to magnetic tape for later analysis. Further search experiments will certainly require further expansion in the number of trigger levels required to filter those rare events of particular interest.

  19. The Database Driven ATLAS Trigger Configuration System

    NASA Astrophysics Data System (ADS)

    Chavez, Carlos; Gianelli, Michele; Martyniuk, Alex; Stelzer, Joerg; Stockton, Mark; Vazquez, Will

    2015-12-01

    The ATLAS trigger configuration system uses a centrally provided relational database to store the configurations for all levels of the ATLAS trigger system. The configuration used at any point during data taking is maintained in this database. An interface to this database is provided by the TriggerTool, a Java-based graphical user interface. The TriggerTool has been designed to work as both a convenient browser and editor of configurations in the database for both general users and experts. The updates to the trigger system necessitated by the upgrades and changes in both hardware and software during the first long shutdown of the LHC will be explored.

  20. Triggering on electrons, jets and tau leptons with the CMS upgraded calorimeter trigger for the LHC RUN II

    NASA Astrophysics Data System (ADS)

    Zabi, A.; Beaudette, F.; Cadamuro, L.; Mastrolorenzo, L.; Romanteau, T.; Sauvan, J. B.; Strebler, T.; Marrouche, J.; Wardle, N.; Aggleton, R.; Ball, F.; Brooke, J.; Newbold, D.; Paramesvaran, S.; Smith, D.; Baber, M.; Bundock, A.; Citron, M.; Elwood, A.; Hall, G.; Iles, G.; Laner, C.; Penning, B.; Rose, A.; Tapper, A.; Durkin, T.; Harder, K.; Harper, S.; Shepherd-Themistocleous, C.; Thea, A.; Williams, T.

    2016-02-01

    The Compact Muon Solenoid (CMS) experiment has implemented a sophisticated two-level online selection system that achieves a rejection factor of nearly 10^5. During Run II, the LHC will increase its centre-of-mass energy up to 13 TeV and progressively reach an instantaneous luminosity of 2 × 10^34 cm^-2 s^-1. In order to guarantee a successful and ambitious physics programme under this intense environment, the CMS Trigger and Data Acquisition (DAQ) system has been upgraded. A novel concept for the L1 calorimeter trigger is introduced: the Time Multiplexed Trigger (TMT). In this design, nine main processors each receive all of the calorimeter data from an entire event, provided by 18 preprocessors. This design is similar to that of the CMS DAQ and HLT systems. The advantage of the TMT architecture is that a global view and the full granularity of the calorimeters can be exploited by sophisticated algorithms. The goal is to maintain the current thresholds for calorimeter objects and improve the performance of their selection. The performance of these algorithms will be demonstrated, both in terms of efficiency and rate reduction. The challenging aspects of pile-up mitigation and firmware design will be presented.
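
    A minimal sketch of the time-multiplexed routing described above (round-robin assignment is an assumption used purely for illustration): each event is sent in its entirety to one main processor in turn, so that processor sees the full calorimeter for the events it receives.

      # Hedged sketch: whole-event, round-robin routing to the main processors.

      N_MAIN_PROCESSORS = 9

      def destination(event_number):
          """Assign an entire event to one of the main processors."""
          return event_number % N_MAIN_PROCESSORS

      print([destination(n) for n in range(12)])   # 0..8, then 0, 1, 2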

  1. Knowledge-Acquisition Tool For Expert System

    NASA Technical Reports Server (NTRS)

    Disbrow, James D.; Duke, Eugene L.; Regenie, Victoria A.

    1988-01-01

    Digital flight-control systems are monitored by a computer program that evaluates and recommends. Flight-systems engineers for advanced, high-performance aircraft use the knowledge-acquisition tool for an expert-system flight-status monitor supplying interpretative data. The interpretative function is especially important in time-critical, high-stress situations because it facilitates problem identification and corrective strategy. Conditions are evaluated and recommendations made by ground-based engineers having essential knowledge for analysis and monitoring of performances of advanced aircraft systems.

  2. The CTIO CCD-TV acquisition camera

    NASA Astrophysics Data System (ADS)

    Walker, Alistair R.; Schmidt, Ricardo

    A prototype CCD-TV camera has been built at CTIO, conceptually similar to the cameras in use at Lick Observatory. A GEC CCD is used as the detector, cooled thermo-electrically to -45 °C. Pictures are displayed via an IBM PC clone computer and an ITI image display board. Results of tests at the CTIO telescopes are discussed, including comparisons with the RCA ISIT cameras used at present for acquisition and guiding.

  3. 33. Perimeter acquisition radar building room #320, perimeter acquisition radar ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    33. Perimeter acquisition radar building room #320, perimeter acquisition radar operations center (PAROC), contains the tactical command and control group equipment required to control the par site. Showing spacetrack monitor console - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  4. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other...

  5. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other...

  6. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other...

  7. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other...

  8. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other...

  9. Computational Evaluation of the Traceback Method

    ERIC Educational Resources Information Center

    Kol, Sheli; Nir, Bracha; Wintner, Shuly

    2014-01-01

    Several models of language acquisition have emerged in recent years that rely on computational algorithms for simulation and evaluation. Computational models are formal and precise, and can thus provide mathematically well-motivated insights into the process of language acquisition. Such models are amenable to robust computational evaluation,…

  10. Landslides triggered by the earthquake

    SciTech Connect

    Harp, E.L.; Keefer, D.K.

    1990-01-01

    The May 2 earthquake triggered landslides numbering in the thousands. Most numerous were rockfalls and rockslides that occurred mainly on slopes steeper than 60° within sandstone, siltstone, and shale units of Upper Cretaceous and Tertiary strata. Soil falls from cutbank slopes along streams were also numerous. Seven slumps in natural slopes were triggered, and minor liquefaction-induced lateral-spread failures occurred along Los Gatos Creek. Rockfalls and rockslides occurred as far as 34 km northwest, 15 km south, and 26 km southwest of the epicenter. There were few slope failures to the east of the epicenter, owing to the absence of steep slopes in that direction. Throughout the area affected, rockfalls and rockslides were concentrated on southwest-facing slopes; the failures on slopes facing in the southwest quadrant accounted for as much as 93% of all failures in some areas. Rockfalls and rockslides from ridge crests were predominantly from sandstone units. Along steeply incised canyons, however, failures in shale and siltstone units were also common. Small rockslides and soil slides occurred from cut slopes above oil-well pump pads in the oil fields; slumps were common in the outer parts of steep fill slopes of the pump pads. The distribution of seismically induced landslides throughout the entire earthquake-affected area was mapped from true-color airphotos taken on May 3, 1985.

  11. Membrane-triggered plant immunity

    PubMed Central

    Jung, Su-Jin; Lee, Hong Gil; Seo, Pil Joon

    2014-01-01

    Plants have evolved sophisticated defense mechanisms to resist pathogen invasion. Upon the pathogen recognition, the host plants activate a variety of signal transduction pathways, and one of representative defense responses is systemic acquired resistance (SAR) that provides strong immunity against secondary infections in systemic tissues. Accumulating evidence has demonstrated that modulation of membrane composition contributes to establishing SAR and disease resistance in Arabidopsis, but underlying molecular mechanisms remain to be elucidated. Here, we show that a membrane-bound transcription factor (MTF) is associated with plant responses to pathogen attack. The MTF is responsive to microbe-associated molecular pattern (MAMP)-triggered membrane rigidification at the levels of transcription and proteolytic processing. The processed nuclear transcription factor possibly regulates pathogen resistance by directly regulating PATHOGENESIS-RELATED (PR) genes. Taken together, our results suggest that pathogenic microorganisms trigger changes in physico-chemical properties of cellular membrane in plants, and the MTF conveys the membrane information to the nucleus to ensure prompt establishment of plant immunity. PMID:25763708

  12. X1 UV Laser Trigger System

    SciTech Connect

    Brickeen, B.K.; Morelli, G.L.; Paiva, R.A.; Powell, C.A.; Sundvold, P.D.

    1999-01-26

    The X1 accelerator project at Sandia National Laboratory/New Mexico utilizes SF6 insulated, multi-stage, UV laser triggered gas switches. A 265 nm UV laser system was designed and built to generate eight simultaneous output pulses of 10 mJ each with a 13 nsec pulse width. A 1061 nm solid-state Nd:Cr:GSGG laser was frequency quadrupled using a two-stage doubling process. The 1061 nm fundamental laser energy was frequency doubled with a KTP crystal to 530 nm, achieving 65% conversion efficiency. The 530 nm output was frequency doubled with KD*P crystal to 265 nm, achieving conversion efficiency of 31%. The 265 nm beam pulse was split into eight parallel channels with a system of partially reflecting mirrors. Low timing jitter and stable energy output were achieved. The entire optical system was packaged into a rugged, o-ring sealed, aluminum structure 10''x19''x2.75''. The size of the electronics was 12''x8''x8''. Subsequent accelerator system requirements dictated a redesign of the triggering system for an output beam with less angular divergence. An unstable, crossed porro prism resonator was designed and incorporated into the system. The beam divergence of the redesigned system was successfully decreased to 0.97 mrad in the UV. The resulting frequency doubling efficiencies were 55% to 530 nm and 25% to 265 nm. The optical output remained at 10 mJ in each channel with an 11 nsec pulse width.

  13. Phonological acquisition of a Korean child: An acoustic study

    NASA Astrophysics Data System (ADS)

    Jun, Sun-Ah

    2005-09-01

    Studies on child phonology suggest that there exist phonological universals in the timing of phonological events and the ordering of phonological categories, but the acquisition of speech sounds is influenced by the language-specific aspects of the ambient language such as phonetics, phonology, and the frequency of the sound in child-directed speech. This study investigates a Korean child's phonological acquisition based on tape recordings of longitudinal data (from 2 months to 2 years, recorded in 1- to 2-week intervals). Special attention is given to the change in prosody and the acquisition of the Korean three-way manner contrast (fortis, aspirated, lenis). It is known that Korean fortis and aspirated obstruents trigger high pitch at vowel onset while lenis obstruents trigger low pitch [Jun (1993), (1998)]. Preliminary results suggest that fortis obstruents are acquired first, followed by aspirated, and then lenis. The segmental properties (e.g., voice onset time, breathy phonation) appropriate for the lenis category were acquired later than the pitch. In addition, unlike the universal tendencies, velar and labial consonants were acquired earlier than alveolar consonants. Factors affecting the order of acquisition, including frequency effect and perceptual salience, will be discussed.

  14. Semantic Approaches to Language Acquisition

    ERIC Educational Resources Information Center

    Stemmer, N.

    1973-01-01

    Critical evaluation of Schlesinger's theory of language acquisition as expounded in "Production of Utterances and Language Acquisition" in "The Ontogenesis of Grammar", p63-101, New York: Academic Press, 1971. (RS)

  15. Automatic carrier acquisition system

    NASA Technical Reports Server (NTRS)

    Bunce, R. C. (Inventor)

    1973-01-01

    An automatic carrier acquisition system for a phase locked loop (PLL) receiver is disclosed. It includes a local oscillator, which sweeps the receiver to tune across the carrier frequency uncertainty range until the carrier crosses the receiver IF reference. Such crossing is detected by an automatic acquisition detector. It receives the IF signal from the receiver as well as the IF reference. It includes a pair of multipliers which multiply the IF signal with the IF reference in phase and in quadrature. The outputs of the multipliers are filtered through bandpass filters and power detected. The output of the power detector has a signal dc component which is optimized with respect to the noise dc level by the selection of the time constants of the filters as a function of the sweep rate of the local oscillator.
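
    A minimal numerical sketch of the in-phase/quadrature detection described above, with simple averaging standing in for the bandpass filters and power detector (illustrative only, not the patented circuit; signal names are ours):

        # Hedged sketch: mix the IF signal with the IF reference in phase and in quadrature,
        # average (a stand-in for the bandpass filters), and detect power.
        import numpy as np

        def iq_power(if_signal: np.ndarray, f_ref: float, fs: float) -> float:
            t = np.arange(len(if_signal)) / fs
            i = if_signal * np.cos(2 * np.pi * f_ref * t)   # in-phase multiplier
            q = if_signal * np.sin(2 * np.pi * f_ref * t)   # quadrature multiplier
            return float(np.mean(i) ** 2 + np.mean(q) ** 2)  # power-detector output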

  16. The electronics and data acquisition system for the PandaX-I dark matter experiment

    NASA Astrophysics Data System (ADS)

    Ren, X.; Chen, X.; Ji, X.; Li, S.; Lei, S.; Liu, J.; Wang, M.; Xiao, M.; Xie, P.; Yan, B.

    2016-04-01

    We describe the electronics and data acquisition system used in the first phase of the PandaX experiment—a 120 kg dual-phase liquid xenon dark matter direct detection experiment in the China Jin-Ping Underground Laboratory. This system utilized 180 channels of commercial flash ADC waveform digitizers. During the entire experimental run, the system has achieved low trigger threshold (<1 keV electron-equivalent energy) and low deadtime data acquisition.

  17. A Statistical-Physics Approach to Language Acquisition and Language Change

    NASA Astrophysics Data System (ADS)

    Cassandro, Marzio; Collet, Pierre; Galves, Antonio; Galves, Charlotte

    1999-02-01

    The aim of this paper is to explain why Statistical Physics can help understanding two related linguistic questions. The first question is how to model first language acquisition by a child. The second question is how language change proceeds in time. Our approach is based on a Gibbsian model for the interface between syntax and prosody. We also present a simulated annealing model of language acquisition, which extends the Triggering Learning Algorithm recently introduced in the linguistic literature.

  18. Advanced Data Acquisition Systems

    NASA Technical Reports Server (NTRS)

    Perotti, J.

    2003-01-01

    Current and future requirements of the aerospace sensors and transducers field make it necessary to design and develop new data acquisition devices and instrumentation systems. New designs are sought to incorporate self-health, self-calibrating, self-repair capabilities, allowing greater measurement reliability and extended calibration cycles. With the addition of power management schemes, state-of-the-art data acquisition systems allow data to be processed and presented to the users with increased efficiency and accuracy. The design architecture presented in this paper displays an innovative approach to data acquisition systems. The design incorporates: electronic health self-check, device/system self-calibration, electronics and function self-repair, failure detection and prediction, and power management (reduced power consumption). These requirements are driven by the aerospace industry's need to reduce operations and maintenance costs, to accelerate processing time and to provide reliable hardware with minimum costs. The project's design architecture incorporates some commercially available components identified during the market research investigation, such as Field Programmable Gate Arrays (FPGA), Programmable Analog Integrated Circuits (PAC IC), and Field Programmable Analog Arrays (FPAA); Digital Signal Processing (DSP) electronic/system control; and investigation of specific characteristics found in technologies such as Electronic Component Mean Time Between Failure (MTBF) and Radiation Hardened Component Availability. There are three main sections discussed in the design architecture presented in this document. They are the following: (a) Analog Signal Module Section, (b) Digital Signal/Control Module Section and (c) Power Management Module Section. These sections are discussed in detail in the following pages. This approach to data acquisition systems has resulted in the assignment of patent rights to Kennedy Space Center under U.S. patent # 6

  19. Acquisition Information Management system telecommunication site survey results

    SciTech Connect

    Hake, K.A.; Key, B.G.

    1993-09-01

    The Army acquisition community currently uses a dedicated, point-to-point secure computer network for the Army Material Plan Modernization (AMPMOD). It must transition to the DOD supplied Defense Secure Network 1 (DSNET1). This is one of the first networks of this size to begin the transition. The type and amount of computing resources available at individual sites may or may not meet the new network requirements. This task surveys these existing telecommunications resources available in the Army acquisition community. It documents existing communication equipment, computer hardware, associated software, and recommends appropriate changes.

  20. A Study of Multimedia Application-Based Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Shao, Jing

    2012-01-01

    The development of computer-assisted language learning (CALL) has created the opportunity for exploring the effects of the multimedia application on foreign language vocabulary acquisition in recent years. This study provides an overview of computer-assisted language learning (CALL) and details one of its developing outcomes: multimedia. With the…

  1. MDSplus data acquisition system

    SciTech Connect

    Stillerman, J.A.; Fredian, T.W.; Klare, K.; Manduchi, G.

    1997-01-01

    MDSplus, a tree based, distributed data acquisition system, was developed in collaboration with the ZTH Group at Los Alamos National Lab and the RFX Group at CNR in Padua, Italy. It is currently in use at MIT, RFX in Padua, TCV at EPFL in Lausanne, and KBSI in South Korea. MDSplus is made up of a set of X/motif based tools for data acquisition and display, as well as diagnostic configuration and management. It is based on a hierarchical experiment description which completely describes the data acquisition and analysis tasks and contains the results from these operations. These tools were designed to operate in a distributed, client/server environment with multiple concurrent readers and writers to the data store. While usually used over a Local Area Network, these tools can be used over the Internet to provide access for remote diagnosticians and even machine operators. An interface to a relational database is provided for storage and management of processed data. IDL is used as the primary data analysis and visualization tool. IDL is a registered trademark of Research Systems Inc. {copyright} {ital 1996 American Institute of Physics.}

  2. On Shaft Data Acquisition System (OSDAS)

    NASA Technical Reports Server (NTRS)

    Pedings, Marc; DeHart, Shawn; Formby, Jason; Naumann, Charles

    2012-01-01

    On Shaft Data Acquisition System (OSDAS) is a rugged, compact, multiple-channel data acquisition computer system that is designed to record data from instrumentation while operating under extreme rotational centrifugal or gravitational acceleration forces. This system, which was developed for the Heritage Fuel Air Turbine Test (HFATT) program, addresses the problem of recording multiple channels of high-sample-rate data on most any rotating test article by mounting the entire acquisition computer onboard with the turbine test article. With the limited availability of slip ring wires for power and communication, OSDAS utilizes its own resources to provide independent power and amplification for each instrument. Since OSDAS utilizes standard PC technology as well as shared code interfaces with the next-generation, real-time health monitoring system (SPARTAA Scalable Parallel Architecture for Real Time Analysis and Acquisition), this system could be expanded beyond its current capabilities, such as providing advanced health monitoring capabilities for the test article. High-conductor-count slip rings are expensive to purchase and maintain, yet only provide a limited number of conductors for routing instrumentation off the article and to a stationary data acquisition system. In addition to being limited to a small number of instruments, slip rings are prone to wear quickly, and introduce noise and other undesirable characteristics to the signal data. This led to the development of a system capable of recording high-density instrumentation, at high sample rates, on the test article itself, all while under extreme rotational stress. OSDAS is a fully functional PC-based system with 48 channels of 24-bit, high-sample-rate input channels, phase synchronized, with an onboard storage capacity of over 1/2-terabyte of solid-state storage. This recording system takes a novel approach to the problem of recording multiple channels of instrumentation, integrated with the test

  3. The Data Acquisition of the MAGIC Telescope

    NASA Astrophysics Data System (ADS)

    Goebel, F.; Coarasa, J. A.; Stiehler, R.; Volkov, S.; MAGIC Collaboration

    2003-07-01

    The data acquisition system of the MAGIC telescope processes the Cherenkov signals registered in the high resolution camera consisting of 577 PMTs. The analog signals are transmitted via optical fibers to the electronics hut where they are stretched, split into high and low gain channels and digitized with 300 MHz 8 bit Flash ADCs. The digital data is read out by a multiprocessor PC which saves it to a RAID system and a tape library. The system has been designed to process data at a rate of 20 MBytes/sec which is required by the maximum envisaged trigger rate of 1 kHz. Tests of the complete readout chain show that the achieved dynamic range is more than 1000.

  4. The Acquisition of Colour Terms.

    ERIC Educational Resources Information Center

    Andrich, Gail Rex; Tager-Flusberg, Helen

    1986-01-01

    Reports two studies which investigated the acquisition of color terms by preschool children. The first was designed to clarify the role of certain conceptual factors in the acquisition of color terms. The second explored how input may interact with these conceptual factors and help to guide the acquisition of color words. (SED)

  5. First Language Acquisition and Teaching

    ERIC Educational Resources Information Center

    Cruz-Ferreira, Madalena

    2011-01-01

    "First language acquisition" commonly means the acquisition of a single language in childhood, regardless of the number of languages in a child's natural environment. Language acquisition is variously viewed as predetermined, wondrous, a source of concern, and as developing through formal processes. "First language teaching" concerns schooling in…

  6. What triggers coronal mass ejections ?

    NASA Astrophysics Data System (ADS)

    Aulanier, Guillaume

    Coronal mass ejections (CMEs) are large clouds of highly magnetized plasma. They are accelerated from the solar atmosphere into interplanetary space by the Lorentz force, which is associated with their strong current-carrying magnetic fields. Both theory and observations lead to the inevitable conclusion that the launch of a CME must result from the sudden release of free magnetic energy, which has slowly been accumulated in the corona for a long time before the eruption. Since the incomplete, but seminal, loss-of-equilibrium model was proposed by van Tend and Kuperus (1978), a large variety of analytical and numerical storage-and-release MHD models has been put forward in the past 20 years or so. All these models rely on the slow increase of currents and/or the slow decrease of the restraining magnetic tension preceding the eruption. But they all put the emphasis on different physical mechanisms to achieve this preeruptive evolution, and to suddenly trigger and later drive a CME. Nevertheless, all these models actually share many common features, which all describe many individual observed aspects of solar eruptions. It is therefore not always clear which of all the suggested mechanisms really account for the triggering of observed CMEs in general. Also, these mechanisms should arguably not be as numerous as the models themselves, owing to the common occurrence of CMEs. In order to shed some light on this challenging, but unripe, topic, I will attempt to rediscuss the applicability of the models to the Sun, and to rethink the most sensitive ones in a common frame, so as to find their common denominator. I will elaborate on the idea that many of the proposed triggering mechanisms may actually only be considered as different ways to apply a "last push", which puts the system beyond its eruptive threshold. I will argue that, in most cases, the eruptive threshold is determined by the vertical gradient of the magnetic field in the low-β corona, just like the usual

  7. Computations in Plasma Physics.

    ERIC Educational Resources Information Center

    Cohen, Bruce I.; Killeen, John

    1983-01-01

    Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propagation, and space sciences. Considers use in design/control of laboratory and spacecraft experiments and in data acquisition; and reviews major plasma computational methods and some of the important physics problems they…

  8. CDF level 2 trigger upgrade

    SciTech Connect

    Anikeev, K.; Bogdan, M.; DeMaat, R.; Fedorko, W.; Frisch, H.; Hahn, K.; Hakala, M.; Keener, P.; Kim, Y.; Kroll, J.; Kwang, S.; Lewis, J.; Lin, C.; Liu, T.; Marjamaa, F.; Mansikkala, T.; Neu, C.; Pitkanen, S.; Reisert, B.; Rusu, V.; Sanders, H.; /Fermilab /Chicago U. /Pennsylvania U.

    2006-01-01

    We describe the new CDF Level 2 Trigger, which was commissioned during Spring 2005. The upgrade was necessitated by several factors that included increased bandwidth requirements, in view of the growing instantaneous luminosity of the Tevatron, and the need for a more robust system, since the older system was reaching the limits of maintainability. The challenges in designing the new system were interfacing with many different upstream detector subsystems, processing larger volumes of data at higher speed, and minimizing the impact on running the CDF experiment during the system commissioning phase. To meet these challenges, the new system was designed around a general purpose motherboard, the PULSAR, which is instrumented with powerful FPGAs and modern SRAMs, and which uses mezzanine cards to interface with upstream detector components and an industry standard data link (S-LINK) within the system.

  9. Is osseointegration inflammation-triggered?

    PubMed

    Vitkov, Ljubomir; Hartl, Dominik; Hannig, Matthias

    2016-08-01

    Bioinert endosteal implants cause a foreign body reaction, whereas bioactive ones cause osseointegration. However, the mechanisms responsible for the two modi of host response remain unclear. COX-2(-/-) animal models showed the dependence of osseointegration on prostaglandins. PGE2, a product of COX-2, augments Wnt signalling, a pathway that promotes the regeneration in many types of tissues. Recently, we demonstrated the ability of bioactive implants to recruit neutrophils and to trigger neutrophil extracellular traps (NETs), which are a potent source of PGE2. In bioinert implants no PGE2 release has been ascertained. Collectively, these findings suggest that osseointegration might be the host response to bioactive implants, novel and quite different to the so-called foreign body reaction. PMID:27372846

  10. Acoustic properties of triggered lightning

    NASA Astrophysics Data System (ADS)

    Dayeh, M. A.; Evans, N.; Ramaekers, J.; Trevino, J.; Rassoul, H.; Lucia, R. J.; Dwyer, J. R.; Uman, M. A.; Jordan, D. M.

    2014-12-01

    Acoustic signatures from rocket-triggered lightning are measured by a 15m long, one-dimensional microphone array consisting of 16 receivers situated 90 meters from the lightning channel. Measurements were taken at the International Center for Lightning Research and Testing (ICLRT) in Camp Blanding, FL during the summer of 2014. The linear array was oriented in an end-fire position so that the peak acoustic reception pattern can be steered vertically along the channel with a frequency-dependent spatial resolution, enabling us to sample the acoustic signatures from different portions along the lightning channel. We report on the characteristics of acoustic signatures associated with several return strokes in 6 measured flashes (total of 29 return strokes). In addition, we study the relationship between the amplitude, peak frequency, and inferred energy input of each stroke acoustic signature and the associated measured lightning parameters. Furthermore, challenges of obtaining acoustic measurements in thunderstorm harsh conditions and their countermeasures will also be discussed.

  11. A Mechanochemically Triggered "Click" Catalyst.

    PubMed

    Michael, Philipp; Binder, Wolfgang H

    2015-11-16

    "Click" chemistry represents one of the most powerful approaches for linking molecules in chemistry and materials science. Triggering this reaction by mechanical force would enable site- and stress-specific "click" reactions--a hitherto unreported observation. We introduce the design and realization of a homogeneous Cu catalyst able to activate through mechanical force when attached to suitable polymer chains, acting as a lever to transmit the force to the central catalytic system. Activation of the subsequent copper-catalyzed "click" reaction (CuAAC) is achieved either by ultrasonication or mechanical pressing of a polymeric material, using a fluorogenic dye to detect the activation of the catalyst. Based on an N-heterocyclic copper(I) carbene with attached polymeric chains of different flexibility, the force is transmitted to the central catalyst, thereby activating a CuAAC in solution and in the solid state. PMID:26420664

  12. Tail reconnection triggering substorm onset.

    PubMed

    Angelopoulos, Vassilis; McFadden, James P; Larson, Davin; Carlson, Charles W; Mende, Stephen B; Frey, Harald; Phan, Tai; Sibeck, David G; Glassmeier, Karl-Heinz; Auster, Uli; Donovan, Eric; Mann, Ian R; Rae, I Jonathan; Russell, Christopher T; Runov, Andrei; Zhou, Xu-Zhi; Kepko, Larry

    2008-08-15

    Magnetospheric substorms explosively release solar wind energy previously stored in Earth's magnetotail, encompassing the entire magnetosphere and producing spectacular auroral displays. It has been unclear whether a substorm is triggered by a disruption of the electrical current flowing across the near-Earth magnetotail, at approximately 10 R(E) (R(E): Earth radius, or 6374 kilometers), or by the process of magnetic reconnection typically seen farther out in the magnetotail, at approximately 20 to 30 R(E). We report on simultaneous measurements in the magnetotail at multiple distances, at the time of substorm onset. Reconnection was observed at 20 R(E), at least 1.5 minutes before auroral intensification, at least 2 minutes before substorm expansion, and about 3 minutes before near-Earth current disruption. These results demonstrate that substorms are likely initiated by tail reconnection. PMID:18653845

  13. Bars Triggered By Galaxy Flybys

    NASA Astrophysics Data System (ADS)

    Holley-Bockelmann, Kelly; Lang, Meagan; Sinha, Manodeep

    2015-05-01

    Galaxy mergers drive galaxy evolution and are a key mechanism by which galaxies grow and transform. Unlike galaxy mergers where two galaxies combine into one remnant, galaxy flybys occur when two independent galaxy halos interpenetrate but detach at a later time; these one-time events are surprisingly common and can even out-number galaxy mergers at low redshift for massive halos. Although these interactions are transient and occur far outside the galaxy disk, flybys can still drive rapid and large perturbations within both the intruder and victim halos. We explored how flyby encounters can transform each galaxy using a suite of N-body simulations. We present results from three co-planar flybys between disk galaxies, demonstrating that flybys can both trigger strong bar formation and spin up dark matter halos.

  14. Earthquake Simulator Finds Tremor Triggers

    SciTech Connect

    Johnson, Paul

    2015-03-27

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves-the sounds radiated from earthquakes-can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials-like the type found along certain fault lines across the globe-and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  15. Applications of advanced data analysis and expert system technologies in the ATLAS Trigger-DAQ Controls framework

    NASA Astrophysics Data System (ADS)

    Avolio, G.; Corso Radu, A.; Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-12-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment is a very complex distributed computing system, composed of more than 20000 applications running on more than 2000 computers. The TDAQ Controls system has to guarantee the smooth and synchronous operations of all the TDAQ components and has to provide the means to minimize the downtime of the system caused by runtime failures. During data taking runs, streams of information messages sent or published by running applications are the main sources of knowledge about the correctness of running operations. The huge flow of operational monitoring data produced is constantly monitored by experts in order to detect problems or misbehaviours. Given the scale of the system and the rates of data to be analyzed, the automation of the system functionality in the areas of operational monitoring, system verification, error detection and recovery is a strong requirement. To accomplish its objective, the Controls system includes some high-level components which are based on advanced software technologies, namely the rule-based Expert System and the Complex Event Processing engines. The chosen techniques make it possible to formalize, store and reuse the knowledge of experts and thus to assist the shifters in the ATLAS control room during the data-taking activities.
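
    As a toy illustration of the kind of automated rule such a system might apply to a stream of operational messages (the rule, thresholds and message fields below are assumptions, not the actual ATLAS Controls implementation):

        # Toy rule over a message stream: flag a burst of ERROR messages.
        # Field names and thresholds are hypothetical.
        from collections import deque

        def error_burst_rule(messages, window=10, threshold=5):
            """Flag when more than `threshold` ERROR messages appear within the last `window` messages."""
            recent = deque(maxlen=window)
            alerts = []
            for msg in messages:
                recent.append(msg)
                errors = [m for m in recent if m.get("severity") == "ERROR"]
                if len(errors) > threshold:
                    alerts.append({"rule": "error_burst", "evidence": list(errors)})
                    recent.clear()
            return alerts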

  16. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    ERIC Educational Resources Information Center

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  17. Landslide triggering by rain infiltration

    USGS Publications Warehouse

    Iverson, Richard M.

    2000-01-01

    Landsliding in response to rainfall involves physical processes that operate on disparate timescales. Relationships between these timescales guide development of a mathematical model that uses reduced forms of Richards equation to evaluate effects of rainfall infiltration on landslide occurrence, timing, depth, and acceleration in diverse situations. The longest pertinent timescale is A/D0, where D0 is the maximum hydraulic diffusivity of the soil and A is the catchment area that potentially affects groundwater pressures at a prospective landslide slip surface location with areal coordinates x, y and depth H. Times greater than A/D0 are necessary for establishment of steady background water pressures that develop at (x, y, H) in response to rainfall averaged over periods that commonly range from days to many decades. These steady groundwater pressures influence the propensity for landsliding at (x, y, H), but they do not trigger slope failure. Failure results from rainfall over a typically shorter timescale H²/D0 associated with transient pore pressure transmission during and following storms. Commonly, this timescale ranges from minutes to months. The shortest timescale affecting landslide responses to rainfall is √(H/g), where g is the magnitude of gravitational acceleration. Postfailure landslide motion occurs on this timescale, which indicates that the thinnest landslides accelerate most quickly if all other factors are constant. Effects of hydrologic processes on landslide processes across these diverse timescales are encapsulated by a response function, R(t*) = √(t*/π) exp(−1/t*) − erfc(1/√t*), which depends only on normalized time, t*. Use of R(t*) in conjunction with topographic data, rainfall intensity and duration information, an infinite-slope failure criterion, and Newton's second law predicts the timing, depth, and acceleration of rainfall-triggered landslides. Data from contrasting landslides that exhibit rapid, shallow motion and slow, deep
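
    The response function quoted above can be evaluated directly; a short sketch using SciPy (variable names are ours, with t* the time normalized by H²/D0 following the abstract's notation):

        # Evaluate R(t*) = sqrt(t*/pi) * exp(-1/t*) - erfc(1/sqrt(t*)) as given in the abstract.
        import numpy as np
        from scipy.special import erfc

        def response(t_star):
            t_star = np.asarray(t_star, dtype=float)   # t* > 0, dimensionless
            return np.sqrt(t_star / np.pi) * np.exp(-1.0 / t_star) - erfc(1.0 / np.sqrt(t_star))

        # Example: R(t*) grows with normalized time.
        print(response([0.1, 1.0, 10.0]))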

  18. Diet and Dermatitis: Food Triggers

    PubMed Central

    Schlichte, Megan

    2014-01-01

    Given increasing awareness of the link between diet and health, many patients are concerned that dietary factors may trigger dermatitis. Research has found that dietary factors can indeed exacerbate atopic dermatitis or cause dermatitis due to systemic contact dermatitis. In atopic dermatitis, dietary factors are more likely to cause an exacerbation among infants or children with moderate-to-severe atopic dermatitis relative to other populations. Foods may trigger rapid, immunoglobulin E-mediated hypersensitivity reactions or may lead to late eczematous reactions. While immediate reactions occur within minutes to hours of food exposure, late eczematous reactions may occur anywhere from hours to two days later. Screening methods, such as food allergen-specific serum immunoglobulin E tests or skin prick tests, can identify sensitization to specific foods, but a diagnosis of food allergy requires specific signs and symptoms that occur reproducibly upon food exposure. Many patients who are sensitized will not develop clinical findings upon food exposure; therefore, these tests may result in false-positive tests for food allergy. This is why the gold standard for diagnosis remains the double-blind, placebo-controlled food challenge. In another condition, systemic contact dermatitis, ingestion of a specific food can actually cause dermatitis. Systemic contact dermatitis is a distinct T-cell mediated immunological reaction in which dietary exposure to specific allergens results in dermatitis. Balsam of Peru and nickel are well-known causes of systemic contact dermatitis, and reports have implicated multiple other allergens. This review seeks to increase awareness of important food allergens, elucidate their relationship with atopic dermatitis and systemic contact dermatitis, and review available diagnostic and treatment strategies. PMID:24688624

  19. A novel calorimeter trigger concept: The jet trigger of the H1 experiment at HERA

    NASA Astrophysics Data System (ADS)

    Olivier, Bob; Dubak-Behrendt, Ana; Kiesling, Christian; Reisert, Burkard; Aktas, Adil; Antunovic, Biljana; Bracinik, Juraj; Braquet, Charles; Brettel, Horst; Dulny, Barbara; Fent, Jürgen; Fras, Markus; Fröchtenicht, Walter; Haberer, Werner; Hoffmann, Dirk; Modjesch, Miriam; Placakyte, Ringaile; Schörner-Sadenius, Thomas; Wassatsch, Andreas; Zimmermann, Jens

    2011-06-01

    We report on a novel trigger for the liquid argon calorimeter which was installed in the H1 Experiment at HERA. This trigger, called the "Jet Trigger", was running at level 1 and implemented a real-time cluster algorithm. Within only 800 ns, the Jet Trigger algorithm found local energy maxima in the calorimeter, summed their immediate neighbors, sorted the resulting jets by energy, and applied topological conditions for the final level 1 trigger decision. The Jet Trigger was in operation from the year 2006 until the end of the HERA running in the summer of 2007. With the Jet Trigger it was possible to substantially reduce the thresholds for triggering on electrons and jets, giving access to a largely extended phase space for physical observables which could not have been reached in H1 before. The concepts of the Jet Trigger may be an interesting upgrade option for the LHC experiments.
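
    A simple software analogue of the clustering idea described above (the real algorithm ran in pipelined hardware within 800 ns; the grid geometry here is a placeholder and detector edges are ignored for simplicity):

        # Illustrative sketch: find local energy maxima in a 2D map, sum each maximum
        # with its immediate neighbours, and sort the resulting jet candidates by energy.
        import numpy as np

        def jet_candidates(energy: np.ndarray):
            jets = []
            ny, nx = energy.shape
            for y in range(1, ny - 1):
                for x in range(1, nx - 1):
                    window = energy[y - 1:y + 2, x - 1:x + 2]
                    if energy[y, x] > 0 and energy[y, x] == window.max():
                        jets.append((float(window.sum()), (y, x)))  # local max + neighbours
            jets.sort(reverse=True)  # highest-energy jets first
            return jets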

  20. Disaster triggers disaster: Earthquake triggering by tropical cyclones

    NASA Astrophysics Data System (ADS)

    Wdowinski, S.; Tsukanov, I.

    2011-12-01

    Three recent devastating earthquakes, the 1999 M=7.6 Chi-Chi (Taiwan), 2010 M=7.0 Leogane (Haiti), 2010 M=6.4 Kaohsiung (Taiwan), and additional three moderate size earthquakes (66 earthquake that occurred in the central mountainous area of Taiwan within three years after the typhoon. The 2009 Morakot typhoon was followed by 2009 M=6.2 Nantou and 2010 M=6.4 Kaohsiung earthquakes; the 1969 Flossie typhoon was followed by an M=6.3 earthquake in 1972; and the 1996 Herb typhoon by the 1998 M=6.2 Rueyli and 1999 M=7.6 Chi-Chi earthquakes. The earthquake catalog of Taiwan lists only two other M>6 main-shocks that occurred in Taiwan's central mountainous belt, one of them was in 1964 only four months after the wet Typhoon Gloria poured heavy rain in the same area. We suggest that the close proximity in time and space between wet tropical cyclones and earthquakes reflects a physical link between the two hazard types in which these earthquakes were triggered by rapid erosion induced by tropical cyclone's heavy rain. Based on remote sensing observations, meshfree finite element modeling, and Coulomb failure stress analysis, we show that the

  1. Smart trigger logic for focal plane arrays

    SciTech Connect

    Levy, James E; Campbell, David V; Holmes, Michael L; Lovejoy, Robert; Wojciechowski, Kenneth; Kay, Randolph R; Cavanaugh, William S; Gurrieri, Thomas M

    2014-03-25

    An electronic device includes a memory configured to receive data representing light intensity values from pixels in a focal plane array and a processor that analyzes the received data to determine which light values correspond to triggered pixels, where the triggered pixels are those pixels that meet a predefined set of criteria, and determines, for each triggered pixel, a set of neighbor pixels for which light intensity values are to be stored. The electronic device also includes a buffer that temporarily stores light intensity values for at least one previously processed row of pixels, so that when a triggered pixel is identified in a current row, light intensity values for the neighbor pixels in the previously processed row and for the triggered pixel are persistently stored, as well as a data transmitter that transmits the persistently stored light intensity values for the triggered and neighbor pixels to a data receiver.
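
    In software terms, the triggered-pixel readout described above might be modelled as follows (an illustrative sketch, not the patented circuit; the exact set of neighbour pixels stored is a design choice):

        # Sketch: scan a frame row by row and, whenever a pixel exceeds the trigger
        # threshold, persist that pixel plus its neighbours in adjacent rows/columns.
        def triggered_readout(frame, threshold):
            stored = {}  # (row, col) -> intensity
            n_rows, n_cols = len(frame), len(frame[0])
            for r in range(n_rows):
                for c in range(n_cols):
                    if frame[r][c] >= threshold:            # triggered pixel
                        for dr in (-1, 0, 1):               # neighbours (buffered rows)
                            for dc in (-1, 0, 1):
                                rr, cc = r + dr, c + dc
                                if 0 <= rr < n_rows and 0 <= cc < n_cols:
                                    stored[(rr, cc)] = frame[rr][cc]
            return stored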

  2. The BTeV trigger: Recent developments

    SciTech Connect

    Kasper, Penelope; /Fermilab

    2003-12-01

    BTeV is a collider experiment at the Fermilab Tevatron dedicated to precision measurements of CP violation, mixing and rare decays of beauty and charm hadrons. The detector is a forward spectrometer with a pixel vertex detector inside a dipole magnet. A unique feature of BTeV is the trigger, which reconstructs tracks and vertices in every beam crossing. They present here an overview of the BTeV trigger and a description of recent improvements in trigger timing.

  3. Turbojet blade vibration data acquisition design and feasibility testing

    NASA Technical Reports Server (NTRS)

    Frarey, J. L.; Petersen, N. J.; Hess, D. A.

    1978-01-01

    A turbojet blade vibration data acquisition system was designed to allow the measurement of blade vibration. The data acquisition system utilizing 96 microprocessors to gather data from optical probes, store, sort and transmit to the central computer is described. Areas of high technical risk were identified and a two-microprocessor system was breadboarded and tested to investigate these areas. Results show that the system was feasible and that low technical risk would be involved in proceeding with the complete system fabrication.

  4. Synthetic aperture radar operator tactical target acquisition research

    NASA Technical Reports Server (NTRS)

    Hershberger, M. L.; Craig, D. W.

    1978-01-01

    A radar target acquisition research study was conducted to assess the effects of two levels of 13 radar sensor, display, and mission parameters on operator tactical target acquisition. A saturated fractional-factorial screening design was employed to examine these parameters. Data analysis computed ETA squared values for main and second-order effects for the variables tested. Ranking of the research parameters in terms of importance to system design revealed four variables (radar coverage, radar resolution/multiple looks, display resolution, and display size) accounted for 50 percent of the target acquisition probability variance.

  5. Performance and upgrade of the CMS electron and photon trigger for Run 2

    NASA Astrophysics Data System (ADS)

    Sauvan, Jean-Baptiste; CMS Collaboration

    2015-02-01

    The CMS experiment implements a sophisticated two-level online trigger selection system that achieves a rejection factor of nearly 10⁵. The first level (L1) trigger is based on coarse information coming from the calorimeters and the muon detectors while the high-level trigger combines fine-grain information from all sub-detectors. In the near future the LHC will increase its centre-of-mass energy to 13 TeV and progressively reach an instantaneous luminosity of 2 × 10³⁴ cm⁻² s⁻¹. In order to guarantee a successful and ambitious physics program under this challenging environment, the CMS trigger and data acquisition system must be consolidated. In particular the calorimeter L1 trigger hardware and architecture will be changed. The aim is to maintain the current thresholds and improve performance. This programme will be achieved using the μTCA architecture with fast optical links and latest generation FPGAs. Sophisticated object reconstruction algorithms, as well as online pile-up corrections, are being developed that will make use of these new hardware capabilities. For electron and photon reconstruction and identification, the first version of the new algorithms has been tested against the current algorithms. It shows a reduction of the trigger rate by a factor of two for isolated objects, an improved energy resolution of about 30%, and a position resolution reduced by more than a factor of four.

  6. Late Mitochondrial Acquisition, Really?

    PubMed Central

    Degli Esposti, Mauro

    2016-01-01

    This article provides a timely critique of a recent Nature paper by Pittis and Gabaldón that has suggested a late origin of mitochondria in eukaryote evolution. It shows that the inferred ancestry of many mitochondrial proteins has been incorrectly assigned by Pittis and Gabaldón to bacteria other than the aerobic proteobacteria from which the ancestor of mitochondria originates, thereby questioning the validity of their suggestion that mitochondrial acquisition may be a late event in eukaryote evolution. The analysis and approach presented here may guide future studies to resolve the true ancestry of mitochondria. PMID:27289097

  7. Ischemic Compression After Trigger Point Injection Affect the Treatment of Myofascial Trigger Points

    PubMed Central

    Kim, Soo A; Oh, Ki Young; Choi, Won Hyuck

    2013-01-01

    Objective To investigate the effects of trigger point injection with or without ischemic compression in treatment of myofascial trigger points in the upper trapezius muscle. Methods Sixty patients with active myofascial trigger points in the upper trapezius muscle were randomly divided into three groups: group 1 (n=20) received only trigger point injections, group 2 (n=20) received trigger point injections with 30 seconds of ischemic compression, and group 3 (n=20) received trigger point injections with 60 seconds of ischemic compression. The visual analogue scale, pressure pain threshold, and range of motion of the neck were assessed before treatment, immediately after treatment, and 1 week after treatment. Korean Neck Disability Indexes were assessed before treatment and 1 week after treatment. Results We found a significant improvement in all assessment parameters (p<0.05) in all groups. However, the groups receiving trigger point injections with ischemic compression showed greater improvement than the group receiving only trigger point injections, and there were no significant differences between the 30-second and 60-second ischemic compression groups. Conclusion This study demonstrated the effectiveness of ischemic compression for myofascial trigger points. Trigger point injection combined with ischemic compression shows better effects on the treatment of myofascial trigger points in the upper trapezius muscle than trigger point injection therapy alone. However, the duration of ischemic compression did not affect the treatment of myofascial trigger points. PMID:24020035

  8. TARA control, data acquisition and analysis system

    SciTech Connect

    Gaudreau, M.P.J.; Sullivan, J.D.; Fredian, T.W.; Karcher, C.A.; Sevillano, E.; Stillerman, J.; Thomas, P.

    1983-12-01

    The MIT tandem mirror (TARA) control, data acquisition and analysis system consists of two major parts: (1) a Gould 584 industrial programmable controller (PC) to control engineering functions; and (2) a VAX 11/750 based data acquisition and analysis system for physics analysis. The PC is designed for use in harsh industrial environments and has proven to be a reliable and cost-effective means for automated control. The PC configuration is dedicated to control tasks on the TARA magnet, vacuum, RF, neutral beam, diagnostics, and utility systems. The data transfer functions are used to download system operating parameters from menu selectable tables. Real time status reports are provided to video terminals and as blocks of data to the host computer for storage. The data acquisition and analysis system for TARA was designed to provide high throughput and ready access to data from earlier runs. The adopted design uses pre-existing software packages in a system which is simple, coherent, fast, and which requires a minimum of new software development. The computer configuration is a VAX 11/750 running VMS with 124 M byte massbus disk and 1.4 G byte unibus disk subsystem.

  9. Open, reconfigurable cytometric acquisition system: ORCAS.

    PubMed

    Naivar, Mark A; Parson, Jimmie D; Wilder, Mark E; Habbersett, Robert C; Edwards, Bruce S; Sklar, Larry; Nolan, John P; Graves, Steven W; Martin, John C; Jett, James H; Freyer, James P

    2007-11-01

    A digital signal processing (DSP)-based digital data acquisition system has been developed to support novel flow cytometry efforts. The system flexibility includes how it detects, captures, and processes event data. Custom data capture boards utilizing analog to digital converters (ADCs) and field programmable gate arrays (FPGA) detect events and capture correlated event data. A commercial DSP board processes the captured data and sends the results over the IEEE 1394 bus to the host computer that provides a user interface for acquisition, display, analysis, and storage. The system collects list mode data, correlated pulse shapes, or streaming data from a variety of detector types using Linux, Mac OS X, and Windows host computers. It extracts pulse features not found on commercial systems with excellent sensitivity and linearity over a wide dynamic range. List mode data are saved in FCS 3.0 formatted files while streaming or correlated waveform data are saved in custom format files for postprocessing. Open, reconfigurable cytometric acquisition system is compact, scaleable, flexible, and modular. Programmable feature extraction algorithms have exciting possibilities for both new and existing applications. The recent availability of a commercial data capture board will enable general availability of similar systems. PMID:17680705
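
    As an example of the kind of pulse-feature extraction such a digital acquisition system can perform on a captured waveform (an illustrative sketch, not the ORCAS firmware; feature definitions are ours):

        # Extract simple pulse features (height, area, width at half maximum) from
        # a baseline-subtracted digitized waveform.
        import numpy as np

        def pulse_features(samples, baseline: float = 0.0):
            s = np.asarray(samples, dtype=float) - baseline
            height = float(s.max())                        # pulse height
            area = float(s.sum())                          # pulse area (integral)
            above_half = np.flatnonzero(s >= height / 2)
            width = int(above_half[-1] - above_half[0] + 1) if above_half.size else 0
            return {"height": height, "area": area, "fwhm_samples": width}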

  10. Application of Trigger Counter Board for Synchronized Data

    NASA Astrophysics Data System (ADS)

    Takahashi, Hiroki; Kawase, Masato; Ouchi, Noboru

    To synchronize the measurement data taken in the J-PARC Linac and RCS and thereby achieve advanced monitoring, data acquisition and operation, a new device, the Trigger Counter Board (TCB), was developed and incorporated in the control system based on EPICS. The TCB has the capability to synchronize the data taken by the Wave Endless Recorder systems and also the data (typically beam position monitor data) shared through the Reflective Memory systems, using the information provided by the Timing System. The synchronized data over the different accelerator subsystems have contributed to the success of the quick and efficient commissioning of the J-PARC RCS, and the TCB is an indispensable tool for the future upgrades of the machines.

  11. Nonlinear dynamical triggering of slow slip

    SciTech Connect

    Johnson, Paul A; Knuth, Matthew W; Kaproth, Bryan M; Carpenter, Brett; Guyer, Robert A; Le Bas, Pierre - Yves; Daub, Eric G; Marone, Chris

    2010-12-10

    Among the most fascinating, recent discoveries in seismology have been the phenomena of triggered slip, including triggered earthquakes and triggered-tremor, as well as triggered slow, silent-slip during which no seismic energy is radiated. Because fault nucleation depths cannot be probed directly, the physical regimes in which these phenomena occur are poorly understood. Thus determining physical properties that control diverse types of triggered fault sliding and what frictional constitutive laws govern triggered faulting variability is challenging. We are characterizing the physical controls of triggered faulting with the goal of developing constitutive relations by conducting laboratory and numerical modeling experiments in sheared granular media at varying load conditions. In order to simulate granular fault zone gouge in the laboratory, glass beads are sheared in a double-direct configuration under constant normal stress, while subject to transient perturbation by acoustic waves. We find that triggered, slow, silent-slip occurs at very small confining loads ({approx}1-3 MPa) that are smaller than those where dynamic earthquake triggering takes place (4-7 MPa), and that triggered slow-slip is associated with bursts of LFE-like acoustic emission. Experimental evidence suggests that the nonlinear dynamical response of the gouge material induced by dynamic waves may be responsible for the triggered slip behavior: the slip-duration, stress-drop and along-strike slip displacement are proportional to the triggering wave amplitude. Further, we observe a shear-modulus decrease corresponding to dynamic-wave triggering relative to the shear modulus of stick-slips. Modulus decrease in response to dynamical wave amplitudes of roughly a microstrain and above is a hallmark of elastic nonlinear behavior. We believe that the dynamical waves increase the material non-affine elastic deformation during shearing, simultaneously leading to instability and slow-slip. The inferred

  12. 48 CFR 873.105 - Acquisition planning.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Acquisition planning. 873.105 Section 873.105 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS DEPARTMENT SUPPLEMENTARY REGULATIONS SIMPLIFIED ACQUISITION PROCEDURES FOR HEALTH-CARE RESOURCES 873.105 Acquisition planning. (a) Acquisition planning is...

  13. 39 CFR 777.41 - Acquisition procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 39 Postal Service 1 2011-07-01 2011-07-01 false Acquisition procedures. 777.41 Section 777.41... ACQUISITION POLICIES Voluntary Acquisitions § 777.41 Acquisition procedures. (a) Voluntary Acquisitions... under § 777.32 of this part, if the acquisition were by eminent domain or the under threat thereof,...

  14. 39 CFR 777.41 - Acquisition procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 39 Postal Service 1 2013-07-01 2013-07-01 false Acquisition procedures. 777.41 Section 777.41... ACQUISITION POLICIES Voluntary Acquisitions § 777.41 Acquisition procedures. (a) Voluntary Acquisitions... under § 777.32 of this part, if the acquisition were by eminent domain or the under threat thereof,...

  15. 39 CFR 777.41 - Acquisition procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Acquisition procedures. 777.41 Section 777.41... ACQUISITION POLICIES Voluntary Acquisitions § 777.41 Acquisition procedures. (a) Voluntary Acquisitions... under § 777.32 of this part, if the acquisition were by eminent domain or the under threat thereof,...

  16. 39 CFR 777.41 - Acquisition procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 39 Postal Service 1 2012-07-01 2012-07-01 false Acquisition procedures. 777.41 Section 777.41... ACQUISITION POLICIES Voluntary Acquisitions § 777.41 Acquisition procedures. (a) Voluntary Acquisitions... under § 777.32 of this part, if the acquisition were by eminent domain or the under threat thereof,...

  17. 39 CFR 777.41 - Acquisition procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 39 Postal Service 1 2014-07-01 2014-07-01 false Acquisition procedures. 777.41 Section 777.41... ACQUISITION POLICIES Voluntary Acquisitions § 777.41 Acquisition procedures. (a) Voluntary Acquisitions... under § 777.32 of this part, if the acquisition were by eminent domain or the under threat thereof,...

  18. Global Trigger Upgrade firmware architecture for the level-1 Trigger of the CMS experiment

    NASA Astrophysics Data System (ADS)

    Rahbaran, B.; Arnold, B.; Bergauer, H.; Wittmann, J.; Matsushita, T.

    2015-02-01

    The Global Trigger (GT) is the final step of the CMS Level-1 Trigger and implements the "menu" of triggers, which is a set of selection requirements applied to the final list of objects (such as muons, electrons or jets) to trigger the readout of the detector and serve as the basis for further calculations by the High Level Trigger. Operational experience in developing trigger menus from the first LHC run has shown that the requirements increased as the luminosity and pile-up increased. The new GT (μGT) is designed based on Xilinx Virtex-7 FPGAs, which combine unsurpassed flexibility with regard to scalability and high robustness. Furthermore, a custom board which receives signals from legacy electronics and basic binary inputs from less complex trigger sources is presented. Additionally, this paper describes the architecture of a distributed testing framework and the Trigger Menu Editor.

  19. Prompt trigger primitives for a self-seeded track trigger

    NASA Astrophysics Data System (ADS)

    Dressanandt, N.; Halgeri, A.; Kamat, M.; Koppal, V.; Newcomer, M.

    2012-10-01

    A viable self-seeded track trigger for a high-rate collider detector environment must have excellent angular precision, response times commensurate with the beam crossing rate, and low mass. We have designed a fast clustering block servicing 128 contiguous strips, to be included in an LHC upgrade silicon strip front end ASIC (ABC130), with these objectives in mind. The block assumes an analog front end with binary (threshold-determined) strip readout latched at each beam crossing. Combinatorial logic tests for the presence of one or two adjacent strips over threshold, a qualifying cluster, at each beam crossing and transmits up to two eight-bit cluster descriptors, specifying address and cluster width, via a high-speed LVDS output. It is envisioned that a correlator chip, presently in conception, receives this data and checks for coincident hits between silicon strip layers via look-up tables. Since the clustering output reports the presence of one or two hit strips, a half-strip-pitch resolution (~40 um for the ATLAS detector) may be possible for each cluster. Our timing results show that the combinatorial clustering logic will settle within 6 ns. Assuming a beam crossing rate of 40 MHz, 16 bits of serialized data can be shifted out at 640 MHz each crossing. This allows a beam-synchronous update rate, providing data for up to two clusters for each bank of 128 strips. The data latency into the correlator chip will be only two crossings. Present power estimates suggest that the fast cluster block with LVDS driver will consume less than 12 mW.
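
    The clustering idea described above can be modeled in a few lines of software: scan the 128 latched binary strip hits, accept clusters of one or two adjacent strips over threshold, and emit up to two (address, width) descriptors per crossing. The sketch below is an illustrative Python model under those assumptions; it is not the ABC130 combinatorial logic, and it does not attempt to reproduce how the ASIC treats wider clusters.

```python
# Software sketch of the cluster-finding idea described above: given the
# 128 binary strip hits latched at one beam crossing, report up to two
# qualifying clusters (one or two adjacent strips over threshold) as
# (address, width) descriptors. Illustrative model only.

def find_clusters(strip_hits, max_clusters=2):
    """strip_hits: sequence of 128 booleans, one per strip."""
    clusters = []
    i = 0
    n = len(strip_hits)
    while i < n and len(clusters) < max_clusters:
        if strip_hits[i]:
            # A qualifying cluster is one or two contiguous hit strips.
            width = 2 if (i + 1 < n and strip_hits[i + 1]) else 1
            clusters.append((i, width))   # (address of first strip, width)
            i += width
        else:
            i += 1
    return clusters

if __name__ == "__main__":
    hits = [False] * 128
    hits[17] = True             # single-strip cluster at address 17
    hits[64] = hits[65] = True  # two-strip cluster at address 64
    print(find_clusters(hits))  # -> [(17, 1), (64, 2)]
```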

  20. Intrasaccadic perception triggers pupillary constriction.

    PubMed

    Mathôt, Sebastiaan; Melmi, Jean-Baptiste; Castet, Eric

    2015-01-01

    It is commonly believed that vision is impaired during saccadic eye movements. However, here we report that some visual stimuli are clearly visible during saccades, and trigger a constriction of the eye's pupil. Participants viewed sinusoid gratings that changed polarity 150 times per second (every 6.67 ms). At this rate of flicker, the gratings were perceived as homogeneous surfaces while participants fixated. However, the flickering gratings contained ambiguous motion: rightward and leftward motion for vertical gratings; upward and downward motion for horizontal gratings. When participants made a saccade perpendicular to the gratings' orientation (e.g., a leftward saccade for a vertical grating), the eye's peak velocity matched the gratings' motion. As a result, the retinal image was approximately stable for a brief moment during the saccade, and this gave rise to an intrasaccadic percept: A normally invisible stimulus became visible when eye velocity was maximal. Our results confirm and extend previous studies by demonstrating intrasaccadic perception using a reflexive measure (pupillometry) that does not rely on subjective report. Our results further show that intrasaccadic perception affects all stages of visual processing, from the pupillary response to visual awareness. PMID:26339536

  1. Fluid pressure waves trigger earthquakes

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2015-03-01

    Fluids, essentially meteoric water, are present everywhere in the Earth's crust, occasionally at pressures higher than hydrostatic owing to the tectonic strain imposed on impermeable undrained layers, to the impoundment of artificial lakes, or to the forced injections required by oil and gas exploration and production. Experimental evidence suggests that such fluids flow along preferred paths of high diffusivity, provided by rock joints and faults. Studying the coupled poroelastic problem, we find that such flow is ruled by a nonlinear partial differential equation amenable to a Barenblatt-type solution, implying that it takes place in the form of solitary pressure waves propagating at a velocity which decreases with time as v ∝ t^(1/(n-1) - 1) with n ≳ 7. According to the Tresca-von Mises criterion, these waves appear to play a major role in earthquake triggering, and are also capable of accounting for aftershock delay without any further assumption. Measuring stress and fluid pressure inside active faults may therefore provide direct information about fault potential instability.
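
    For clarity, the scaling law quoted in the abstract can be written out explicitly; with n ≳ 7 the exponent is negative, so the solitary pressure wave slows down as it propagates. The algebra below simply restates the abstract's relation and evaluates the exponent for n = 7.

```latex
v \;\propto\; t^{\frac{1}{n-1}-1} \;=\; t^{\frac{2-n}{n-1}},
\qquad n \gtrsim 7
\quad\Longrightarrow\quad
v \;\propto\; t^{-5/6} \ \ \text{for } n = 7 .
```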

  2. Basic concepts and architectural details of the Delphi trigger system

    SciTech Connect

    Bocci, V.; Booth, P.S.L.; Bozzo, M. |

    1995-08-01

    Delphi (DEtector with Lepton, Photon and Hadron Identification) is one of the four experiments at the LEP (Large Electron Positron) collider at CERN. The detector is laid out to provide nearly 4π coverage for charged-particle tracking, electromagnetic and hadronic calorimetry, and extended particle identification. The trigger system consists of four levels. The first two are synchronous with the BCO (Beam Cross Over) and rely on hardwired control units, while the last two run asynchronously with respect to the BCO and are driven by the Delphi host computers. The aim of this paper is to give a comprehensive global view of the trigger system architecture, presenting in detail the first two levels, their various hardware components, and the latest modifications introduced to improve their performance and to make the whole software user interface more user friendly.

  3. Optoelectronic data acquisition system based on FPGA

    NASA Astrophysics Data System (ADS)

    Li, Xin; Liu, Chunyang; Song, De; Tong, Zhiguo; Liu, Xiangqing

    2015-11-01

    An optoelectronic data acquisition system is designed based on an FPGA. An FPGA chip, the EP1C3T144C8 from Altera's Cyclone family, is used as the centre of logic control, an XTP2046 chip is used as the A/D converter, a host computer that communicates with the data acquisition system through an RS-232 serial interface is used as the display device, and a photoresistor is used as the photosensor. The FPGA logic control code is written in Verilog HDL. Simulation in ModelSim shows that the timing sequence is correct. Test results indicate that the system meets the design requirements, with fast response and stable operation in tests of the actual hardware circuit.
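
    On the host side, reading the digitized samples over the RS-232 link could look like the minimal pyserial sketch below. The port name, baud rate, and two-byte sample framing are assumptions made for illustration; the record does not specify the actual serial protocol used in the design.

```python
# Minimal host-side sketch for reading ADC samples sent over RS-232.
# The serial port name, baud rate, and the 2-byte big-endian sample
# framing are assumptions made for illustration; the record does not
# specify the protocol used by the FPGA design.

import serial  # pyserial

def read_samples(port="/dev/ttyUSB0", baud=115200, n_samples=10):
    samples = []
    with serial.Serial(port, baudrate=baud, timeout=1.0) as ser:
        while len(samples) < n_samples:
            raw = ser.read(2)          # assumed: one 12-bit sample per 2 bytes
            if len(raw) < 2:
                continue               # timeout, keep waiting
            value = int.from_bytes(raw, "big") & 0x0FFF
            samples.append(value)
    return samples

if __name__ == "__main__":
    print(read_samples(n_samples=5))
```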

  4. Parameter Resetting in Second Language Acquisition. University Research Institute Final Project Report, 1987-88.

    ERIC Educational Resources Information Center

    Phinney-Liapis, Marianne

    Analyses of the Null Subject Parameter (NSP) suggest that several factors may influence the resetting process in second language acquisition, such as specific "trigger" data, awareness of agreement (INFL), and stylistic rules such as subject postposing and anaphoric reference. Four tests were administered to…

  5. The CDF LEVEL3 trigger

    SciTech Connect

    Carroll, T.; Joshi, U.; Auchincloss, P.

    1989-04-01

    CDF is currently taking data at a luminosity of 10^30 cm^-2 s^-1 using a four-level event filtering scheme. The fourth level, LEVEL3, uses ACP (Fermilab's Advanced Computer Program) designed 32-bit VME-based parallel processors (1) capable of executing algorithms written in FORTRAN. LEVEL3 currently rejects about 50% of the events.

  6. Aspirin-triggered metabolites of EFAs.

    PubMed

    Makriyannis, Alexandros; Nikas, Spyros P

    2011-10-28

    Aspirin triggers the biosynthesis of oxygenated metabolites from arachidonic, eicosapentaenoic, and docosahexaenoic (DHA) acids. In a preceding issue, Serhan et al. (2011) describe a novel aspirin-triggered DHA pathway for the biosynthesis of a potent anti-inflammatory and proresolving molecule. PMID:22035788

  7. Hierarchical trigger of the ALICE calorimeters

    NASA Astrophysics Data System (ADS)

    Muller, Hans; Awes, Terry C.; Novitzky, Norbert; Kral, Jiri; Rak, Jan; Schambach, Jo; Wang, Yaping; Wang, Dong; Zhou, Daicui

    2010-05-01

    The trigger of the ALICE electromagnetic calorimeters is implemented in two hierarchically connected layers of electronics. In the lower layer, level-0 algorithms search for shower energy above threshold in locally confined Trigger Region Units (TRU). The top layer is implemented as a single, global trigger unit that receives the trigger data from all TRUs as input to the level-1 algorithm. This architecture was first developed for the PHOS high-pT photon trigger before it was also adopted by EMCal for the jet trigger. TRU units digitize up to 112 analogue input signals from the Front End Electronics (FEE) and concentrate their digital stream in a single FPGA. A charge and time summing algorithm is combined with a peakfinder that suppresses spurious noise and is precise to single LHC bunches. With a peak-to-peak noise level of 150 MeV, the linear dynamic range above threshold spans from MIP energies at 215 MeV up to 50 GeV. Local level-0 decisions take less than 600 ns after LHC collisions, upon which all TRUs transfer their level-0 trigger data to the upstream global trigger module, which searches within the remaining level-1 latency for high-pT gamma showers (PHOS) and/or jet cone areas (EMCal).
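
    The charge-summing and peak-finding step can be illustrated with a simple software model: form a running sum over a few consecutive samples and report the first local maximum above threshold. The Python sketch below is such a model under assumed window length, threshold, and sample values; it is not the ALICE TRU firmware algorithm.

```python
# Illustrative model of a running charge sum followed by a simple peak
# finder over digitized samples, in the spirit of the TRU algorithm
# described above. Window length, threshold, and sample values are
# placeholders, not ALICE parameters.

def sliding_sums(samples, window=4):
    """Return the running sum over `window` consecutive samples."""
    return [sum(samples[i:i + window]) for i in range(len(samples) - window + 1)]

def find_peak(sums, threshold):
    """Report the first local maximum above threshold, or None.

    Requiring a rise followed by a fall suppresses slowly varying noise
    and pins the peak to a single sample (one bunch crossing in the
    hardware analogy)."""
    for i in range(1, len(sums) - 1):
        if sums[i] > threshold and sums[i - 1] < sums[i] >= sums[i + 1]:
            return i, sums[i]
    return None

if __name__ == "__main__":
    adc = [2, 3, 2, 10, 25, 40, 33, 12, 4, 3, 2, 2]
    sums = sliding_sums(adc, window=4)
    print(find_peak(sums, threshold=50))  # -> (4, 110)
```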

  8. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  9. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  10. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  11. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  12. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation....

  13. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of...

  14. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of...

  15. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of...

  16. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of...

  17. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of...

  18. Psychological tools for knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Rueter, Henry H.; Olson, Judith Reitman

    1988-01-01

    Knowledge acquisition is said to be the biggest bottleneck in the development of expert systems. The problem is getting the knowledge out of the expert's head and into a computer. In cognitive psychology, characterizing mental structures and explaining why experts are good at what they do is an important research area. Is there some way that the tools psychologists have developed to uncover mental structure can be used to benefit knowledge engineers? We think that the way to find out is to browse through the psychologist's toolbox to see what there is in it that might be of use to knowledge engineers. Expert system developers have relied on two standard methods for extracting knowledge from the expert: (1) the knowledge engineer engages in an intense bout of interviews with the expert or experts, or (2) the knowledge engineer becomes an expert himself, relying on introspection to uncover the basis of his own expertise. Unfortunately, these techniques suffer from the difficulty that the expert often isn't consciously aware of the basis of his expertise. If the expert himself isn't conscious of how he solves problems, introspection is useless. Cognitive psychology has faced similar problems for many years and has developed exploratory methods that can be used to discover cognitive structure from simple data.

  19. The H1 neural network trigger project

    NASA Astrophysics Data System (ADS)

    Kiesling, C.; Denby, B.; Fent, J.; Fröchtenicht, W.; Garda, P.; Granado, B.; Grindhammer, G.; Haberer, W.; Janauschek, L.; Kobler, T.; Koblitz, B.; Nellen, G.; Prevotet, J.-C.; Schmidt, S.; Tzamariudaki, E.; Udluft, S.

    2001-08-01

    We present a short overview of neuromorphic hardware and some of the physics projects making use of such devices. As a concrete example we describe an innovative project within the H1 experiment at the electron-proton collider HERA, implementing hardwired neural networks as pattern-recognition machines to discriminate between wanted physics and uninteresting background at the trigger level. The decision time of the system is less than 20 microseconds, typical for a modern second-level trigger. The neural trigger has been running successfully for the past four years and has produced new physics results from H1 that were so far unobtainable with other triggering schemes. We describe the concepts and the technical realization of the neural network trigger system, present the most important physics results, and motivate an upgrade of the system for the future high-luminosity running at HERA. The upgrade concentrates on "intelligent preprocessing" of the neural inputs, which helps to strongly improve the networks' discrimination power.
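
    To make the idea concrete, the sketch below evaluates a small feed-forward network (one hidden layer of sigmoid units) on a vector of preprocessed trigger quantities and converts its output into a keep/reject decision. The layer sizes, inputs, and (random) weights are placeholders chosen for illustration; the H1 trigger implemented such networks in dedicated hardware, not in software.

```python
# Minimal sketch of a small feed-forward network used as a trigger-level
# pattern classifier. Input quantities, layer sizes, and weights are
# placeholders, not those of the H1 neural trigger.

import numpy as np

rng = np.random.default_rng(0)

def make_net(n_in=16, n_hidden=8):
    """One hidden layer of sigmoid units and a single sigmoid output."""
    return {
        "W1": rng.normal(size=(n_hidden, n_in)), "b1": np.zeros(n_hidden),
        "W2": rng.normal(size=(1, n_hidden)),    "b2": np.zeros(1),
    }

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decide(net, x, threshold=0.5):
    """Return True (keep the event) if the network output exceeds threshold."""
    h = sigmoid(net["W1"] @ x + net["b1"])
    y = sigmoid(net["W2"] @ h + net["b2"])
    return bool(y[0] > threshold)

if __name__ == "__main__":
    net = make_net()
    trigger_inputs = rng.uniform(0.0, 1.0, size=16)  # preprocessed quantities
    print(decide(net, trigger_inputs))
```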

  20. The LHCb trigger and its upgrade

    NASA Astrophysics Data System (ADS)

    Dziurda, A.

    2016-07-01

    The current LHCb trigger system consists of a hardware level, which reduces the LHC inelastic collision rate of 30 MHz to 1 MHz, at which the entire detector is read out. In a second level, implemented in a farm of 20k parallel-processing CPUs, the event rate is reduced to about 5 kHz. We review the performance of the LHCb trigger system during Run I of the LHC. Special attention is given to the use of multivariate analyses in the High Level Trigger. The major bottleneck for hadronic decays is the hardware trigger. LHCb plans a major upgrade of the detector and DAQ system in the LHC shutdown of 2018, enabling a purely software-based trigger to process the full 30 MHz of inelastic collisions delivered by the LHC. We demonstrate that the planned architecture will be able to meet this challenge.
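
    As a quick cross-check of the quoted rates, the following back-of-the-envelope Python lines compute the rejection factors implied by a 30 MHz inelastic rate, a roughly 1 MHz hardware-trigger readout rate, and a roughly 5 kHz High Level Trigger output. The intermediate 1 MHz figure is the Run I readout rate assumed here; it is not stated explicitly in the record.

```python
# Back-of-the-envelope rejection factors implied by the rates discussed
# above (Run I figures; the 1 MHz hardware-trigger output is an assumed
# value for illustration). Purely arithmetic.

inelastic_rate = 30e6   # Hz, LHC inelastic collision rate
l0_output      = 1e6    # Hz, assumed hardware-level readout rate
hlt_output     = 5e3    # Hz, High Level Trigger output rate

print(f"hardware-level rejection: {inelastic_rate / l0_output:.0f}x")   # 30x
print(f"software (HLT) rejection: {l0_output / hlt_output:.0f}x")       # 200x
print(f"overall rejection:        {inelastic_rate / hlt_output:.0f}x")  # 6000x
```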