Science.gov

Sample records for GERMcode GCR event-based

  1. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A), and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from the primary ion and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick-target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their…
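
    As an illustration of the Poisson hit statistics described above, a minimal Python sketch follows; the fluence and cell-nucleus area are hypothetical example values, not parameters taken from the GERMcode.

    import math

    def hit_probability(fluence_per_um2, area_um2, k):
        """P(exactly k ion traversals) for a cell of the given area at the given fluence."""
        mean_hits = fluence_per_um2 * area_um2  # expected traversals per cell
        return math.exp(-mean_hits) * mean_hits ** k / math.factorial(k)

    area = 100.0     # hypothetical nuclear cross-sectional area, um^2
    fluence = 0.01   # hypothetical fluence, ions per um^2
    for k in range(4):
        print(f"P({k} hits) = {hit_probability(fluence, area, k):.3f}")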

  2. Development of a GCR Event-based Risk Model

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress, or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo-based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparison with physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charges and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-β-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how…
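
    A minimal sketch of the event-based bookkeeping idea, assuming a simple exponential free-path model: each simulated nuclear interaction is stored as a position-tagged record rather than folded into an average dose. The slab depth and mean free path are hypothetical placeholders.

    import random

    def simulate_ion_history(depth_cm, mean_free_path_cm):
        """Return position-tagged nuclear interaction events for one primary ion."""
        events, x = [], 0.0
        while True:
            x += random.expovariate(1.0 / mean_free_path_cm)  # free flight to next collision
            if x > depth_cm:
                break                                          # ion exits the slab
            events.append(round(x, 2))                         # record the tagged event
        return events

    random.seed(1)
    for i in range(3):
        print(f"ion {i}: interactions at {simulate_ion_history(20.0, 8.0)} cm")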

  3. GERMcode: A Stochastic Model for Space Radiation Risk Assessment

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2012-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal…
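
    The radial dose mentioned above is commonly approximated in amorphous track-structure models by a 1/r^2 falloff scaled by (Z_eff/beta)^2; the sketch below uses that generic form with illustrative numbers and does not reproduce the GERMcode's actual parameterization.

    def radial_dose(r_um, z_eff, beta, k=1.0):
        """Generic amorphous-track radial dose, k * Zeff^2 / (beta^2 * r^2), arbitrary units."""
        return k * z_eff ** 2 / (beta ** 2 * r_um ** 2)

    for r in (0.01, 0.1, 1.0, 10.0):  # radial distances from the ion track, um
        print(f"r = {r:6.2f} um : D = {radial_dose(r, z_eff=26, beta=0.8):.3e}")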

  4. Mixed-field GCR Simulations for Radiobiological Research Using Ground Based Accelerators

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis A.

    2014-01-01

    Space radiation is comprised of a large number of particle types and energies, with differential ionization power, from high-energy protons to high charge and energy (HZE) particles and the secondary neutrons produced by galactic cosmic rays (GCR). Ground-based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and for dosimetry, electronics parts, and shielding testing, using mono-energetic beams of single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and higher-energy beam extraction up to 1.5 GeV/u, can integrate multiple ions into a single simulation to create the GCR Z-spectrum in major energy bins. After considering the GCR environment and the energy limitations of NSRL, a GCR reference field is proposed following extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20% accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to relate chronic GCR exposure of up to 3 years to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposures of up to a few weeks and fractionation approaches at a GCR simulator.

  5. GCR environmental models I: Sensitivity analysis for GCR environments

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Blattnig, Steve R.

    2014-04-01

    Accurate galactic cosmic ray (GCR) models are required to assess crew exposure during long-duration missions to the Moon or Mars. Many of these models have been developed and compared to available measurements, with uncertainty estimates usually stated to be less than 15%. However, when the models are evaluated over a common epoch and propagated through to effective dose, relative differences exceeding 50% are observed. This indicates that the metrics used to communicate GCR model uncertainty can be better tied to exposure quantities of interest for shielding applications. This is the first of three papers focused on addressing this need. In this work, the focus is on quantifying the extent to which each GCR ion and energy group, prior to entering any shielding material or body tissue, contributes to effective dose behind shielding. Results can be used to more accurately calibrate model-free parameters and provide a mechanism for refocusing validation efforts on measurements taken over important energy regions. Results can also be used as references to guide future nuclear cross-section measurements and radiobiology experiments. It is found that GCR with Z > 2 and boundary energies below 500 MeV/n induce less than 5% of the total effective dose behind shielding. This finding is important given that most of the GCR models are developed and validated against Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer (ACE/CRIS) measurements taken below 500 MeV/n. It is therefore possible for two models to very accurately reproduce the ACE/CRIS data while inducing very different effective dose values behind shielding.
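
    The sensitivity result quoted above amounts to decomposing effective dose into (charge group, energy group) contributions and reporting each group's share. The contribution values in this Python sketch are made-up placeholders, not output of the authors' models.

    contributions = {
        ("Z=1-2", "<500 MeV/n"): 18.0,
        ("Z=1-2", ">500 MeV/n"): 42.0,
        ("Z>2",   "<500 MeV/n"):  4.0,  # the group the paper finds to be under 5%
        ("Z>2",   ">500 MeV/n"): 36.0,
    }
    total = sum(contributions.values())
    for group, value in contributions.items():
        print(group, f"-> {100.0 * value / total:.1f}% of effective dose")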

  6. Mutations in Gcr1, a Transcriptional Activator of Saccharomyces cerevisiae Glycolytic Genes, Function as Suppressors of Gcr2 Mutations

    PubMed Central

    Uemura, H.; Jigami, Y.

    1995-01-01

    The Saccharomyces cerevisiae GCR1 and GCR2 genes affect expression of most of the glycolytic genes. Evidence for Gcr1p/Gcr2p interaction has been presented earlier and is now supported by the isolation of mutations in Gcr1p suppressing gcr2, as assessed by growth and enzyme assay. Four specific mutation sites were identified. Together with use of the two-hybrid system of Fields and Song, they show that Gcr1p in its N-terminal half has a potential transcriptional activating function as well as elements for interaction with Gcr2p, which perhaps acts normally to expose an otherwise cryptic activation domain on Gcr1p. Complementation of various gcr1 mutant alleles and results with the two-hybrid system also indicate that Gcr1p itself normally functions as an oligomer. PMID:7713414

  7. Event-Based Science.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    1992-01-01

    Suggests that an event-based science curriculum can provide the framework for deciding what to retain in an overloaded science curriculum. Provides examples of current events and the science concepts explored related to the event. (MDH)

  8. Nuclear interactions in heavy ion transport and event-based risk models.

    PubMed

    Cucinotta, Francis A; Plante, Ianik; Ponomarev, Artem L; Kim, Myung-Hee Y

    2011-02-01

    The physical description of the passage of heavy ions in tissue and shielding materials is of interest in radiobiology, cancer therapy and space exploration, including a human mission to Mars. Galactic cosmic rays (GCRs) consist of a large number of ion types and energies. Energy loss processes occur continuously along the path of heavy ions and are well described by the linear energy transfer (LET), straggling and multiple scattering algorithms. Nuclear interactions lead to much larger energy deposition than atomic-molecular collisions and alter the composition of heavy ion beams while producing secondary nuclei, often in high-multiplicity events. The major nuclear interaction processes of importance for describing heavy ion beams were reviewed, including nuclear fragmentation, elastic scattering and knockout-cascade processes. The quantum multiple scattering fragmentation model is shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections and is studied for application to thick target experiments. A new computer model, which was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL), called the GCR Event-based Risk Model (GERMcode), is described. PMID:21242169

  9. Isotopic Dependence of GCR Fluence behind Shielding

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Wilson, John W.; Saganti, Premkumar; Kim, Myung-Hee Y.; Cleghorn, Timothy; Zeitlin, Cary; Tripathi, Ram K.

    2006-01-01

    In this paper we consider the effects of the isotopic composition of the primary galactic cosmic rays (GCR), nuclear fragmentation cross sections, and the isotopic grid on the solution of transport models used for shielding studies. Satellite measurements are used to describe the isotopic composition of the GCR. For the nuclear interaction database and transport solution, we use the quantum multiple-scattering theory of nuclear fragmentation (QMSFRG) and the high charge and energy (HZETRN) transport code, respectively. The QMSFRG model is shown to accurately describe existing fragmentation data, including a proper description of the odd-even effects as a function of the isospin of the projectile nucleus. The principal finding of this study is that large errors (+/-100%) will occur in the mass-fluence spectra when comparing transport models that use a complete isotopic grid (approximately 170 ions) to ones that use a reduced isotopic grid, for example the 59-ion grid used in the HZETRN code in the past; however, less significant errors (<+/-20%) occur in the elemental-fluence spectra. Because a complete isotopic grid is readily handled on small computer workstations and is needed for several applications studying GCR propagation and scattering, it is recommended that complete isotopic grids be used for future GCR studies.

  10. GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-01-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic ray (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurement uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 models also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.
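
    A hedged sketch of a validation metric that accounts for measurement uncertainty: residuals are normalized by the measurement sigma before averaging. This is a generic form offered for illustration; the paper defines its own metric, which is not reproduced here, and the flux points below are hypothetical.

    import math

    def normalized_rms_error(model, measured, sigma):
        """RMS of model-measurement residuals in units of measurement uncertainty."""
        z = [(m - d) / s for m, d, s in zip(model, measured, sigma)]
        return math.sqrt(sum(v * v for v in z) / len(z))

    # Hypothetical flux points (arbitrary units) with measurement uncertainties:
    print(normalized_rms_error(model=[1.1, 2.3, 3.0],
                               measured=[1.0, 2.0, 3.2],
                               sigma=[0.1, 0.2, 0.3]))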

  11. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    NASA Technical Reports Server (NTRS)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibration to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  12. Isotopic Effects in Nuclear Fragmentation and GCR Transport Problems

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2002-01-01

    Improving the accuracy of galactic cosmic ray (GCR) environment and transport models is an important goal in preparing for studies of the projected risks and the efficiency of potential mitigation methods for space exploration. In this paper we consider the effects of the isotopic composition of the primary cosmic rays and the isotopic dependence of nuclear fragmentation cross sections on GCR transport models. Measurements are used to describe the isotopic composition of the GCR, including their modulation throughout the solar cycle. The quantum multiple-scattering approach to nuclear fragmentation (QMSFRG) is used as the database generator in order to accurately describe the odd-even effect in fragment production. Using the Badhwar and O'Neill GCR model, the QMSFRG model, and the HZETRN transport code, the effect of the isotopic dependence of the primary GCR composition and of fragment production on transport problems is described for a complete GCR isotopic grid. The principal finding of this study is that large errors (+/-100%) will occur in the mass-flux spectra when comparing the complete isotopic grid (141 ions) to a reduced isotopic grid (59 ions); however, less significant errors (<+/-30%) occur in the elemental-flux spectra. Because the full isotopic grid is readily handled on small computer workstations, it is recommended that complete isotopic grids be used for future GCR studies.

  13. A Stochastic Model of Space Radiation Transport as a Tool in the Development of Time-Dependent Risk Assessment

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Nounu, Hatem N.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2011-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) [1] for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of heavy ions in tissue and shielding materials is made with a stochastic approach that includes both ion track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model [2]. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections.

  14. GCR Simulator Development Status at the NASA Space Radiation Laboratory

    NASA Technical Reports Server (NTRS)

    Slaba, T. C.; Norbury, J. W.; Blattnig, S. R.

    2015-01-01

    There are large uncertainties connected to the biological response for exposure to galactic cosmic rays (GCR) on long duration deep space missions. In order to reduce the uncertainties and gain understanding about the basic mechanisms through which space radiation initiates cancer and other endpoints, radiobiology experiments are performed with mono-energetic ion beams. Some of the accelerator facilities supporting such experiments have matured to a point where simulating the broad range of particles and energies characteristic of the GCR environment in a single experiment is feasible from a technology, usage, and cost perspective. In this work, several aspects of simulating the GCR environment at the NASA Space Radiation Laboratory (NSRL) are discussed. First, comparisons are made between direct simulation of the external, free space GCR field, and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at NSRL limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, a reference environment for the GCR simulator, suitable for deep space missions, is identified and described in terms of fluence and integrated dosimetric quantities. Analysis results are given to justify the use of a single reference field over a range of shielding conditions and solar activities. Third, an approach for simulating the reference field at NSRL is presented. The approach directly considers the hydrogen and helium energy spectra, and the heavier ions are collectively represented by considering the linear energy transfer (LET) spectrum. While many more aspects of the experimental setup need to be considered before final implementation of the GCR simulator, this preliminary study provides useful information that should aid the final design. Possible drawbacks of the proposed methodology are discussed and weighed…

  15. Host Event Based Network Monitoring

    SciTech Connect

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and its effects on host performance. Current host based network monitoring tools work by polling, which can miss activity if it occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered: first, the new Windows Event Logging API, and second, the ALE (Application Layer Enforcement) API within the Windows Filtering Platform (WFP). Any future work should focus on these methods.
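
    The event-versus-polling distinction can be sketched with Python's standard selectors module, where the kernel pushes readiness notifications instead of the program repeatedly sampling state. This is a generic illustration, not the INL tool or the Windows APIs named above.

    import selectors
    import socket

    sel = selectors.DefaultSelector()
    listener = socket.socket()
    listener.bind(("127.0.0.1", 0))  # ephemeral port, illustration only
    listener.listen()
    listener.setblocking(False)
    sel.register(listener, selectors.EVENT_READ)

    # One pass of an event loop; a real monitor would loop and log every event.
    for key, _ in sel.select(timeout=0.1):
        conn, addr = key.fileobj.accept()
        print("connection event from", addr)  # asynchronous notification, no polling
        conn.close()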

  16. GCR as a source for the inner radiation belt of Saturn.

    NASA Astrophysics Data System (ADS)

    Kotova, A.; Roussos, E.; Krupp, N.; Dandouras, I. S.

    2014-12-01

    During Cassini's insertion orbit in 2004, the Ion and Neutral Camera measured significant fluxes of energetic neutral atoms (ENA) coming from the area between the D-ring and Saturn's atmosphere, which raised the idea of a possible innermost radiation belt in this narrow gap (1). There are two main sources of energetic charged particles for such an inner radiation belt: the interaction of Galactic Cosmic Rays (GCR) with Saturn's atmosphere and rings, which through the CRAND (cosmic ray albedo neutron decay) process can produce keV-MeV ions and electrons in the region, and the double charge exchange of ENAs coming from the middle magnetosphere, which can bring keV ions into the region of interest. Using the particle tracer developed in our group, together with the GEANT4 software, we study these two processes in detail. With the particle tracer we evaluate GCR access to Saturn's atmosphere and rings. Simulation of GCR trajectories allows us to calculate the energy spectra of the arriving energetic particles, which is much more accurate than the spectra predicted analytically from Stoermer theory, since the simulation includes the effects of the ring shadow and non-dipolar processes in the magnetosphere. Using the GEANT4 software, the penetration of GCR through the matter of the rings was simulated, and the production of secondary particles was estimated. Finally, the motion of the secondaries was simulated using the particle tracer, and the energy spectrum of the neutrons whose decay leads to the production of the final CRAND elements in the inner Saturnian radiation belts was evaluated. We show that for the inner radiation belt, the most energetic ions come from GCR interactions with the rings and their penetration, and from the interaction of secondaries with Saturn's atmosphere. This simulation allows us to predict the fluxes of energetic ions and electrons that the particle detector MIMI/LEMMS onboard Cassini can measure during the so-called "proximal…

  17. Nuclear fragmentation database for GCR transport code development

    NASA Astrophysics Data System (ADS)

    Zeitlin, C.; Guetersloh, S.; Heilbronn, L.; Miller, J.; Fukumura, A.; Iwata, Y.; Murakami, T.; Sihver, L.

    2010-09-01

    A critical need for NASA is the ability to accurately model the transport of heavy ions in the Galactic Cosmic Rays (GCR) through matter, including spacecraft walls, equipment racks, etc. Nuclear interactions are of great importance in the GCR transport problem, as they can cause fragmentation of the incoming ion into lighter ions. Since the radiation dose delivered by a particle is proportional to the square of (charge/velocity), fragmentation reduces the dose delivered by incident ions. The other mechanism by which dose can be reduced is ionization energy loss, which can lead to some particles stopping in the shielding. This is the conventional notion of shielding, but it is not applicable to human spaceflight since the particles in the GCR tend to be too energetic to be stopped in the relatively thin shielding that is possible within payload mass constraints. Our group has measured a large number of fragmentation cross sections, intended to be used as input to, or for validation of, NASA's radiation transport models. A database containing over 200 charge-changing cross sections and over 2000 fragment production cross sections has been compiled. In this report, we examine in detail the contrast between fragment measurements at large acceptance and small acceptance. We use output from the PHITS Monte Carlo code to test our assumptions using as an example 40Ar data (and simulated data) at a beam energy of 650 MeV/nucleon. We also present preliminary analysis in which isotopic resolution was attained for beryllium fragments produced by beams of 10B and 11B. Future work on the experimental data set will focus on extracting and interpreting production cross sections for light fragments.
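
    Since dose scales as the square of (charge/velocity), the dose-reduction effect of fragmentation can be illustrated directly; the fragment charges and the shared velocity in this sketch are hypothetical.

    def dose_weight(z, beta):
        """Relative dose deposited by an ion of charge z at velocity beta (v/c)."""
        return (z / beta) ** 2

    beta = 0.8                                   # hypothetical shared velocity
    primary = dose_weight(26, beta)              # e.g. an iron primary
    fragments = dose_weight(14, beta) + dose_weight(12, beta)  # hypothetical split
    print(f"fragments deposit {fragments / primary:.2f} of the primary's dose")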

  18. The Dynamic Outer Heliosphere and Preliminary Analysis of GCR Trajectories

    NASA Astrophysics Data System (ADS)

    Washimi, Haruichi; Zank, Gary P.; Hu, Qiang; Tanaka, Takashi; Munakata, Kazuoki; Shinagawa, Hiroyuki

    2010-12-01

    We show realistic, time-varying 3D MHD models of the outer heliosphere which simultaneously satisfy the termination shock (TS) crossing times observed by both Voyager 1 (V1) and Voyager 2 (V2). The short-term variations found are: a) the TS position increases whenever a solar-wind high-ram-pressure pulse collides with the TS; b) a large-amplitude magneto-sonic pulse is generated downstream of the TS when a solar-wind high-ram-pressure pulse collides with the TS; c) the generated pulse propagates outward in the heliosheath and is reflected at the plasma sheet; and d) when the reflected pulse collides with the TS, the TS position decreases. We also present preliminary results for galactic cosmic ray (GCR) trajectories as they respond to the three-dimensional global electric and magnetic fields in the outer heliosphere. This allows us to investigate (1) how GCRs cross the heliosphere and enter the inner heliosphere, and (2) their long-term variation. Preliminary GCR distributions in the outer heliosphere are shown. GCR diffusion due to magnetic-field fluctuations is not taken into account in this analysis.

  19. Asynchronous event-based Hebbian epipolar geometry.

    PubMed

    Benosman, Ryad; Ieng, Sio-Hoï; Rogister, Paul; Posch, Christoph

    2011-11-01

    Epipolar geometry, the cornerstone of perspective stereo vision, has been studied extensively since the advent of computer vision. Establishing such a geometric constraint is of primary importance, as it allows the recovery of the 3-D structure of scenes. Estimating the epipolar constraints of nonperspective stereo is difficult, as they can no longer be defined because of the complexity of the sensor geometry. This paper shows that these limitations are, to some extent, a consequence of the static image frames commonly used in vision. The conventional frame-based approach suffers from a lack of the dynamics present in natural scenes. We introduce the use of neuromorphic event-based, rather than frame-based, vision sensors for perspective stereo vision. This type of sensor uses the dimension of time as the main conveyor of information. In this paper, we present a model for asynchronous event-based vision, which is then used to derive a new general concept of epipolar geometry linked to the temporal activation of pixels. Practical experiments demonstrate the validity of the approach, solving the problem of estimating the fundamental matrix applied, in a first stage, to classic perspective vision and then to more general cameras. Furthermore, this paper shows that the properties of event-based vision sensors allow the exploration of not-yet-defined geometric relationships. Finally, we provide a definition of general epipolar geometry deployable to almost any visual sensor. PMID:21954205

  20. Landscape of international event-based biosurveillance

    PubMed Central

    Hartley, DM; Nelson, NP; Walters, R; Arthur, R; Yangarber, R; Madoff, L; Linge, JP; Mawudeku, A; Collier, N; Brownstein, JS; Thinus, G; Lightfoot, N

    2010-01-01

    Event-based biosurveillance is a scientific discipline in which diverse sources of data, many of which are available from the Internet, are characterized prospectively to provide information on infectious disease events. Biosurveillance complements traditional public health surveillance to provide both early warning of infectious disease events and situational awareness. The Global Health Security Action Group of the Global Health Security Initiative is developing a biosurveillance capability that integrates and leverages component systems from member nations. This work discusses these biosurveillance systems and identifies needed future studies. PMID:22460393

  21. Asynchronous event-based binocular stereo matching.

    PubMed

    Rogister, Paul; Benosman, Ryad; Ieng, Sio-Hoi; Lichtsteiner, Patrick; Delbruck, Tobi

    2012-02-01

    We present a novel event-based stereo matching algorithm that exploits the asynchronous visual events from a pair of silicon retinas. Unlike conventional frame-based cameras, recent artificial retinas transmit their outputs as a continuous stream of asynchronous temporal events, in a manner similar to the output cells of the biological retina. Our algorithm uses the timing information carried by this representation in addressing the stereo-matching problem on moving objects. Using the high temporal resolution of the acquired data stream for the dynamic vision sensor, we show that matching on the timing of the visual events provides a new solution to the real-time computation of 3-D objects when combined with geometric constraints using the distance to the epipolar lines. The proposed algorithm is able to filter out incorrect matches and to accurately reconstruct the depth of moving objects despite the low spatial resolution of the sensor. This brief sets up the principles for further event-based vision processing and demonstrates the importance of dynamic information and spike timing in processing asynchronous streams of visual events. PMID:24808513
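
    A minimal sketch of the matching idea: events from the two retinas are paired by temporal coincidence and then filtered by distance to the epipolar line. The thresholds and the rectified toy geometry are illustrative assumptions, not the authors' parameters.

    import math

    def point_line_distance(x, y, line):
        """Distance from (x, y) to the line a*x + b*y + c = 0."""
        a, b, c = line
        return abs(a * x + b * y + c) / math.hypot(a, b)

    def match_events(left, right, epipolar_line, dt_max, d_max):
        """Pair (t, x, y) events by time coincidence plus epipolar consistency."""
        matches = []
        for tl, xl, yl in left:
            for tr, xr, yr in right:
                if (abs(tl - tr) <= dt_max and
                        point_line_distance(xr, yr, epipolar_line(xl, yl)) <= d_max):
                    matches.append(((tl, xl, yl), (tr, xr, yr)))
        return matches

    # Rectified toy geometry: the epipolar line of (x, y) is the row y' = y.
    rectified = lambda x, y: (0.0, 1.0, -y)
    left = [(0.0100, 12.0, 30.0)]
    right = [(0.0101, 9.0, 30.5), (0.0500, 9.0, 30.0)]
    print(match_events(left, right, rectified, dt_max=5e-4, d_max=1.0))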

  22. On event-based optical flow detection

    PubMed Central

    Brosch, Tobias; Tschechne, Stephan; Neumann, Heiko

    2015-01-01

    Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection, ranging from gradient-based methods through plane-fitting to filter-based methods, and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion related activations. PMID:25941470
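
    The plane-fitting approach can be sketched in a few lines: locally, events generated by a moving edge lie near a plane t = a*x + b*y + c in (x, y, t) space, and the fitted slopes give the flow. Plain least squares here; the detectors discussed above add filtering and surround normalization.

    import numpy as np

    def fit_event_plane(events):
        """events: (N, 3) array of (x, y, t). Returns the flow estimate (vx, vy)."""
        xy1 = np.column_stack([events[:, 0], events[:, 1], np.ones(len(events))])
        (a, b, _), *_ = np.linalg.lstsq(xy1, events[:, 2], rcond=None)
        n2 = a * a + b * b                 # assumes a non-degenerate fit (n2 > 0)
        return (a / n2, b / n2)            # velocity along the plane's spatial gradient

    # Toy edge moving at 2 px/s in x, so t = x / 2 for every event.
    events = np.array([[x, y, x / 2.0] for x in range(5) for y in range(5)], dtype=float)
    print(fit_event_plane(events))         # ~ (2.0, 0.0)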

  23. Down-regulated Lotus japonicus GCR1 plants exhibit nodulation signalling pathways alteration.

    PubMed

    Rogato, Alessandra; Valkov, Vladimir Totev; Alves, Ludovico Martins; Apone, Fabio; Colucci, Gabriella; Chiurazzi, Maurizio

    2016-06-01

    G protein-coupled receptors (GPCRs) are integral membrane proteins involved in various signalling pathways, perceiving many extracellular signals and transducing them to heterotrimeric G proteins, which further transduce these signals to intracellular downstream effectors. GCR1 is the only reliable plant candidate as a member of the GPCR superfamily. In the legume/rhizobia symbiotic interaction, G proteins are involved in signalling pathways controlling different steps of the nodulation program. In order to investigate the putative hierarchic role played by GCR1 in these symbiotic pathways, we identified and characterized the Lotus japonicus gene encoding the seven-transmembrane GCR1 protein. The detailed molecular and topological analyses of LjGCR1 expression patterns that are presented suggest a possible involvement in the early steps of nodule organogenesis. Furthermore, phenotypic analyses of independent transgenic RNAi lines, showing a significant down-regulation of LjGCR1 expression, suggest an epistatic action in the control of molecular markers of nodulation pathways, although no macroscopic symbiotic phenotypes could be revealed. PMID:27095401

  24. Flexible gray component replacement (GCR) based on CIE L*a*b*

    NASA Astrophysics Data System (ADS)

    Ogatsu, Hitoshi; Murai, Kazumasa; Kita, Shinji

    1995-04-01

    To improve the color fidelity of 4-color reproduction and to increase the flexibility of Gray Component Replacement (GCR) for text and continuous-tone images, a novel GCR algorithm based on CIE L*a*b* signals is proposed. The algorithm consists of (1) a maximum (achromatic) black determination part, (2) a black adjustment part based on chroma, and (3) a 3-color determination part. In this configuration, the black signal is determined ahead of the CMY signals, and the freedom of the 3-input (L*a*b*) to 4-output (CMYBk) conversion is concentrated in part (2). The algorithm is examined on a xerographic color printer, using a neural network technique to resolve the conversion. As a result, it is shown that the algorithm can preserve color fidelity at any GCR rate and is applicable to both text and continuous-tone images.
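
    For orientation, the classic CMY-space gray component replacement rule is sketched below: the achromatic component min(C, M, Y) is partially replaced by black. The L*a*b*-based algorithm of this paper is more elaborate and is not reproduced here.

    def simple_gcr(c, m, y, rate=0.5):
        """Return (C, M, Y, K) after replacing `rate` of the gray component with black."""
        k = rate * min(c, m, y)       # gray component to move into the K channel
        return c - k, m - k, y - k, k

    print(simple_gcr(0.6, 0.5, 0.8))  # -> (0.35, 0.25, 0.55, 0.25)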

  25. Effect of Implantation Sequence on Tribological Behavior of GCr15 Steel by PBII

    NASA Astrophysics Data System (ADS)

    Gu, Le; Zhou, Hui; Cao, Guojian; Tang, Guangze; Ma, Xinxin; Wang, Liqin

    2016-05-01

    In the present work, the effect of implantation sequence on the tribological behavior of GCr15 steel treated by plasma-based ion implantation of carbon and nitrogen has been investigated. The treated GCr15 steels were characterized for microstructure and abrasive wear performance through a combination of Raman spectroscopy, nano-indentation, and wear tests. Raman spectroscopy indicated that diamond-like carbon (DLC) films were formed after implantation of carbon with or without implantation of nitrogen, and that implantation of nitrogen after implantation of carbon destroyed the graphite structure of the DLC films. The nano-indentation and wear tests showed that the nanohardness and wear resistance of the GCr15 steel treated with the nitrogen-carbon implantation sequence were better than those of the steel treated with the carbon-nitrogen sequence. Meanwhile, the properties improved with increasing carbon ion fluence.

  26. GCR Simulator Reference Field and a Spectral Approach for Laboratory Simulation

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Blattnig, Steve R.; Norbury, John W.; Rusek, Adam; La Tessa, Chiara; Walker, Steven A.

    2015-01-01

    The galactic cosmic ray (GCR) simulator at the NASA Space Radiation Laboratory (NSRL) is intended to deliver the broad spectrum of particles and energies encountered in deep space to biological targets in a controlled laboratory setting. In this work, certain aspects of simulating the GCR environment in the laboratory are discussed. Reference field specification and beam selection strategies at NSRL are the main focus, but the analysis presented herein may be modified for other facilities. First, comparisons are made between direct simulation of the external, free space GCR field and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at NSRL limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, variation in the induced tissue field associated with shielding configuration and solar activity is addressed. It is found that the observed variation is likely within the uncertainty associated with representing any GCR reference field with discrete ion beams in the laboratory, given current facility constraints. A single reference field for deep space missions is subsequently identified. Third, an approach for selecting beams at NSRL to simulate the designated reference field is presented. Drawbacks of the proposed methodology are discussed and weighed against alternative simulation strategies. The neutron component and track structure characteristics of the simulated field are discussed in this context.

  27. Reference field specification and preliminary beam selection strategy for accelerator-based GCR simulation

    NASA Astrophysics Data System (ADS)

    Slaba, Tony C.; Blattnig, Steve R.; Norbury, John W.; Rusek, Adam; La Tessa, Chiara

    2016-02-01

    The galactic cosmic ray (GCR) simulator at the NASA Space Radiation Laboratory (NSRL) is intended to deliver the broad spectrum of particles and energies encountered in deep space to biological targets in a controlled laboratory setting. In this work, certain aspects of simulating the GCR environment in the laboratory are discussed. Reference field specification and beam selection strategies at NSRL are the main focus, but the analysis presented herein may be modified for other facilities and possible biological considerations. First, comparisons are made between direct simulation of the external, free space GCR field and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at NSRL limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, variation in the induced tissue field associated with shielding configuration and solar activity is addressed. It is found that the observed variation is likely within the uncertainty associated with representing any GCR reference field with discrete ion beams in the laboratory, given current facility constraints. A single reference field for deep space missions is subsequently identified. Third, a preliminary approach for selecting beams at NSRL to simulate the designated reference field is presented. This approach is not a final design for the GCR simulator, but rather a single step within a broader design strategy. It is shown that the beam selection methodology is tied directly to the reference environment, allows facility constraints to be incorporated, and may be adjusted to account for additional constraints imposed by biological or animal care considerations. The major biology questions are not addressed herein but are discussed in a companion paper published in the present issue of this journal. Drawbacks of the proposed methodology are discussed…

  28. Multi-field coupled numerical simulation of hot reversible rolling process of GCr15 steel rod

    NASA Astrophysics Data System (ADS)

    Gu, Sendong; Zhang, Liwen; Ruan, Jinhua; Mei, Hongyu; Zhen, Yu; Shi, Xinhua

    2013-05-01

    In this paper, based on the rolling technology of a hot reversible rolling mill, a multi-field coupled finite element (FE) model of the hot reversible rolling process of large-dimension cross-section GCr15 steel rod is established. Thermal, mechanical, and microstructural phenomena during the rolling process are coupled in the model. By employing a grain growth experiment and double- and single-hit hot compression experiments, a mathematical model of austenite grain size growth and mathematical models of recrystallization behavior are determined, and a dedicated subprogram is coupled into the FE model. The actual hot reversible rolling process of GCr15 steel is simulated using the model, and the distribution and evolution of different field variables, such as temperature, effective strain, and austenite grain size, are obtained. To verify the model predictions, hot rolling experiments are carried out, and the temperature and microstructure of the rolled metal are compared with the predicted results. The comparison between the two sets of data shows good agreement.

  29. A Reference Field for GCR Simulation and an LET-Based Implementation at NSRL

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Blattnig, Steve R.; Walker, Steven A.; Norbury, John W.

    2015-01-01

    Exposure to galactic cosmic rays (GCR) on long duration deep space missions presents a serious health risk to astronauts, with large uncertainties connected to the biological response. In order to reduce the uncertainties and gain understanding about the basic mechanisms through which space radiation initiates cancer and other endpoints, radiobiology experiments are performed. Some of the accelerator facilities supporting such experiments have matured to a point where simulating the broad range of particles and energies characteristic of the GCR environment in a single experiment is feasible from a technology, usage, and cost perspective. In this work, several aspects of simulating the GCR environment in the laboratory are discussed. First, comparisons are made between direct simulation of the external, free space GCR field and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at the NASA Space Radiation Laboratory (NSRL) limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, variation in the induced tissue field associated with shielding configuration and solar activity is addressed. It is found that the observed variation is within physical uncertainties, allowing a single reference field for deep space missions to be defined. Third, an approach for simulating the reference field at NSRL is presented. The approach allows for the linear energy transfer (LET) spectrum of the reference field to be approximately represented with discrete ion and energy beams and implicitly maintains a reasonably accurate charge spectrum (or, average quality factor). Drawbacks of the proposed methodology are discussed and weighed against alternative simulation strategies. The neutron component and track structure characteristics of the proposed strategy are discussed in this context.
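
    The LET-binning idea behind the beam selection can be sketched as follows: the reference field's LET distribution is partitioned into bins, and each discrete surrogate beam is assigned its bin's integrated fluence. The LET values, weights, and bin edges here are hypothetical, not the paper's design values.

    import bisect

    reference_let = [0.2, 0.5, 1.2, 3.0, 8.0, 15.0, 40.0, 90.0]  # keV/um, hypothetical
    weights       = [ 30,  25,  15,  10,   8,    6,    4,    2]  # relative fluences
    bin_edges = [1.0, 10.0, 50.0]   # partition the LET axis into four groups

    fluence_per_beam = [0.0] * (len(bin_edges) + 1)
    for let, w in zip(reference_let, weights):
        fluence_per_beam[bisect.bisect(bin_edges, let)] += w
    print(fluence_per_beam)          # fluence each surrogate beam must deliver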

  30. Effect of Ultrasonic Treatment on the Solidification Microstructure of GCr15 Bearing Steel

    NASA Astrophysics Data System (ADS)

    Wang, Jianjun; Shi, Xiaofang; Chang, Lizhong; Wang, Haijun; Meng, Lipeng

    2016-02-01

    Ultrasonic treatment at various powers is applied to the liquid steel from the side wall of a mold during GCr15 steel solidification, and the effect of the ultrasound on the microstructure and properties of GCr15 steel is investigated. Results show that the columnar grains in the GCr15 steel are coarse and the microstructure is inhomogeneous when ultrasound is not applied to the liquid steel. A suitable ultrasonic power leads to the appearance of a large number of equiaxed grains and increases the uniformity of the microstructure. The segregation of alloying elements gradually decreases as the power increases from 0 W to 500 W. The maximum segregations of carbon and silicon decrease from 2.541 to 1.129 and from 2.861 to 1.196, respectively. At a power of 500 W, the statistical segregations of carbon and silicon decrease from 0.0964 to 0.0693 and from 0.1152 to 0.1075, respectively. A further increase in ultrasonic power is not conducive to improving the element segregation. Ultrasonic treatment can remarkably refine the size of carbides and increase the uniformity of their distribution. At powers of 0 W, 300 W, 500 W, 700 W, and 1,000 W, the average sizes of carbide are 14.63 μm, 2.96 μm, 3.05 μm, 3.72 μm, and 7.83 μm, respectively. The tensile strength, yield strength, ductility, and reduction of area of the GCr15 bearing steel are correspondingly improved to varying degrees.

  15. Badhwar-O'Neil 2007 Galactic Cosmic Ray (GCR) Model Using Advanced Composition Explorer (ACE) Measurements for Solar Cycle 23

    NASA Technical Reports Server (NTRS)

    O'Neill, P. M.

    2007-01-01

    Advanced Composition Explorer (ACE) satellite measurements of the galactic cosmic ray flux and correlation with the Climax Neutron Monitor count over Solar Cycle 23 are used to update the Badhwar O'Neill Galactic Cosmic Ray (GCR) model.

  16. Geo-effectiveness and GCR-effectiveness of Interplanetary Coronal Mass Ejections Observed during the Solar Cycle 24

    NASA Astrophysics Data System (ADS)

    Aslam, O. P. M.; Badruddin, B.

    2016-07-01

    We study the geomagnetic and galactic cosmic ray (GCR) response to interplanetary coronal mass ejections (ICMEs) observed during the period 2010 - 2015. We identify the distinct features of ICMEs during their passage. We analyze hourly resolution data from geomagnetic indices and ground-based neutron monitors, together with interplanetary plasma and field parameters at the same time resolution, to identify the features of ICMEs and solar wind parameters during their passage when the GCR intensity is affected to its maximum level. Similarly, we identify the features of ICMEs and solar wind parameters during their passage when the geo-effectiveness is at its maximum level. We discuss the similarities and distinctions in the geo-effectiveness and GCR-effectiveness of the same ICME structure in the light of plasma and field variations, and the physical mechanism(s) playing an important role in influencing the GCR intensity and geomagnetic activity.

  17. HZETRN: neutron and proton production in quasi-elastic scattering of GCR heavy-ions

    NASA Technical Reports Server (NTRS)

    Shavers, M. R.; Cucinotta, F. A.; Wilson, J. W.

    2001-01-01

    The development of transport models for radiation shielding design and evaluation has provided a series of deterministic computer codes that describe galactic cosmic radiation (GCR), solar particle events, and experimental beams at particle accelerators. These codes continue to be modified to accommodate new theory and improvements to the particle interaction database (Cucinotta et al., 1994, NASA Technical Paper 3472, US Government Printing Office, Washington DC). The solution employed by the heavy-ion transport code HZETRN was derived with the assumption that nuclear fragments are emitted with the same velocity as the incident ion through velocity-conserving nuclear interactions. This paper presents a version of the HZETRN transport code that provides a more realistic distribution of the energy of protons and neutrons emitted from GCR interactions in shields. This study shows that the expected GCR dose equivalent is lower than previously calculated for water shields that are less than 110 g cm-2 thick. Calculations of neutron energy spectra in low Earth orbit indicate substantial contributions from relativistic neutrons. ©2001 Elsevier Science Ltd. All rights reserved.
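
    For context, HZETRN-type codes solve the Boltzmann transport equation in the straight-ahead, continuous-slowing-down approximation. Schematically, in commonly used notation (reconstructed here, not quoted from the paper):

\[
\left[\frac{\partial}{\partial x}
 - \frac{1}{A_j}\frac{\partial}{\partial E}\,S_j(E)
 + \sigma_j(E)\right]\phi_j(x,E)
 = \sum_{k}\int_E^{\infty}\sigma_{jk}(E,E')\,\phi_k(x,E')\,dE',
\]

    where \(\phi_j\) is the flux of ion species \(j\) at depth \(x\), \(S_j\) the stopping power, \(\sigma_j\) the total interaction cross section, and \(\sigma_{jk}\) the cross section for producing species \(j\) from collisions of species \(k\). The velocity-conserving assumption discussed in the abstract enters through the production term \(\sigma_{jk}\); the version presented in the paper relaxes that assumption for the emitted protons and neutrons.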

  18. Characterization of the DNA-binding activity of GCR1: in vivo evidence for two GCR1-binding sites in the upstream activating sequence of TPI of Saccharomyces cerevisiae.

    PubMed Central

    Huie, M A; Scott, E W; Drazinic, C M; Lopez, M C; Hornstra, I K; Yang, T P; Baker, H V

    1992-01-01

    GCR1 gene function is required for high-level glycolytic gene expression in Saccharomyces cerevisiae. Recently, we suggested that the CTTCC sequence motif found in front of many genes encoding glycolytic enzymes lay at the core of the GCR1-binding site. Here we mapped the DNA-binding domain of GCR1 to the carboxy-terminal 154 amino acids of the polypeptide. DNase I protection studies showed that a hybrid MBP-GCR1 fusion protein protected a region of the upstream activating sequence of TPI (UASTPI), which harbored the CTTCC sequence motif, and suggested that the fusion protein might also interact with a region of the UAS that contained the related sequence CATCC. A series of in vivo G methylation protection experiments of the native TPI promoter were carried out with wild-type and gcr1 deletion mutant strains. The G doublets that correspond to the C doublets in each site were protected in the wild-type strain but not in the gcr1 mutant strain. These data demonstrate that the UAS of TPI contains two GCR1-binding sites which are occupied in vivo. Furthermore, adjacent RAP1/GRF1/TUF- and REB1/GRF2/QBP/Y-binding sites in UASTPI were occupied in the backgrounds of both strains. In addition, DNA band-shift assays were used to show that the MBP-GCR1 fusion protein was able to form nucleoprotein complexes with oligonucleotides that contained CTTCC sequence elements found in front of other glycolytic genes, namely, PGK, ENO1, PYK, and ADH1, all of which are dependent on GCR1 gene function for full expression. However, we were unable to detect specific interactions with CTTCC sequence elements found in front of the translational component genes TEF1, TEF2, and CRY1. Taken together, these experiments have allowed us to propose a consensus GCR1-binding site which is 5'-(T/A)N(T/C)N(G/A)NC(T/A)TCC(T/A)N(T/A)(T/A)(T/G)-3'. PMID:1588965
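
    A degenerate consensus like the one proposed above translates directly into a search pattern. A minimal sketch that converts the consensus into a regular expression and scans a promoter fragment; only the consensus string comes from the abstract, and the example sequence is made up:

```python
# Sketch: scan a DNA sequence for the proposed GCR1-binding consensus.
# The promoter fragment below is hypothetical.
import re

consensus = "(T/A)N(T/C)N(G/A)NC(T/A)TCC(T/A)N(T/A)(T/A)(T/G)"

def consensus_to_regex(s):
    # (X/Y) -> character class [XY]; N -> any base; plain bases match themselves
    s = re.sub(r"\(([A-Z])/([A-Z])\)", r"[\1\2]", s)
    return s.replace("N", "[ACGT]")

pattern = re.compile(consensus_to_regex(consensus))
promoter = "GGAATATAGACTTCCAATATCGTT"   # hypothetical UAS fragment
for m in pattern.finditer(promoter):
    print(m.start(), m.group())         # -> 4 TATAGACTTCCAATAT
```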

  19. Asynchronous visual event-based time-to-contact.

    PubMed

    Clady, Xavier; Clercq, Charles; Ieng, Sio-Hoi; Houseini, Fouzhan; Randazzo, Marco; Natale, Lorenzo; Bartolozzi, Chiara; Benosman, Ryad

    2014-01-01

    Reliable and fast sensing of the environment is a fundamental requirement for autonomous mobile robotic platforms. Unfortunately, the frame-based acquisition paradigm at the basis of mainstream artificial perceptive systems is limited by low temporal dynamics and redundant data flow, leading to high computational costs. Hence, conventional sensing and the associated computation are incompatible with the design of high-speed sensor-based reactive control for mobile applications that pose strict limits on energy consumption and computational load. This paper introduces a fast obstacle avoidance method based on the output of an asynchronous event-based time-encoded imaging sensor. The proposed method relies on an event-based Time To Contact (TTC) computation based on visual event-based motion flows. The approach is event-based in the sense that every incoming event adds to the computation process, thus allowing fast avoidance responses. The method is validated indoors on a mobile robot, comparing the event-based TTC with a laser range finder TTC, showing that event-based sensing offers new perspectives for mobile robotics sensing. PMID:24570652
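
    The core of a TTC computation is compact: for an approaching surface, each event's time-to-contact is its distance from the focus of expansion divided by its radial flow speed. A minimal sketch under the assumptions that the focus of expansion is known and that each event already carries a flow estimate (all values illustrative, and this is not the paper's exact algorithm):

```python
# Sketch: time-to-contact from expanding event-based motion flow.
# FOE and event data are illustrative assumptions.
import numpy as np

foe = np.array([64.0, 64.0])             # assumed focus of expansion (pixels)
events = [                                # (x, y, vx, vy) per visual event
    (80.0, 64.0, 32.0, 0.0),
    (64.0, 90.0, 0.0, 52.0),
    (50.0, 50.0, -28.0, -28.0),
]

ttcs = []
for x, y, vx, vy in events:
    r = np.array([x, y]) - foe            # radial position w.r.t. FOE
    r_norm = np.linalg.norm(r)
    r_dot = np.dot([vx, vy], r / r_norm)  # radial expansion speed (px/s)
    if r_dot > 0:                         # keep expanding events only
        ttcs.append(r_norm / r_dot)       # tau = r / r_dot (seconds)

print("event-based TTC estimate: %.2f s" % np.median(ttcs))   # -> 0.50 s
```

    Because each event updates the estimate independently, the robot can react as soon as a handful of events arrive, which is the point the abstract makes about avoiding frame latency.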

  1. 27-day variation of the GCR intensity based on corrected and uncorrected for geomagnetic disturbances data of neutron monitors

    NASA Astrophysics Data System (ADS)

    Alania, M. V.; Modzelewska, R.; Wawrzynczak, A.; Sdobnov, V. E.; Kravtsova, M. V.

    2015-08-01

    We study the 27-day variations of the galactic cosmic ray (GCR) intensity for the 2005-2008 period of solar cycle #23. We use neutron monitor (NM) data both corrected and uncorrected for geomagnetic disturbances. Besides the limited time intervals when the 27-day variations are clearly established, some weak 27-day variation of the GCR intensity is always present, related to the constantly present weak heliolongitudinal asymmetry in the heliosphere. We calculate the amplitudes of the 27-day variation of the GCR intensity based on the NM data corrected and uncorrected for geomagnetic disturbances. We show that these amplitudes do not differ for NMs with cut-off rigidities smaller than 4-5 GV compared with NMs of higher cut-off rigidities. The rigidity spectrum of the 27-day variation of the GCR intensity found in the uncorrected data is soft, while it is hard in the case of the corrected data. In both cases there is a definite tendency for the rigidity spectrum of the 27-day variation to soften over the period 2005 to 2008, approaching the minimum of solar activity. We believe that a study of the 27-day variation of the GCR intensity based on data uncorrected for geomagnetic disturbances should be carried out with NMs with cut-off rigidities smaller than 4-5 GV.
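
    Amplitude estimation of this kind is typically a first-harmonic fit at the 27-day rotation period. A minimal sketch on synthetic daily counts; only the 27-day period comes from the text, everything else is an illustrative assumption:

```python
# Sketch: amplitude of the 27-day variation via linear least squares on
# cosine/sine components. The synthetic counts are illustrative.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(108.0)    # four solar rotations of daily NM counts (days)
counts = 100 + 1.5 * np.cos(2 * np.pi * t / 27 + 0.8) + rng.normal(0, 0.5, t.size)

w = 2 * np.pi / 27.0
A = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
coef, *_ = np.linalg.lstsq(A, counts, rcond=None)
amplitude = np.hypot(coef[1], coef[2])          # first-harmonic amplitude
print("27-day amplitude: %.2f%% of mean" % (100 * amplitude / coef[0]))
```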

  2. Asynchronous event-based corner detection and matching.

    PubMed

    Clady, Xavier; Ieng, Sio-Hoi; Benosman, Ryad

    2015-06-01

    This paper introduces an event-based luminance-free method to detect and match corner events from the output of asynchronous event-based neuromorphic retinas. The method relies on the use of space-time properties of moving edges. Asynchronous event-based neuromorphic retinas are composed of autonomous pixels, each of them asynchronously generating "spiking" events that encode relative changes in pixels' illumination at high temporal resolutions. Corner events are defined as the spatiotemporal locations where the aperture problem can be solved using the intersection of several geometric constraints in events' spatiotemporal spaces. A regularization process provides the required constraints, i.e. the motion attributes of the edges with respect to their spatiotemporal locations using local geometric properties of visual events. Experimental results are presented on several real scenes showing the stability and robustness of the detection and matching. PMID:25828960

  3. Abstracting event-based control models for high autonomy systems

    NASA Technical Reports Server (NTRS)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  4. Comparison of Transport Codes, HZETRN, HETC and FLUKA, Using 1977 GCR Solar Minimum Spectra

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Tripathi, Ram K.; Blattnig, Steve R.; Norbury, John W.; Badavi, Francis F.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.; Reddell, Brandon; Aumann, Aric R.

    2009-01-01

    The HZETRN deterministic radiation transport code is one of several tools developed to analyze the effects of harmful galactic cosmic rays (GCR) and solar particle events (SPE) on mission planning, astronaut shielding, and instrumentation. This paper is a comparison study involving two Monte Carlo transport codes, HETC-HEDS and FLUKA, and the deterministic transport code HZETRN. Each code is used to transport ions from the 1977 solar minimum GCR spectrum impinging upon a 20 g/cm2 aluminum slab followed by a 30 g/cm2 water slab. This research is part of a systematic effort of verification and validation to quantify the accuracy of HZETRN and determine areas where it can be improved. Comparisons of dose and dose equivalent values at various depths in the water slab are presented in this report. This is followed by a comparison of the proton fluxes and the forward, backward, and total neutron fluxes at various depths in the water slab. Comparisons of the secondary light-ion 2H, 3H, 3He, and 4He fluxes are also examined.
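
    The compared quantities, dose and dose equivalent, follow from the transported fluence spectra. A minimal sketch using the standard unit conversion (1 MeV/g = 1.602e-10 Gy) and an ICRP-60-style quality factor; the example fluence components are an assumption, not data from the report:

```python
# Sketch: dose and dose equivalent from per-component fluences and LETs.
# The component list is illustrative; the Q(L) form follows ICRP 60.
import numpy as np

MEV_PER_G_TO_GY = 1.602e-10

def quality_factor(let_kev_um):
    # ICRP-60 quality factor vs. LET in water (keV/um)
    L = let_kev_um
    if L < 10:
        return 1.0
    if L <= 100:
        return 0.32 * L - 2.2
    return 300.0 / np.sqrt(L)

# assumed (fluence [1/cm^2], LET [keV/um]) pairs behind shielding
components = [(1.0e8, 0.5), (2.0e6, 12.0), (5.0e4, 150.0)]

dose = dose_eq = 0.0
for fluence, let in components:
    let_mev_cm2_g = let * 10.0     # 1 keV/um = 10 MeV cm^2/g in unit-density water
    d = MEV_PER_G_TO_GY * fluence * let_mev_cm2_g
    dose += d
    dose_eq += d * quality_factor(let)

print("dose %.3f mGy, dose equivalent %.3f mSv" % (1e3 * dose, 1e3 * dose_eq))
```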

  5. GCR-induced Photon Luminescence of the Moon: The Moon as a CR Detector

    NASA Technical Reports Server (NTRS)

    Wilson, Thomas L.; Lee, Kerry; Andersen, Vic

    2007-01-01

    We report on the results of a preliminary study of the GCR-induced photon luminescence of the Moon using the Monte Carlo program FLUKA. The model of the lunar surface is taken to be the chemical composition of soils found at various landing sites during the Apollo and Luna programs, averaged over all such sites to define a generic regolith for the present analysis. This then becomes the target that is bombarded by Galactic Cosmic Rays (GCRs) in FLUKA to determine the photon fluence when there is no sunshine or Earthshine. From the photon fluence we derive the energy spectrum which can be utilized to design an orbiting optical instrument for measuring the GCR-induced luminescence. This is to be distinguished from the gamma-ray spectrum produced by the radioactive decay of its radiogenic constituents lying in the surface and interior. Also, we investigate transient optical flashes from high-energy CRs impacting the lunar surface (boulders and regolith). The goal is to determine to what extent the Moon could be used as a rudimentary CR detector. Meteor impacts on the Moon have been observed for centuries to generate such flashes, so why not CRs?

  6. Elemental GCR Observations during the 2009-2010 Solar Minimum Period

    NASA Technical Reports Server (NTRS)

    Lave, K. A.; Israel, M. H.; Binns, W. R.; Christian, E. R.; Cummings, A. C.; Davis, A. J.; deNolfo, G. A.; Leske, R. A.; Mewaldt, R. A.; Stone, E. C.; vonRosenvinge, T. T.; Wiedenbeck, M. E.

    2013-01-01

    Using observations from the Cosmic Ray Isotope Spectrometer (CRIS) onboard the Advanced Composition Explorer (ACE), we present new measurements of the galactic cosmic ray (GCR) elemental composition and energy spectra for the species B through Ni in the energy range approx. 50-550 MeV/nucleon during the record setting 2009-2010 solar minimum period. These data are compared with our observations from the 1997-1998 solar minimum period, when solar modulation in the heliosphere was somewhat higher. For these species, we find that the intensities during the 2009-2010 solar minimum were approx. 20% higher than those in the previous solar minimum, and in fact were the highest GCR intensities recorded during the space age. Relative abundances for these species during the two solar minimum periods differed by small but statistically significant amounts, which are attributed to the combination of spectral shape differences between primary and secondary GCRs in the interstellar medium and differences between the levels of solar modulation in the two solar minima. We also present the secondary-to-primary ratios B/C and (Sc+Ti+V)/Fe for both solar minimum periods, and demonstrate that these ratios are reasonably well fit by a simple "leaky-box" galactic transport model that is combined with a spherically symmetric solar modulation model.
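
    The "leaky-box" fit mentioned above rests on a steady-state balance between production, escape, and destruction. In standard notation (reconstructed here, not taken from the paper), a stable secondary-to-primary ratio obeys

\[
\frac{N_s}{N_p} \;=\; \frac{1/\lambda_{p\to s}}{\,1/\lambda_{\mathrm{esc}} + 1/\lambda_s\,},
\]

    where \(\lambda_{\mathrm{esc}}\) is the mean escape path length from the Galaxy, \(\lambda_{p\to s}\) the path length for spallation of the primary into the secondary, and \(\lambda_s\) the interaction length for destruction of the secondary. Measured ratios such as B/C and (Sc+Ti+V)/Fe therefore constrain \(\lambda_{\mathrm{esc}}\) directly, which is what makes them the natural test of the transport model.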

  7. Miniaturized Hollow-Waveguide Gas Correlation Radiometer (GCR) for Trace Gas Detection in the Martian Atmosphere

    NASA Technical Reports Server (NTRS)

    Wilson, Emily L.; Georgieva, E. M.; Melroy, H. R.

    2012-01-01

    Gas correlation radiometry (GCR) has been shown to be a sensitive and versatile method for detecting trace gases in Earth's atmosphere. Here, we present a miniaturized and simplified version of this instrument capable of mapping multiple trace gases and identifying active regions on the Mars surface. Reduction of the size and mass of the GCR instrument has been achieved by implementing a lightweight, 1 mm inner diameter hollow-core optical fiber (hollow waveguide) for the gas correlation cell. Based on a comparison with an Earth orbiting CO2 gas correlation instrument, replacement of the 10 meter mUltipass cell with hollow waveguide of equivalent pathlength reduces the cell mass from approx 150 kg to approx 0.5 kg, and reduces the volume from 1.9 m x 1.3 m x 0.86 m to a small bundle of fiber coils approximately I meter in diameter by 0.05 m in height (mass and volume reductions of >99%). This modular instrument technique can be expanded to include measurements of additional species of interest including nitrous oxide (N2O), hydrogen sulfide (H2S), methanol (CH3OH), and sulfur dioxide (SO2), as well as carbon dioxide (CO2) for a simultaneous measure of mass balance.

  8. Spatiotemporal features for asynchronous event-based data

    PubMed Central

    Lagorce, Xavier; Ieng, Sio-Hoi; Clady, Xavier; Pfeiffer, Michael; Benosman, Ryad B.

    2015-01-01

    Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time. PMID:25759637
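
    A minimal sketch of one leaky reservoir of the kind such architectures build on, driven by binned event counts; the sizes, leak rate, and spectral scaling below are assumptions, and the competing winner-take-all stage across multiple reservoirs is omitted:

```python
# Sketch: a single leaky recurrent reservoir driven by event counts.
# All dimensions and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_in, n_res = 16, 200
W_in = rng.normal(0, 0.5, (n_res, n_in))
W = rng.normal(0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

alpha = 0.3                                       # leak rate
x = np.zeros(n_res)
for _ in range(100):
    u = rng.poisson(0.5, n_in).astype(float)      # event counts per input cell
    x = (1 - alpha) * x + alpha * np.tanh(W_in @ u + W @ x)

print("reservoir state norm:", np.linalg.norm(x))
```

    In the architecture described above, several such reservoirs would each predict the incoming event stream, with winner-take-all selection assigning input to the reservoir that predicts it best; a readout trained on the winning states then yields the learned spatiotemporal features.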

  9. A Multinomial Model of Event-Based Prospective Memory

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.

    2004-01-01

    Prospective memory is remembering to perform an action in the future. The authors introduce the 1st formal model of event-based prospective memory, namely, a multinomial model that includes 2 separate parameters related to prospective memory processes. The 1st measures preparatory attentional processes, and the 2nd measures retrospective memory…

  10. Secondary Cosmic Ray Particles Due to GCR Interactions in the Earth's Atmosphere

    SciTech Connect

    Battistoni, G.; Cerutti, F.; Fasso, A.; Ferrari, A.; Garzelli, M. V.; Lantz, M.; Muraro, S.; Pinsky, L. S.; Ranft, J.; Roesler, S.; Sala, P. R.

    2009-06-16

    Primary GCRs interact with the Earth's atmosphere, originating atmospheric showers and thus giving rise to fluxes of secondary particles in the atmosphere. Electromagnetic and hadronic interactions interplay in the production of these particles, whose detection is performed by means of complementary techniques in different energy ranges and at different depths in the atmosphere, down to the Earth's surface. Monte Carlo codes are essential calculation tools that can describe the complexity of the physics of these phenomena, thus allowing the analysis of experimental data. However, these codes are affected by important uncertainties concerning, in particular, hadronic physics at high energy. In this paper we report some results concerning inclusive particle fluxes and atmospheric shower properties as obtained using the FLUKA transport and interaction code. Some emphasis is also given to the validation of the physics models of FLUKA involved in these calculations.

  11. Nuclear fragmentation of GCR-like ions: comparisons between data and PHITS

    NASA Astrophysics Data System (ADS)

    Zeitlin, Cary; Guetersloh, Stephen; Heilbronn, Lawrence; Miller, Jack; Sihver, Lembit; Mancusi, Davide; Fukumura, Aki; Iwata, Yoshi; Murakami, Takeshi

    We present a summary of results from recent work in which we have compared nuclear fragmentation cross section data to predictions of the PHITS Monte Carlo simulation. The studies used beams of 12C, 35Cl, 40Ar, 48Ti, and 56Fe at energies ranging from 290 MeV/nucleon to 1000 MeV/nucleon. Some of the data were obtained at the Brookhaven National Laboratory, others at the National Institute of Radiological Sciences in Japan. These energies and ion species are representative of the heavy ion component of the Galactic Cosmic Rays (GCR), which contribute significantly to the dose and dose equivalent that will be received by astronauts on deep-space missions. A critical need for NASA is the ability to accurately model the transport of GCR heavy ions through matter, including spacecraft walls, equipment racks, and other shielding materials, as well as through tissue. Nuclear interaction cross sections are of primary importance in the GCR transport problem. These interactions generally cause the incoming ion to break up (fragment) into one or more lighter ions, which continue approximately along the initial trajectory and with approximately the same velocity the incoming ion had prior to the interaction. Since the radiation dose delivered by a particle is proportional to the square of the quantity (charge/velocity), i.e., to (Z/β)², fragmentation reduces the dose (and, typically, dose equivalent) delivered by incident ions. The other mechanism by which dose can be reduced is ionization energy loss, which can lead to some particles stopping in the shielding. This is the conventional notion of shielding, but it is not applicable to human spaceflight, since the particles in the GCR tend to be highly energetic and because shielding must be relatively thin in order to keep overall mass as low as possible, keeping launch costs within reason. To support these goals, our group has systematically measured a large number of nuclear cross sections, intended to be used as either
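
    The (Z/β)² argument above can be made concrete with a two-line check; the fragment list is an illustrative assumption, not measured data:

```python
# Sketch: fragments at (approximately) the parent's velocity deliver dose
# proportional to the sum of Z^2, so fragmentation lowers the dose.
z_parent = 26                # 56Fe beam
fragments = [25, 1, 1]       # e.g., one Mn fragment plus two protons (assumed)

ratio = sum(z * z for z in fragments) / z_parent**2
print("dose after fragmentation / dose before: %.2f" % ratio)   # -> 0.93
```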

  12. Solar Activity and GCR Particle Flux Variations: Assessment and Modeling with Ulysses and ACE/CRIS

    NASA Astrophysics Data System (ADS)

    Saganti, Premkumar

    The galactic cosmic ray (GCR) environment during the current, historically low solar minimum conditions has shown some of the highest anticipated particle flux measurements. Data from the Ulysses spacecraft in polar orbit about the Sun (at about 5 AU) during the years 2004 and 2008 provided proton and alpha particle fluxes and showed such anticipated high particle flux variations. Also, ACE/CRIS spacecraft data during the years 2007 and 2009 showed some of the highest particle flux measurements for several heavy ions such as oxygen and iron. We present the Ulysses and ACE/CRIS measured particle flux data and discuss their high intensities and variations in the context of the current low solar activity, depicting the present space radiation environment.

  13. Microstructure of warm rolling and pearlitic transformation of ultrafine-grained GCr15 steel

    SciTech Connect

    Sun, Jun-Jie; Lian, Fu-Liang; Liu, Hong-Ji; Jiang, Tao; Guo, Sheng-Wu; Du, Lin-Xiu; Liu, Yong-Ning

    2014-09-15

    Pearlitic transformation mechanisms have been investigated in ultrafine-grained GCr15 steel. The ultrafine-grained steel, whose grain size was less than 1 μm, was prepared by thermo-mechanical treatment at 873 K and then annealing at 923 K for 2 h. Pearlitic transformation was conducted by reheating the ultrafine-grained samples at 1073 K and 1123 K for different periods of time and then cooling in air. Scanning electron microscope observation shows that normal lamellar pearlite cannot form when the grain size is less than approximately 4 (±0.6) μm; granular cementite and ferrite form instead. This yields a critical grain size for normal lamellar pearlitic transformation in this chromium-alloyed steel. The result confirms that grain size has a great influence on pearlitic transformation by increasing the diffusion rate of carbon atoms in the ultrafine-grained steel, and the addition of chromium does not change this pearlitic phase transformation rule. Meanwhile, the grain growth rate is reduced by chromium alloying, which is beneficial for forming fine grains during austenitizing, thus facilitating pearlitic transformation by divorced eutectoid transformation. Moreover, chromium can form a relatively high concentration gradient at the frontier of the undissolved carbide, which promotes carbide formation there, i.e., chromium promotes divorced eutectoid transformation. - Highlights: • Ultrafine-grained GCr15 steel was obtained by warm rolling and annealing technology. • Reduction of grain size changes the pearlite morphology from lamellar to granular. • Adding Cr does not change the normal pearlitic phase transformation rule in UFG steel. • Cr carbide resists grain growth and facilitates pearlitic transformation by DET.

  14. Mars Science Laboratory; A Model for Event-Based EPO

    NASA Astrophysics Data System (ADS)

    Mayo, Louis; Lewis, E.; Cline, T.; Stephenson, B.; Erickson, K.; Ng, C.

    2012-10-01

    The NASA Mars Science Laboratory (MSL) and its Curiosity Rover, a part of NASA's Mars Exploration Program, represent the most ambitious undertaking to date to explore the red planet. MSL/Curiosity was designed primarily to determine whether Mars ever had an environment capable of supporting microbial life. NASA's MSL education program was designed to take advantage of existing, highly successful event based education programs to communicate Mars science and education themes to worldwide audiences through live webcasts, video interviews with scientists, TV broadcasts, professional development for teachers, and the latest social media frameworks. We report here on the success of the MSL education program and discuss how this methodological framework can be used to enhance other event based education programs.

  15. Mutations in GCR3, a gene involved in the expression of glycolytic genes in Saccharomyces cerevisiae, suppress the temperature-sensitive growth of hpr1 mutants

    SciTech Connect

    Uemura, Hiroshi; Jigami, Yoshifumi; Pandit, Sunil; Sternglanz, R.

    1996-04-01

    To study the functions of DNA topoisomerase I and the Hpr1 protein, a suppressor mutant of the temperature-sensitive growth of an hpr1 top1-5(ts) double mutant was isolated. The isolated triple mutant showed cold-sensitive growth. By complementation of this phenotype, the suppressor gene was cloned. DNA sequencing showed it to be GCR3, a gene involved in the expression of glycolytic genes. Further analysis showed that gcr3 mutations also suppressed the temperature-sensitive growth of hpr1 single mutants. Experiments with gcr3 truncation mutants also suggested a genetic interaction between GCR3 and HPR1. The fact that top1 suppressed the growth defect of gcr3 suggested an interaction between those two genes as well. Plasmid DNA isolated from gcr3 mutants was significantly more negatively supercoiled than normal, suggesting that the Gcr3 protein, like topoisomerase I and Hpr1p, affects chromatin structure, perhaps during transcription. 43 refs., 2 figs., 6 tabs.

  16. GCR Transport in the Brain: Assessment of Self-Shielding, Columnar Damage, and Nuclear Reactions on Cell Inactivation Rates

    NASA Technical Reports Server (NTRS)

    Shavers, M. R.; Atwell, W.; Cucinotta, F. A.; Badhwar, G. D. (Technical Monitor)

    1999-01-01

    Radiation shield design is driven by the need to limit radiation risks while optimizing risk reduction against launch mass/expense penalties. Both limitation and optimization objectives require the development of accurate and complete means for evaluating the effectiveness of various shield materials and body self-shielding. For galactic cosmic rays (GCR), biophysical response models indicate that track structure effects lead to substantially different assessments of shielding effectiveness relative to assessments based on LET-dependent quality factors. Methods for assessing risk to the central nervous system (CNS) from heavy ions are poorly understood at this time. High-energy and charge (HZE) ions can produce tissue events resulting in damage to clusters of cells in a columnar fashion, especially for stopping heavy ions. Grahn (1973) and Todd (1986) have discussed a microlesion concept or model of stochastic tissue events in analyzing damage from HZEs. Some tissues, including the CNS, may be sensitive to microlesions or stochastic tissue events in a manner not illuminated by either conventional dosimetry or fluence-based risk factors. HZE ions may also produce important lateral damage to adjacent cells. Fluences of high-energy protons and alpha particles in the GCR are many times higher than those of HZE ions. Behind spacecraft and body self-shielding, the ratio of protons, alpha particles, and neutrons to HZE ions increases several-fold from free-space values. Models of GCR damage behind shielding have placed large emphasis on the role of target fragments produced from tissue atoms. The self-shielding of the brain greatly reduces the number of heavy ions reaching the interior regions, and the remaining light-particle environment (protons, neutrons, deuterons, and alpha particles) may be the greatest concern. Tracks of high-energy protons produce nuclear reactions in tissue, which can deposit doses of more than 1 Gy within 5 - 10 cell layers. Information on rates of

  17. Event-Based Processing of Neutron Scattering Data

    SciTech Connect

    Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; Taylor, Russell J.; Zikovsky, Janik L.

    2015-09-16

    Many of the world's time-of-flight spallation neutron sources are migrating to the recording of individual neutron events. This provides new opportunities in data processing, not the least of which is the ability to filter the events by correlating them with logs of the sample environment and other ancillary equipment. This paper will describe techniques for processing neutron scattering data acquired in event mode that preserve event information all the way to a final spectrum, including any necessary corrections or normalizations. This results in smaller final errors, while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques will be shown for comparison.
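
    A minimal sketch of the filtering step described above, assuming events carry wall-clock timestamps and a separate sample-environment log is available (all arrays are illustrative, and this is not the paper's own implementation):

```python
# Sketch: keep only neutron events recorded while a sample-environment log
# value was in range, then histogram the survivors into a spectrum.
import numpy as np

event_times = np.array([0.1, 0.4, 1.2, 2.7, 3.3, 4.8])      # s since run start
event_tof = np.array([510., 820., 640., 505., 830., 645.])  # time-of-flight (us)

log_times = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])        # temperature log (s)
log_temp = np.array([298., 299., 305., 310., 309., 311.])   # temperature (K)

# interpolate the log onto each event and keep events taken above 300 K
temp_at_event = np.interp(event_times, log_times, log_temp)
keep = temp_at_event > 300.0

spectrum, edges = np.histogram(event_tof[keep], bins=4)
errors = np.sqrt(spectrum)   # counting statistics survive to the final spectrum
print(spectrum, errors)
```

    Because the cut is applied to raw events rather than to pre-binned histograms, no partially contaminated bins enter the final spectrum, which is where the smaller final errors come from.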

  18. Event-based Simulation Model for Quantum Optics Experiments

    SciTech Connect

    De Raedt, H.; Michielsen, K.

    2011-03-28

    We present a corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one by one. The event-based corpuscular model gives a unified description of multiple-beam fringes of a plane parallel plate and of single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, photon tunneling, quantum eraser, two-beam interference, double-slit, Einstein-Podolsky-Rosen-Bohm, and Hanbury Brown-Twiss experiments. We also discuss the possibility of refuting our corpuscular model.

  19. Assessment and Requirements of Nuclear Reaction Databases for GCR Transport in the Atmosphere and Structures

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Wilson, J. W.; Shinn, J. L.; Tripathi, R. K.

    1998-01-01

    The transport properties of galactic cosmic rays (GCR) in the atmosphere, material structures, and human body (self-shielding) are of interest in risk assessment for supersonic and subsonic aircraft and for space travel in low-Earth orbit and on interplanetary missions. Nuclear reactions, such as knockout and fragmentation, produce large modifications of the particle types and energies of the galactic cosmic rays penetrating materials. We assess the current nuclear reaction models and improvements in these models for developing the required transport code databases. A new fragmentation database (QMSFRG) based on microscopic models is compared to the NUCFRG2 model, and implications for shield assessment are drawn using the HZETRN radiation transport code. For deep penetration problems, the build-up of light particles, such as nucleons, light clusters, and mesons from nuclear reactions, in conjunction with the absorption of the heavy ions, leads to the dominance of the charge Z = 0, 1, and 2 hadrons in the exposures at large penetration depths. Light particles are produced through nuclear or cluster knockout and in evaporation events with characteristically distinct spectra, which play unique roles in the build-up of secondary radiations in shielding. We describe models of light particle production in nucleon- and heavy-ion-induced reactions and assess the importance of light particle multiplicity and spectral parameters in these exposures.

  20. MHD compressor---expander conversion system integrated with GCR inside a deployable reflector

    SciTech Connect

    Tuninetti, G.; Botta, E.; Criscuolo, C.; Riscossa, P.; Giammanco, F.; Rosa-Clot, M.

    1989-04-20

    This work originates from the proposal "MHD Compressor-Expander Conversion System Integrated with a GCR Inside a Deployable Reflector." The proposal concerned an innovative concept of a nuclear, closed-cycle MHD converter for power generation on space-based systems in the multi-megawatt range. The basic element of this converter is the Power Conversion Unit (PCU), consisting of a gas core reactor directly coupled to an MHD expansion channel. Integrated with the PCU, a deployable reflector provides reactivity control. The working fluid could be either uranium hexafluoride or a mixture of uranium hexafluoride and helium, added to enhance the heat transfer properties. The original Statement of Work, which concerned the whole conversion system, was subsequently redirected and focused on the basic mechanisms of neutronics, reactivity control, ionization, and electrical conductivity in the PCU. Furthermore, the study was required to be "inherently generic such that the analysis and results can be applied to various nuclear reactor and/or MHD channel designs."

  2. The yeast protein Gcr1p binds to the PGK UAS and contributes to the activation of transcription of the PGK gene.

    PubMed

    Henry, Y A; López, M C; Gibbs, J M; Chambers, A; Kingsman, S M; Baker, H V; Stanway, C A

    1994-11-15

    Analysis of the upstream activation sequence (UAS) of the yeast phosphoglycerate kinase gene (PGK) has demonstrated that a number of sequence elements are involved in its activity and two of these sequences are bound by the multifunctional factors Rap1p and Abf1p. In this report we show by in vivo footprinting that the regulatory factor encoded by GCR1 binds to two elements in the 3' half of the PGK UAS. These elements contain the sequence CTTCC, which was previously suggested to be important for the activity of the PGK UAS and has been shown to be able to bind Gcr1p in vitro. Furthermore, we find that Gcr1p positively influences PGK transcription, although it is not responsible for the carbon source dependent regulation of PGK mRNA synthesis. In order to mediate its transcriptional influence we find that Gcr1p requires the Rap1p binding site, in addition to its own, but not the Abf1p site. As neither a Rap1p nor a Gcr1p binding site alone is able to activate transcription, we propose that Gcr1p and Rap1p interact in an interdependent fashion to activate PGK transcription. PMID:7808400

  3. G-protein Signaling Components GCR1 and GPA1 Mediate Responses to Multiple Abiotic Stresses in Arabidopsis

    PubMed Central

    Chakraborty, Navjyoti; Singh, Navneet; Kaur, Kanwaljeet; Raghuram, Nandula

    2015-01-01

    G-protein signaling components have been implicated in some individual stress responses in Arabidopsis, but have not been comprehensively evaluated at the genetic and biochemical level. Stress emerged as the largest functional category in our whole transcriptome analyses of knock-out mutants of GCR1 and/or GPA1 in Arabidopsis (Chakraborty et al., 2015a,b). This led us to ask whether G-protein signaling components offer converging points in the plant's response to multiple abiotic stresses. In order to test this hypothesis, we carried out a detailed analysis of the abiotic stress category in the present study, which revealed 144 differentially expressed genes (DEGs) spanning a wide range of abiotic stresses, including heat, cold, salt, and light stress. Only 10 of these DEGs are shared by all three mutants, while the single mutants (GCR1/GPA1) share more DEGs between themselves than with the double mutant (GCR1-GPA1). RT-qPCR validation of 28 of these genes spanning different stresses revealed identical regulation of the DEGs shared between the mutants. We also validated the effects of cold, heat, and salt stresses in all three mutants and the WT on % germination, root and shoot length, relative water content, proline content, lipid peroxidation, and the activities of catalase, ascorbate peroxidase, and superoxide dismutase. All three mutants showed evidence of stress tolerance, especially to cold, followed by heat and salt, in terms of all the above parameters. This shows, for the first time, the role of GCR1 and GPA1 in mediating the plant's response to multiple abiotic stresses, especially cold, heat, and salt stresses. This also implies a role for classical G-protein signaling pathways in stress sensitivity in normal plants of Arabidopsis. This is also the first genetic and biochemical evidence of abiotic stress tolerance rendered by knock-out mutation of GCR1 and/or GPA1. This suggests that G-protein signaling pathways could offer novel common targets for the

  4. Determining the Magnitude of Neutron and Galactic Cosmic Ray (GCR) Fluxes at the Moon using the Lunar Exploration Neutron Detector during the Historic Space-Age Era of High GCR Flux

    NASA Astrophysics Data System (ADS)

    Chin, G.; Sagdeev, R.; Boynton, W. V.; Mitrofanov, I. G.; Milikh, G. M.; Su, J. J.; Livengood, T. A.; McClanahan, T. P.; Evans, L.; Starr, R. D.; Litvak, M. L.; Sanin, A.

    2013-12-01

    The Lunar Reconnaissance Orbiter (LRO) was launched June 18, 2009 during an historic space-age era of minimum solar activity [1]. The lack of solar sunspot activity signaled a complex set of heliospheric phenomena [2,3,4] that also gave rise to a period of unprecedentedly high Galactic Cosmic Ray (GCR) flux [5]. These events coincided with the primary mission of the Lunar Exploration Neutron Detector (LEND, [6]), onboard LRO in a nominal 50-km circular orbit of the Moon [7]. Methods to calculate the emergent neutron albedo population using Monte Carlo techniques [8] rely on an estimate of the GCR flux and spectra calibrated at differing periods of solar activity [9,10,11]. Estimating the actual GCR flux at the Moon during the LEND's initial period of operation requires a correction using a model-dependent heliospheric transport modulation parameter [12] to adjust the GCR flux appropriate to this unique solar cycle. These corrections have inherent uncertainties depending on model details [13]. Precisely determining the absolute neutron and GCR fluxes is especially important in understanding the emergent lunar neutrons measured by LEND and subsequently in estimating the hydrogen/water content in the lunar regolith [6]. LEND is constructed with a set of neutron detectors to meet differing purposes [6]. Specifically there are two sets of detector systems that measure the flux of epithermal neutrons: a) the uncollimated Sensor for Epi-Thermal Neutrons (SETN) and b) the Collimated Sensor for Epi-Thermal Neutrons (CSETN). LEND SETN and CSETN observations form a complementary set of simultaneous measurements that determine the absolute scale of emergent lunar neutron flux in an unambiguous fashion and without the need for correcting to differing solar-cycle conditions. LEND measurements are combined with a detailed understanding of the sources of instrumental background, and the performance of CSETN and SETN. This comparison allows us to calculate a constant scale factor

  5. Neuromorphic Event-Based 3D Pose Estimation

    PubMed Central

    Reverter Valeiras, David; Orchard, Garrick; Ieng, Sio-Hoi; Benosman, Ryad B.

    2016-01-01

    Pose estimation is a fundamental step in many artificial vision tasks. It consists of estimating the 3D pose of an object with respect to a camera from the object's 2D projection. Current state of the art implementations operate on images. These implementations are computationally expensive, especially for real-time applications. Scenes with fast dynamics exceeding 30–60 Hz can rarely be processed in real-time using conventional hardware. This paper presents a new method for event-based 3D object pose estimation, making full use of the high temporal resolution (1 μs) of asynchronous visual events output from a single neuromorphic camera. Given an initial estimate of the pose, each incoming event is used to update the pose by combining both 3D and 2D criteria. We show that the asynchronous high temporal resolution of the neuromorphic camera allows us to solve the problem in an incremental manner, achieving real-time performance at an update rate of several hundred kHz on a conventional laptop. We show that the high temporal resolution of neuromorphic cameras is a key feature for performing accurate pose estimation. Experiments are provided showing the performance of the algorithm on real data, including fast moving objects, occlusions, and cases where the neuromorphic camera and the object are both in motion. PMID:26834547

  7. Event-based user classification in Weibo media.

    PubMed

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging services, has attracted massive attention and support from social network users. Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to the social events in a more convenient way. Much of the information in Weibo media is related to some events. Users who post different contents, and exert different behavior or attitude may lead to different contribution to the specific event. Therefore, classifying the large amount of uncategorized social circles generated in Weibo media automatically from the perspective of events has been a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experiments results show that our method identifies the user categories accurately. PMID:25133235

  8. A storm event-based approach to TMDL development.

    PubMed

    Hsu, Tsung-Hung; Lin, Jen-Yang; Lee, Tsu-Chuan; Zhang, Harry X; Yu, Shaw L

    2010-04-01

    It is vitally important to define the critical condition for a receiving water body in the total maximum daily load (TMDL) development process. One of the major disadvantages of using a continuous simulation approach is that there is no guarantee that the most critical condition will be covered within the subjectively selected representative hydrologic period, which is usually several years depending on the availability of data. Another limitation of the continuous simulation approach, compared to a design storm approach, is the lack of an estimate of the risk involved. Because of the above limitations, a storm event-based critical flow-storm (CFS) approach was previously developed to explicitly address the critical condition as a combination of a prescribed stream flow and a storm event of certain magnitude, both having a certain frequency of occurrence and when combined, would create a critical condition. The CFS approach was tested successfully in a TMDL study for Muddy Creek in Virginia. The present paper reports results of a comparative study on the applicability of the CFS approach in Taiwan. The Dy-yu creek watershed in northern Taiwan differs significantly from Muddy Creek in terms of climate, hydrology, terrain, and other characteristics. Results show that the critical condition for different watersheds might be also different, and that the CFS approach could clearly define that critical condition and should be considered as an alternative method for TMDL development to a continuous simulation approach. PMID:19266300
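
    The CFS combination can be illustrated with a joint-frequency calculation. Treating the prescribed low flow and the design storm as independent is an assumption made here purely for illustration, not a claim from the study:

```python
# Sketch: joint recurrence interval of a prescribed stream flow and a storm
# of given magnitude, assuming (for illustration only) independence.
flow_return_period = 5.0    # years; prescribed flow condition (assumed)
storm_return_period = 2.0   # years; design storm magnitude (assumed)

p_joint = (1.0 / flow_return_period) * (1.0 / storm_return_period)
print("joint annual exceedance probability: %.3f" % p_joint)     # -> 0.100
print("combined recurrence interval: %.0f years" % (1.0 / p_joint))  # -> 10
```

    The combined recurrence interval is what lets the CFS approach attach an explicit risk estimate to the critical condition, something a continuous-simulation period selected subjectively cannot provide.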

  9. Self-Driven Decay Heat Removal in a GCR Closed Brayton Cycle Power System

    SciTech Connect

    Wright, Steven A.; Lipinski, Ronald J.

    2006-07-01

    Closed Brayton Cycle (CBC) systems that are driven by Gas Cooled Reactors (GCR) are being evaluated for high-efficiency electricity generation. These systems were also selected by the Naval Reactor Prime Contractor team for use as space power systems. This paper describes the decay heat removal performance of these systems. A key question for such space- or terrestrial-based CBC systems is how to shut down the reactor while still removing the decay heat without using substantial amounts of auxiliary power. Tests in the Sandia Brayton Loop (SBL) show that the Brayton cycle is capable of operating on sensible heat for very long times (~1 hour) after the reactor is shut down. This paper describes the measured and predicted electrical power generated as a function of time after the heat source was turned off in the Sandia Brayton Loop. The measured results were obtained from an electrically heated closed Brayton cycle test loop (SBL) that Sandia fabricated and operates within its laboratories. The predicted behavior is based on integrated dynamic system models that are capable of predicting both the transient and steady-state behavior of nuclear-heated or electrically heated Brayton cycle systems. The measured data were obtained by running the SBL and shutting off the electrical heater while adjusting the flow through the loop to keep the system operating at (or just above) its self-sustaining power level. During the test we were able to produce ~500 W of power for over 73 minutes after the heater power was turned off. Thus the Brayton loop was able to operate at self-sustaining conditions (or better) for over one hour, during which the turbo-compressor transported the sensible heat in the heater, ducting, and recuperator to the waste heat rejection system. For a reactor-driven system in space, this would give the shutdown decay power sufficient time to decay to levels where it could be
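
    The sensible-heat result invites a back-of-the-envelope check: the runtime is roughly the usable stored heat times the net conversion efficiency, divided by the electrical load. A sketch with assumed (not reported) masses, temperature drop, and efficiency:

```python
# Sketch: how long can stored sensible heat sustain a given electrical output?
# All inputs are illustrative assumptions chosen for order-of-magnitude realism.
m_metal = 1000.0       # kg of heater, ducting, and recuperator metal (assumed)
c_p = 500.0            # J/(kg K), typical of steel
delta_t = 90.0         # K of usable temperature drop (assumed)
eta = 0.05             # net conversion efficiency near the self-sustaining point

energy = m_metal * c_p * delta_t            # stored sensible heat, J
runtime_min = eta * energy / 500.0 / 60.0   # minutes at ~500 W output
print("estimated runtime: %.0f minutes" % runtime_min)   # -> ~75 minutes
```

    With these assumed inputs the estimate lands near the reported ~73 minutes, which is the point of the exercise: the hour-scale runtime is set by the thermal mass of the loop, not by any stored electrical energy.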

  10. Event-based internet biosurveillance: relation to epidemiological observation

    PubMed Central

    2012-01-01

    Background The World Health Organization (WHO) collects and publishes surveillance data and statistics for select diseases, but traditional methods of gathering such data are time and labor intensive. Event-based biosurveillance, which utilizes a variety of Internet sources, complements traditional surveillance. In this study we assess the reliability of Internet biosurveillance and evaluate disease-specific alert criteria against epidemiological data. Methods We reviewed and compared WHO epidemiological data and Argus biosurveillance system data for pandemic (H1N1) 2009 (April 2009 – January 2010) from 8 regions and 122 countries to: identify reliable alert criteria among 15 Argus-defined categories; determine the degree of data correlation for disease progression; and assess timeliness of Internet information. Results Argus generated a total of 1,580 unique alerts; 5 alert categories generated statistically significant (p < 0.05) correlations with WHO case count data; the sum of these 5 categories was highly correlated with WHO case data (r = 0.81, p < 0.0001), with expected differences observed among the 8 regions. Argus reported first confirmed cases on the same day as WHO for 21 of the first 64 countries reporting cases, and 1 to 16 days (average 1.5 days) ahead of WHO for 42 of those countries. Conclusion Confirmed pandemic (H1N1) 2009 cases collected by Argus and WHO methods returned consistent results and confirmed the reliability and timeliness of Internet information. Disease-specific alert criteria provide situational awareness and may serve as proxy indicators to event progression and escalation in lieu of traditional surveillance data; alerts may identify early-warning indicators to another pandemic, preparing the public health community for disease events. PMID:22709988

  11. The bacterial cell cycle regulator GcrA is a σ70 cofactor that drives gene expression from a subset of methylated promoters.

    PubMed

    Haakonsen, Diane L; Yuan, Andy H; Laub, Michael T

    2015-11-01

    Cell cycle progression in most organisms requires tightly regulated programs of gene expression. The transcription factors involved typically stimulate gene expression by binding specific DNA sequences in promoters and recruiting RNA polymerase. Here, we found that the essential cell cycle regulator GcrA in Caulobacter crescentus activates the transcription of target genes in a fundamentally different manner. GcrA forms a stable complex with RNA polymerase and localizes to almost all active σ(70)-dependent promoters in vivo but activates transcription primarily at promoters harboring certain DNA methylation sites. Whereas most transcription factors that contact σ(70) interact with domain 4, GcrA interfaces with domain 2, the region that binds the -10 element during strand separation. Using kinetic analyses and a reconstituted in vitro transcription assay, we demonstrated that GcrA can stabilize RNA polymerase binding and directly stimulate open complex formation to activate transcription. Guided by these studies, we identified a regulon of ∼ 200 genes, providing new insight into the essential functions of GcrA. Collectively, our work reveals a new mechanism for transcriptional regulation, and we discuss the potential benefits of activating transcription by promoting RNA polymerase isomerization rather than recruitment exclusively. PMID:26545812

  12. The bacterial cell cycle regulator GcrA is a σ70 cofactor that drives gene expression from a subset of methylated promoters

    PubMed Central

    Haakonsen, Diane L.; Yuan, Andy H.; Laub, Michael T.

    2015-01-01

    Cell cycle progression in most organisms requires tightly regulated programs of gene expression. The transcription factors involved typically stimulate gene expression by binding specific DNA sequences in promoters and recruiting RNA polymerase. Here, we found that the essential cell cycle regulator GcrA in Caulobacter crescentus activates the transcription of target genes in a fundamentally different manner. GcrA forms a stable complex with RNA polymerase and localizes to almost all active σ70-dependent promoters in vivo but activates transcription primarily at promoters harboring certain DNA methylation sites. Whereas most transcription factors that contact σ70 interact with domain 4, GcrA interfaces with domain 2, the region that binds the −10 element during strand separation. Using kinetic analyses and a reconstituted in vitro transcription assay, we demonstrated that GcrA can stabilize RNA polymerase binding and directly stimulate open complex formation to activate transcription. Guided by these studies, we identified a regulon of ∼200 genes, providing new insight into the essential functions of GcrA. Collectively, our work reveals a new mechanism for transcriptional regulation, and we discuss the potential benefits of activating transcription by promoting RNA polymerase isomerization rather than recruitment exclusively. PMID:26545812

  13. Event based climatology of extreme precipitation in Europe

    NASA Astrophysics Data System (ADS)

    Nissen, Katrin M.; Becker, Nico; Ulbrich, Uwe

    2015-04-01

    An event-based detection algorithm to identify extreme precipitation events in gridded data sets is introduced and applied to the observational E-OBS data set. The algorithm identifies all grid boxes in which the rainfall exceeds a threshold, which depends on the location and the aggregation period. The aggregation periods taken into account in this study range from a single time step up to 72 hours. The local 50-year return level is calculated for all aggregation periods and used as a threshold. All identified grid boxes which are located within the same continuous rain area (i.e. which are not separated by rain-free grid boxes) are considered as belonging to the same event and form a cluster. The centre of mass is calculated for each cluster. The clusters are then tracked in time using a nearest neighbor approach. Thus, each detected event can consist of several grid boxes and can last for several time steps. A precipitation severity index (PSI) is assigned to the events. The severity index takes the affected area and the amount of precipitation accumulated over the duration of the event into account. It is normalized by the long-term mean annual precipitation sum expected for the grid box. The severity index can be used to compare the strength of the identified events. The detection algorithm also stores additional information for each event, such as the date, location, affected area, duration, severity and maximum precipitation. Comparing all events detected in the E-OBS data set which exceeded the local 50-year return levels, the highest severity index was calculated for an event affecting Spain, which took place in November 1997. It had a severity index of 49.9 and was also described in the literature. In comparison, the average PSI for extreme precipitation events over Europe is 2.4. Overall, the most active season for extreme precipitation in Europe is summer. The longest duration of an event in the data set was 11 days. It occurred over Estonia in
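
    A minimal sketch of the detection step for a single aggregation period, assuming a gridded precipitation field, a local return-level threshold, and a climatological normalization (all fields below are synthetic, and the PSI formula is a simplified reading of the description above):

```python
# Sketch: threshold a precipitation field, cluster contiguous wet grid boxes,
# and compute each cluster's centre of mass and a simple severity index.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
precip = rng.gamma(2.0, 2.0, size=(40, 40))    # one aggregation period (mm)
return_level = np.full_like(precip, 12.0)      # local 50-year level (assumed)
annual_mean = np.full_like(precip, 800.0)      # climatological normalization

exceed = precip > return_level
labels, n_events = ndimage.label(exceed)       # contiguous rain areas

for ev in range(1, n_events + 1):
    mask = labels == ev
    com = ndimage.center_of_mass(mask)         # cluster centre (grid indices)
    psi = np.sum(precip[mask] / annual_mean[mask])   # area- and amount-weighted
    print("event %d: centre (%.1f, %.1f), PSI %.4f" % (ev, com[0], com[1], psi))
```

    Tracking in time then reduces to matching each cluster's centre of mass to the nearest centre in the following time step, which is the nearest neighbor step described above.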

  14. Oil Spill! An Event-Based Science Module. Student Edition. Oceanography Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  15. Oil Spill!: An Event-Based Science Module. Teacher's Guide. Oceanography Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  16. Gold Medal! An Event-Based Science Module. Student Edition. Physiology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  17. Gold Medal!: An Event-Based Science Module. Teacher's Guide. Physiology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  18. Toxic Leak!: An Event-Based Science Module. Student Edition. Groundwater Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  19. Blight! An Event-Based Science Module. Teacher's Guide. Plants and Plant Diseases Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  20. Blight! An Event-Based Science Module. Student Edition. Plants and Plant Diseases Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  1. Asteroid! An Event-Based Science Module. Teacher's Guide. Astronomy Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  2. Asteroid! An Event-Based Science Module. Student Edition. Astronomy Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  3. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  4. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  5. Fire!: An Event-Based Science Module. Teacher's Guide. Chemistry and Fire Ecology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  6. Fire!: An Event-Based Science Module. Student Edition. Chemistry and Fire Ecology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  7. Fraud! An Event-Based Science Module. Teacher's Guide. Chemistry Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  8. The Source of Adult Age Differences in Event-Based Prospective Memory: A Multinomial Modeling Approach

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.

    2006-01-01

    Event-based prospective memory involves remembering to perform an action in response to a particular future event. Normal younger and older adults performed event-based prospective memory tasks in 2 experiments. The authors applied a formal multinomial processing tree model of prospective memory (Smith & Bayen, 2004) to disentangle age differences…

  9. First Flight!: An Event-Based Science Module. Student Edition. Physics Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  10. First Flight!: An Event-Based Science Module Teacher's Guide. Physics Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  11. Survive? An Event-Based Science Module. Teacher's Guide. Animals and Adaptation Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  12. Blackout!: An Event-Based Science Module. Teacher's Guide. Electricity and Solar Activity Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  13. Fraud! An Event-Based Science Module. Student Edition. Chemistry Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  14. Tornado! An Event-Based Science Module. Student Edition. Meteorology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  15. Tornado! An Event-Based Science Module. Teacher's Guide. Meteorology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn about problems with tornadoes and scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning,…

  16. Earthquake!: An Event-Based Science Module. Teacher's Guide. Earth Science Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn about earthquakes and scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  17. Earthquake!: An Event-Based Science Module. Student Edition. Earth Science Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  18. Gold Rush!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  19. Gold Rush!: An Event-Based Science Module. Student Edition. Rocks and Minerals Module.

    ERIC Educational Resources Information Center

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  20. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 6: Appendix GCR Volume 1

    SciTech Connect

    1995-03-31

    The Geological Characterization Report (GCR) for the WIPP site presents, in one document, a compilation of geologic information available through August 1978 that is judged to be relevant to studies for the WIPP. The report is neither a preliminary safety analysis report nor an environmental impact statement; those documents, when prepared, should be consulted for appropriate discussion of safety analysis and environmental impact. The GCR is a unique document and at this time is not required by the regulatory process. An overview is presented of the purpose of the WIPP, the purpose of the Geological Characterization Report, the site selection criteria, the events leading to studies in New Mexico, the status of studies, and the techniques employed during geological characterization.

  1. The regular measurements of the GCR intensity in the stratosphere in comparison with the measurements by the neutron monitors and aboard the IMP8 spacecraft

    NASA Astrophysics Data System (ADS)

    Krainev, M.B.; Belov, A.V.; Gushchina, R.T.; Yanke, V.G.

    We estimate to what extent the neutron monitor and stratospheric GCR data can be used to obtain information on the intensity of GCRs in the so-called medium-energy range (100-500 MeV/n), which is very important for studying the heliosphere and the GCR modulation there. The hourly data of the neutron monitors Apatity (since 1969) and Moscow (since 1958) are used, as well as the standard set of quiet-time daily medium-energy GCR intensities (p, 121-229.5 MeV; He, 168.8-455.5 MeV/n) and the integral count rate for GCR nuclei with E > 80 MeV/n, kindly put at our disposal by Drs. F. McDonald and B. Heikkila (GSFC, USA). As stratospheric data we use the daily and monthly count rates in the Pfotzer maxima at high (Murmansk; geomagnetic cut-off rigidity Rc = 0.6 GV) and middle (Moscow; Rc = 2.3 GV) latitudes, and the difference between these two count rates. Special emphasis is placed upon the long-term trends in all time series studied.

  2. Assessing the continuum of event-based biosurveillance through an operational lens.

    PubMed

    Corley, Courtney D; Lancaster, Mary J; Brigantic, Robert T; Chung, James S; Walters, Ronald A; Arthur, Ray R; Bruckner-Lea, Cynthia J; Calapristi, Augustin; Dowling, Glenn; Hartley, David M; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K; McKenzie, Taylor; Nelson, Noele P; Olsen, Jennifer; Pancerella, Carmen; Quitugua, Teresa N; Reed, Jeremy Todd; Thomas, Carla S

    2012-03-01

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have a significant impact on the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance technologies is unclear. This article frames the continuum of event-based biosurveillance systems (ie, systems that fuse media reports from the internet), models (ie, computational models that forecast disease occurrence), and constructs (ie, descriptive analytical reports) through an operational lens (ie, the aspects and attributes associated with operational considerations in the development, testing, and validation of event-based biosurveillance methods and models and their use in an operational environment). A workshop was held in 2010 to scientifically identify, develop, and vet a set of attributes for event-based biosurveillance. Subject matter experts were invited from 7 federal government agencies and 6 different academic institutions pursuing research in biosurveillance event detection. We describe 8 attribute families for the characterization of event-based biosurveillance: event, readiness, operational aspects, geographic coverage, population coverage, input data, output, and cost. Ultimately, the analyses provide a framework from which the broad scope, complexity, and issues germane to event-based biosurveillance useful in an operational environment can be characterized. PMID:22320664

  3. Spatial gradients of GCR protons in the inner heliosphere derived from Ulysses COSPIN/KET and PAMELA measurements

    NASA Astrophysics Data System (ADS)

    Gieseler, J.; Heber, B.

    2016-05-01

    Context. During the transition from solar cycle 23 to 24, from 2006 to 2009, the Sun was in an unusual solar minimum with very low activity over a long period. These exceptional conditions included a very low interplanetary magnetic field (IMF) strength and a high tilt angle, both of which play an important role in the modulation of galactic cosmic rays (GCR) in the heliosphere. Thus, the radial and latitudinal gradients of GCRs are expected to depend not only on the solar magnetic epoch, but also on the overall modulation level. Aims: We determine the non-local radial and latitudinal gradients of protons in the rigidity range from ~0.45 to 2 GV. Methods: This was accomplished by using data from the satellite-borne experiment Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) at Earth and the Kiel Electron Telescope (KET) onboard Ulysses on its highly inclined Keplerian orbit around the Sun, with aphelion at Jupiter's orbit. Results: In comparison to the previous A > 0 solar magnetic epoch, we find that the absolute value of the latitudinal gradient is lower at higher rigidities and higher at lower rigidities. This energy dependence is therefore a crucial test for models that describe cosmic ray transport in the inner heliosphere.
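
    A common way to separate the two gradients, assumed here for illustration (the paper's exact fitting procedure may differ), is to relate the Ulysses count rate I_U at position (r, θ) to the contemporaneous rate I_E measured by PAMELA at Earth (r_E, θ_E):

      \frac{I_U(r,\theta)}{I_E(r_E,\theta_E)} = \exp\left[G_r\,(r - r_E)\right]\,\exp\left[G_\theta\,(\theta - \theta_E)\right]

    so that fitting ln(I_U/I_E) against the radial and latitudinal separations yields the non-local radial gradient G_r (typically quoted in %/AU) and the latitudinal gradient G_θ (in %/degree) for each rigidity interval.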

  4. In vitro Manganese-Dependent Cross-Talk between Streptococcus mutans VicK and GcrR: Implications for Overlapping Stress Response Pathways

    PubMed Central

    Downey, Jennifer S.; Mashburn-Warren, Lauren; Ayala, Eduardo A.; Senadheera, Dilani B.; Hendrickson, Whitney K.; McCall, Lathan W.; Sweet, Julie G.; Cvitkovitch, Dennis G.; Spatafora, Grace A.; Goodman, Steven D.

    2014-01-01

    Streptococcus mutans, a major acidogenic component of the dental plaque biofilm, has a key role in caries etiology. Previously, we demonstrated that the VicRK two-component signal transduction system modulates biofilm formation, oxidative stress, and acid tolerance responses in S. mutans. Using in vitro phosphorylation assays, here we demonstrate for the first time that, in addition to activating its cognate response regulator protein, the sensor kinase VicK can transphosphorylate a non-cognate stress regulatory response regulator, GcrR, in the presence of manganese. Manganese is an important micronutrient that has previously been correlated with caries incidence, and which serves as an effector of SloR-mediated metalloregulation in S. mutans. Our findings support regulatory effects of manganese on VicRK, GcrR, and SloR, and indicate that the cross-regulatory networks formed by these components are more complex than previously appreciated. Using DNaseI footprinting we observed overlapping DNA binding specificities for VicR and GcrR in native promoters, consistent with these proteins being part of the same transcriptional regulon. Our results also support a role for SloR as a positive regulator of the vicRK two-component signaling system, since its transcription was drastically reduced in a SloR-deficient mutant. These findings demonstrate the regulatory complexities of the S. mutans manganese-dependent response, which involves cross-talk between non-cognate signal transduction systems (VicRK and GcrR) to modulate stress response pathways. PMID:25536343

  5. KlGcr1 controls glucose-6-phosphate dehydrogenase activity and responses to H2O2, cadmium and arsenate in Kluyveromyces lactis.

    PubMed

    Lamas-Maceiras, Mónica; Rodríguez-Belmonte, Esther; Becerra, Manuel; González-Siso, Ma Isabel; Cerdán, Ma Esperanza

    2015-09-01

    It has been previously reported that Gcr1 differentially controls growth and sugar utilization in Saccharomyces cerevisiae and Kluyveromyces lactis, although the regulatory mechanisms causing activation of glycolytic genes are conserved (Neil et al., 2004). We have found that KlGCR1 deletion diminishes glucose consumption and ethanol production in K. lactis, but increases resistance to the oxidative stress caused by H2O2, cadmium, and arsenate, as well as glucose-6-phosphate dehydrogenase activity and the NADPH/NADP(+) and GSH/GSSG ratios. The gene KlZWF1, which encodes glucose-6-phosphate dehydrogenase, the first enzyme in the pentose phosphate pathway, is transcriptionally regulated by KlGcr1. The high resistance to oxidative stress observed in the ΔKlgcr1 mutant strain could be explained as a consequence of an increased flux of glucose through the pentose phosphate pathway. Since mitochondrial respiration decreases in the ΔKlgcr1 mutant (García-Leiro et al., 2010), the reoxidation of the NADPH produced through the pentose phosphate pathway has to be achieved by the reduction of other molecules involved in the defense against oxidative stress, such as GSSG. The higher GSH/GSSG ratio in the mutant would explain its phenotype of increased resistance to oxidative stress. PMID:26164373

  6. Use of glucose consumption rate (GCR) as a tool to monitor and control animal cell production processes in packed-bed bioreactors.

    PubMed

    Meuwly, F; Papp, F; Ruffieux, P-A; Bernard, A R; Kadouri, A; von Stockar, U

    2006-03-01

    For animal cell cultures growing in packed-bed bioreactors where cell number cannot be determined directly, there is a clear need to use indirect methods that are not based on cell counts in order to monitor and control the process. One option is to use the glucose consumption rate (GCR) of the culture as an indirect measure to monitor the process in bioreactors. This study was done on a packed-bed bioreactor process using recombinant CHO cells cultured on Fibra-Cel disk carriers in perfusion mode at high cell densities. A key step in the process is the switch of the process from the cell growth phase to the production phase triggered by a reduction of the temperature. In this system, we have used a GCR value of 300 g of glucose per kilogram of disks per day as a criterion for the switch. This paper will present results obtained in routine operations for the monitoring and control of an industrial process at pilot-scale. The process operated with this GCR-based strategy yielded consistent, reproducible process performance across numerous bioreactor runs performed on multiple production sites. PMID:16153735
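
    As an illustration of how such a switch criterion could be evaluated on-line, here is a short sketch under assumed variable names (the paper does not publish control code); in perfusion mode the consumption rate follows from a glucose balance over the packed bed:

      SWITCH_THRESHOLD = 300.0  # g glucose per kg disks per day, as quoted above

      def glucose_consumption_rate(c_feed, c_harvest, perfusion_rate, disk_mass):
          # c_feed, c_harvest: glucose in the feed and harvest streams [g/L]
          # perfusion_rate: medium exchange rate [L/day]
          # disk_mass: Fibra-Cel disk load of the packed bed [kg]
          return (c_feed - c_harvest) * perfusion_rate / disk_mass

      def switch_to_production(gcr):
          # trigger the temperature reduction once the culture's metabolic
          # activity, proxied by GCR, reaches the set point
          return gcr >= SWITCH_THRESHOLD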

  7. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.
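
    The core idea of isolating faults from qualitative measurement-deviation events can be sketched in a few lines of Python (a toy example; the paper's diagnoser design additionally exploits event ordering and distributes the signatures across local diagnosers):

      # Expected qualitative deviation ('+', '-', '0') of each measurement for
      # each fault, i.e. the fault signatures; the values below are invented
      # for a hypothetical two-tank system.
      SIGNATURES = {
          "leak_tank1":  {"p1": "-", "p2": "-", "flow12": "+"},
          "clog_pipe12": {"p1": "+", "p2": "-", "flow12": "-"},
      }

      def update_hypotheses(hypotheses, measurement, symbol):
          # keep only the faults consistent with the new deviation event
          return {f for f in hypotheses
                  if SIGNATURES[f].get(measurement) == symbol}

      hypotheses = set(SIGNATURES)
      for event in [("p2", "-"), ("flow12", "+")]:  # events arrive asynchronously
          hypotheses = update_hypotheses(hypotheses, *event)
      print(hypotheses)  # -> {'leak_tank1'}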

  8. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    PubMed

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5 years (N = 72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2×2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2 min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory. PMID:27295204

  9. An Observation on the Spontaneous Noticing of Prospective Memory Event-Based Cues

    ERIC Educational Resources Information Center

    Knight, Justin B.; Meeks, J. Thadeus; Marsh, Richard L.; Cook, Gabriel I.; Brewer, Gene A.; Hicks, Jason L.

    2011-01-01

    In event-based prospective memory, current theories make differing predictions as to whether intention-related material can be spontaneously noticed (i.e., noticed without relying on preparatory attentional processes). In 2 experiments, participants formed an intention that was contextually associated to the final phase of the experiment, and…

  10. The Cost of Event-Based Prospective Memory: Salient Target Events

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Hunt, R. Reed; McVay, Jennifer C.; McConnell, Melissa D.

    2007-01-01

    Evidence has begun to accumulate showing that successful performance of event-based prospective memory (PM) comes at a cost to other ongoing activities. The current study builds on previous work by examining the cost associated with PM when the target event is salient. Target salience is among the criteria for automatic retrieval of intentions…

  11. The role of musical training in emergent and event-based timing

    PubMed Central

    Baer, L. H.; Thibodeau, J. L. N.; Gralnick, T. M.; Li, K. Z. H.; Penhune, V. B.

    2013-01-01

    Introduction: Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. Materials and Methods: We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Results: Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. Discussion: These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes. PMID:23717275
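
    For readers unfamiliar with the statistic, a sketch of a windowed lag-one autocorrelation in Python follows; the window length and normalization are illustrative assumptions, and the published method should be consulted for the exact definition:

      import numpy as np

      def wgamma1(intervals, window=5):
          # windowed lag-one autocorrelation of an inter-response-interval
          # series; short windows limit the bias that slow drift introduces
          # into a single global lag-one estimate
          acs = []
          for start in range(len(intervals) - window + 1):
              w = np.asarray(intervals[start:start + window], dtype=float)
              x = w[:-1] - w[:-1].mean()
              y = w[1:] - w[1:].mean()
              denom = np.sqrt((x ** 2).sum() * (y ** 2).sum())
              if denom > 0:
                  acs.append((x * y).sum() / denom)
          # negative values are read as event-based (clock-like) timing,
          # values near zero or positive as emergent timing
          return float(np.mean(acs))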

  12. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    PubMed

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-01-01

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked over classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy. PMID:26633412
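
    The communication saving in such schemes typically rests on a send-on-delta style trigger: a robot transmits its state only when it has drifted far enough from the last transmitted value. A minimal sketch (illustrative, not the paper's exact architecture):

      class EventTrigger:
          def __init__(self, delta):
              self.delta = delta        # error band that fires an event
              self.last_sent = None

          def step(self, measurement):
              # return the measurement if it deviates enough from the last
              # transmitted value, else None (no radio traffic this period)
              if (self.last_sent is None
                      or abs(measurement - self.last_sent) >= self.delta):
                  self.last_sent = measurement
                  return measurement    # controller update is triggered
              return None               # bandwidth saved

    Only the returned events cross the RF link, which is where the reported efficiency gain over fixed-period sampling comes from.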

  13. Event-based prospective memory and executive control of working memory.

    PubMed

    Marsh, R L; Hicks, J L

    1998-03-01

    In 5 experiments, the character of concurrent cognitive processing was manipulated during an event-based prospective memory task. High- and low-load conditions that differed only in the difficulty of the concurrent task were tested in each experiment. In Experiments 1 and 2, attention-demanding tasks from the literature on executive control produced decrements in prospective memory. In Experiment 3, attention was divided by different loads of articulatory suppression that did not ultimately lead to decrements in prospective memory. A high-load manipulation of a visuospatial task requiring performance monitoring resulted in worse prospective memory in Experiment 4, whereas in Experiment 5 a visuospatial task with little monitoring did not. Results are discussed in terms of executive functions, such as planning and monitoring, that appear to be critical to successful event-based prospective memory. PMID:9530843

  14. Use of Unstructured Event-Based Reports for Global Infectious Disease Surveillance

    PubMed Central

    Blench, Michael; Tolentino, Herman; Freifeld, Clark C.; Mandl, Kenneth D.; Mawudeku, Abla; Eysenbach, Gunther; Brownstein, John S.

    2009-01-01

    Free or low-cost sources of unstructured information, such as Internet news and online discussion sites, provide detailed local and near real-time data on disease outbreaks, even in countries that lack traditional public health surveillance. To improve public health surveillance and, ultimately, interventions, we examined 3 primary systems that process event-based outbreak information: Global Public Health Intelligence Network, HealthMap, and EpiSPIDER. Despite similarities among them, these systems are highly complementary because they monitor different data types, rely on varying levels of automation and human analysis, and distribute distinct information. Future development should focus on linking these systems more closely to public health practitioners in the field and establishing collaborative networks for alert verification and dissemination. Such development would further establish event-based monitoring as an invaluable public health resource that provides critical context and an alternative to traditional indicator-based outbreak reporting. PMID:19402953

  16. NASA Space Radiation Program Integrative Risk Model Toolkit

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with opportunities for hands-on demonstrations. In brief: ARRBOD provides organ dose projection and acute radiation risk calculation for exposure to a solar particle event; NSCR projects cancer risk from exposure to space radiation; HemoDose performs retrospective dose estimation from multi-type blood cell counts; GERMcode computes basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS simulates heavy ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI models the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS is an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  17. An Event-Based Solution to the Perspective-n-Point Problem

    PubMed Central

    Reverter Valeiras, David; Kime, Sihem; Ieng, Sio-Hoi; Benosman, Ryad Benjamin

    2016-01-01

    The goal of the Perspective-n-Point problem (PnP) is to find the relative pose between an object and a camera from a set of n pairings between 3D points and their corresponding 2D projections on the focal plane. Current state-of-the-art solutions, designed to operate on images, rely on computationally expensive minimization techniques. For the first time, this work introduces an event-based PnP algorithm designed to work on the output of a neuromorphic event-based vision sensor. The problem is formulated here as a least-squares minimization problem, where the error function is updated with every incoming event. The optimal translation is then computed in closed form, while the desired rotation is given by the evolution of a virtual mechanical system whose energy is proven to be equal to the error function. This allows for a simple yet robust solution of the problem, showing how event-based vision can simplify computer vision tasks. The approach takes full advantage of the high temporal resolution of the sensor, as the estimated pose is incrementally updated with every incoming event. Two approaches are proposed: the Full and the Efficient methods. These two methods are compared against a state-of-the-art PnP algorithm on both synthetic and real data, producing similar accuracy while being faster. PMID:27242412
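
    The flavor of the per-event update can be conveyed with a generic incremental least-squares accumulator (a sketch of the idea only; the paper's error function, closed-form translation, and virtual-mechanical-system rotation update are more elaborate):

      import numpy as np

      class IncrementalLS:
          # accumulate normal equations so that the optimum of the running
          # quadratic error is available in closed form after every event
          def __init__(self, dim):
              self.AtA = np.zeros((dim, dim))
              self.Atb = np.zeros(dim)

          def add_event(self, a_row, b):
              a = np.asarray(a_row, dtype=float)
              self.AtA += np.outer(a, a)   # one linear constraint a . x = b
              self.Atb += a * b

          def solve(self):
              # valid once enough independent events have accumulated
              return np.linalg.solve(self.AtA, self.Atb)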

  18. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    PubMed

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

    The involvement or noninvolvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence the timing mechanisms for repetitive IWFEs in different ways. Sets of IWFEs were analyzed by the windowed (lag-one) autocorrelation, wγ(1), a statistical method recently introduced for the distinction between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, failing to counteract the natural shift of wγ(1) toward positive values as the frequency of movements increases. PMID:25837726

  19. The Effects of Age and Cue-Action Reminders on Event-Based Prospective Memory Performance in Preschoolers

    ERIC Educational Resources Information Center

    Kliegel, Matthias; Jager, Theodor

    2007-01-01

    The present study investigated event-based prospective memory in five age groups of preschoolers (i.e., 2-, 3-, 4-, 5-, and 6-year-olds). Applying a laboratory-controlled prospective memory procedure, the data showed that event-based prospective memory performance improves across the preschool years, at least between 3 and 6 years of age. However,…

  20. Improvement of glucose uptake rate and production of target chemicals by overexpressing hexose transporters and transcriptional activator Gcr1 in Saccharomyces cerevisiae.

    PubMed

    Kim, Daehee; Song, Ji-Yoon; Hahn, Ji-Sook

    2015-12-01

    Metabolic engineering to increase the glucose uptake rate might be beneficial to improve microbial production of various fuels and chemicals. In this study, we enhanced the glucose uptake rate in Saccharomyces cerevisiae by overexpressing hexose transporters (HXTs). Among the 5 tested HXTs (Hxt1, Hxt2, Hxt3, Hxt4, and Hxt7), overexpression of high-affinity transporter Hxt7 was the most effective in increasing the glucose uptake rate, followed by moderate-affinity transporters Hxt2 and Hxt4. Deletion of STD1 and MTH1, encoding corepressors of HXT genes, exerted differential effects on the glucose uptake rate, depending on the culture conditions. In addition, improved cell growth and glucose uptake rates could be achieved by overexpression of GCR1, which led to increased transcription levels of HXT1 and ribosomal protein genes. All genetic modifications enhancing the glucose uptake rate also increased the ethanol production rate in wild-type S. cerevisiae. Furthermore, the growth-promoting effect of GCR1 overexpression was successfully applied to lactic acid production in an engineered lactic acid-producing strain, resulting in a significant improvement of productivity and titers of lactic acid production under acidic fermentation conditions. PMID:26431967

  1. Measuring pesticides in surface waters - continuous versus event-based sampling design

    NASA Astrophysics Data System (ADS)

    Eyring, J.; Bach, M.; Frede, H.-G.

    2009-04-01

    Monitoring pesticides in surface waters is still a labor- and cost-intensive procedure. Therefore, studies are normally carried out with a low monitoring frequency or with only a small selection of substances to be analyzed. In this case, it is not possible to capture the high temporal variability of pesticide concentrations, which depends on application dates, weather conditions, cropping seasons, and other factors. In 2007 the Institute of Landscape Ecology and Resource Management at Giessen University implemented a monitoring program during two pesticide application periods, aiming to produce a detailed dataset of pesticide concentrations for a wide range of substances that would also be suitable for the evaluation of catchment-scale pesticide exposure models. The Weida catchment in Thuringia (Eastern Germany) was selected as the study area due to the availability of detailed pesticide application data for this region. The samples were taken from the river Weida at the gauge Zeulenroda, where it flows into a drinking water reservoir. The catchment area is 102 km²; 67% of the area is in agricultural use, the main crops being winter wheat, maize, winter barley, and winter rape. The dominant soil texture classes are loamy sand and loamy silt. About one third of the agricultural area is drained. The sampling was carried out in cooperation with the water supply agency of Thuringia (Fernwasserversorgung Thueringen). The sample analysis was done by the Institute of Environmental Research at Dortmund University. Two sampling schemes were carried out using two automatic samplers: continuous sampling, with composite samples bottled twice per week, and event-based sampling triggered by a discharge threshold. 53 samples from continuous sampling were collected. 19 discharge events were sampled with 45 individual samples (one to six per event). 34 pesticides and two metabolites were analyzed. 21 compounds were detected, nine of which had concentrations above the drinking water
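
    The event-based scheme amounts to a threshold trigger on discharge. A minimal sketch (the threshold and the one-sample-per-crossing re-arming rule are illustrative assumptions; the actual samplers drew one to six bottles per event):

      def event_sample_times(discharge, threshold):
          # yield time indices at which the automatic sampler bottles a
          # sample: on each upward crossing of the discharge threshold
          armed = True
          for t, q in enumerate(discharge):
              if q > threshold and armed:
                  yield t
                  armed = False   # one trigger per event ...
              elif q <= threshold:
                  armed = True    # ... re-arm once the flow has receded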

  2. Event-based prospective memory deficits in individuals with high depressive symptomatology: problems controlling attentional resources?

    PubMed

    Li, Yanqi Ryan; Loft, Shayne; Weinborn, Michael; Maybery, Murray T

    2014-01-01

    Depression has been found to be related to neurocognitive deficits in areas important to successful prospective memory (PM) performance, including executive function, attention, and retrospective memory. However, research specific to depression and PM has produced a mixed pattern of results. The current study further examined the task conditions in which event-based PM deficits may emerge in individuals with high depressive symptomatology (HDS) relative to individuals with low depressive symptomatology (LDS) and the capacity of HDS individuals to allocate attentional resources to event-based PM tasks. Sixty-four participants (32 HDS, 32 LDS) were required to make a PM response when target words were presented during an ongoing lexical decision task. When the importance of the ongoing task was emphasized, response time costs to the ongoing task, and PM accuracy, did not differ between the HDS and LDS groups. This finding is consistent with previous research demonstrating that event-based PM task accuracy is not always impaired by depression, even when the PM task is resource demanding. When the importance of the PM task was emphasized, costs to the ongoing task further increased for both groups, indicating an increased allocation of attentional resources to the PM task. Crucially, while a corresponding improvement in PM accuracy was observed in the LDS group when the importance of the PM task was emphasized, this was not true for the HDS group. The lack of improved PM accuracy in the HDS group compared with the LDS group despite evidence of increased cognitive resources allocated to PM tasks may have been due to inefficiency in the application of the allocated attention, a dimension likely related to executive function difficulties in depression. Qualitatively different resource allocation patterns may underlie PM monitoring in HDS versus LDS individuals. PMID:24848441

  3. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    NASA Technical Reports Server (NTRS)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material alone. The worst-performing event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and the lessons learned will be discussed.

  4. Event-based plausibility immediately influences on-line language comprehension.

    PubMed

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge. PMID:21517222

  5. Event-based and Multi Agent Control of an Innovative Wheelchair

    NASA Astrophysics Data System (ADS)

    Diomin, U.; Witczak, P.; Stetter, R.

    2015-11-01

    Due to the aging population, more and more people require mobility assistance in the form of a wheelchair. It would generally be desirable for such wheelchairs to be easy to use and to allow their users to move in any direction at any time. Concepts that allow such movements have existed for many years but, for several reasons, have not found their way to the market. Additionally, for semi-autonomous (assisted) operation and fully autonomous operation (e.g., an empty wheelchair driving to its charging station), the control task is much less challenging for such a drive system, because no complex manoeuvres need to be considered and planned. In ongoing research, a drive system for a wheelchair has been developed that offers these possibilities with a relatively simple mechanical design. The drive system uses a steering principle based on torque differences between the wheels, which allows a simple mechanical design but poses challenges for the control of the vehicle. This paper describes two possible approaches to address this challenge: the use of event-based control and the application of multiple software agents. Both approaches can solve the control problem individually but can also complement each other for better system performance. The paper starts with a description of the wheelchair drive system. Then the asynchronous event-based control software is described, as well as the multi-agent approach. The following sections report the results of the experiments and discuss further improvements.

  6. A Markovian event-based framework for stochastic spiking neural networks.

    PubMed

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the inputs it receives, and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce the next spike time from a spike train, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e., one based on the computation of spike times. We show that the firing times of the neurons in the network constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived for such classical cases of neural networks as linear integrate-and-fire neuron models with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks. PMID:21499739
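
    The deterministic skeleton of such an event-based description can be sketched for noiseless leaky integrate-and-fire neurons with instantaneous synapses, where the next spike time has a closed form (all parameters, names, and the stale-event handling below are illustrative assumptions; the paper's contribution is the rigorous treatment of the stochastic case):

      import heapq, math

      TAU, THETA, I_EXT = 20.0, 1.0, 1.2   # assumed units; I_EXT > THETA

      def time_to_spike(v):
          # dv/dt = (I_EXT - v)/TAU reaches THETA from v after this delay
          return TAU * math.log((I_EXT - v) / (I_EXT - THETA))

      def advance(v, dt):
          # membrane potential dt after it was last known to equal v
          return I_EXT + (v - I_EXT) * math.exp(-dt / TAU)

      def simulate(weights, v0, t_end):
          # weights[i][j]: synaptic jump applied to j when i spikes
          # (zero on the diagonal); v0: initial potentials below THETA
          n = len(v0)
          v, t_last = list(v0), [0.0] * n
          queue = [(time_to_spike(v[i]), i) for i in range(n)]
          heapq.heapify(queue)
          spikes = []
          while queue:
              t, i = heapq.heappop(queue)
              if t > t_end:
                  break
              # discard stale predictions made before i received new input
              if advance(v[i], t - t_last[i]) < THETA - 1e-9:
                  continue
              spikes.append((t, i))
              v[i], t_last[i] = 0.0, t                  # reset after spike
              heapq.heappush(queue, (t + time_to_spike(0.0), i))
              for j, w in enumerate(weights[i]):        # deliver the jumps
                  if w:
                      vj = advance(v[j], t - t_last[j]) + w
                      v[j], t_last[j] = min(vj, THETA), t
                      heapq.heappush(queue, (t + time_to_spike(v[j]), j))
          return spikes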

  8. Event-Based Parareal: A data-flow based implementation of parareal

    SciTech Connect

    Berry, Lee A; Elwasif, Wael R; Reynolds-Barredo, J.; Samaddar, D.; Sanchez, R.; Newman, David E; Chen, J.

    2012-01-01

    Parareal is an iterative algorithm that, in effect, achieves temporal decomposition for a time-dependent system of differential or partial differential equations. A solution is obtained in a shorter wall-clock time, but at the expense of increased compute cycles. The algorithm combines a fine solver that solves the system to acceptable accuracy with an approximate coarse solver. The critical task for the successful implementation of parareal on any system is the development of a coarse solver that leads to convergence in a small number of iterations compared to the number of time slices in the full time interval, while at the same time being much faster than the fine solver. Fast coarse solvers may not lead to sufficiently rapid convergence, and slow coarse solvers may not lead to significant gains even if the number of iterations to convergence is satisfactory. We find that the difficulty of meeting these conflicting demands can be substantially eased by using a data-driven, event-based implementation of parareal instead of the conventional algorithm, where solver tasks are executed sequentially. For given convergence properties, the event-based approach relaxes the speed requirements on the coarse solver by a factor of K, where K is the number of iterations required for a converged solution. This may, for many problems, lead to an efficient parareal implementation that would otherwise not be possible or would require substantial coarse solver development.
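
    For reference, a compact sketch of the conventional parareal iteration that the event-based formulation reorganizes (propagator and variable names are illustrative; in the event-based version the same fine and coarse tasks are instead dispatched as their input data become available rather than in synchronized sweeps):

      def parareal(fine, coarse, u0, t_grid, n_iter):
          # fine/coarse: propagators mapping (u, t0, t1) -> u(t1); the fine
          # solves inside each iteration are mutually independent, which is
          # where the parallelism over time slices comes from
          n = len(t_grid) - 1
          u = [u0] * (n + 1)
          for i in range(n):               # initial serial coarse sweep
              u[i + 1] = coarse(u[i], t_grid[i], t_grid[i + 1])
          for _ in range(n_iter):
              f_prev = [fine(u[i], t_grid[i], t_grid[i + 1]) for i in range(n)]
              g_prev = [coarse(u[i], t_grid[i], t_grid[i + 1]) for i in range(n)]
              for i in range(n):           # serial predictor-corrector update
                  g_new = coarse(u[i], t_grid[i], t_grid[i + 1])
                  u[i + 1] = g_new + f_prev[i] - g_prev[i]
          return u

    With K iterations the wall-clock gain requires the coarse solver to be much faster than the fine one; the event-based scheduling cited above relaxes that requirement by the factor K.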

  9. Specific components of the SAGA complex are required for Gcn4- and Gcr1-mediated activation of the his4-912delta promoter in Saccharomyces cerevisiae.

    PubMed Central

    Dudley, A M; Gansheroff, L J; Winston, F

    1999-01-01

    Mutations selected as suppressors of Ty or solo delta insertion mutations in Saccharomyces cerevisiae have identified several genes, SPT3, SPT7, SPT8, and SPT20, that encode components of the SAGA complex. However, the mechanism by which SAGA activates transcription of specific RNA polymerase II-dependent genes is unknown. We have conducted a fine-structure mutagenesis of one widely used SAGA-dependent promoter, the delta element of his4-912delta, to identify sequence elements important for its promoter activity. Our analysis has characterized three delta regions necessary for full promoter activity and accurate start site selection: an upstream activating sequence, a TATA region, and an initiator region. In addition, we have shown that factors present at the adjacent UASHIS4 (Gcn4, Bas1, and Pho2) also activate the delta promoter in his4-912delta. Our results suggest a model in which the delta promoter in his4-912delta is primarily activated by two factors: Gcr1 acting at the UASdelta and Gcn4 acting at the UASHIS4. Finally, we tested whether activation by either of these factors is dependent on components of the SAGA complex. Our results demonstrate that Spt3 and Spt20 are required for full delta promoter activity, but that Gcn5, another member of SAGA, is not required. Spt3 appears to be partially required for activation of his4-912delta by both Gcr1 and Gcn4. Thus, our work suggests that SAGA exerts a large effect on delta promoter activity through a combination of smaller effects on multiple factors. PMID:10101163

  10. An efficient hybrid causative event-based approach for deriving the annual flood frequency distribution

    NASA Astrophysics Data System (ADS)

    Thyer, Mark; Li, Jing; Lambert, Martin; Kuczera, George; Metcalfe, Andrew

    2015-04-01

    Flood extremes are driven by highly variable and complex climatic and hydrological processes. Derived flood frequency methods are often used to predict the flood frequency distribution (FFD) because they can provide predictions in ungauged catchments and evaluate the impact of land-use or climate change. This study presents recent work on the development of a new derived flood frequency method called the hybrid causative events (HCE) approach. The advantage of the HCE approach is that it combines the accuracy of the continuous simulation approach with the computational efficiency of event-based approaches. Derived flood frequency methods can be divided into two classes. Event-based approaches provide fast estimation, but can also lead to prediction bias due to limitations of the inherent assumptions required for obtaining input information (rainfall and catchment wetness) for events that cause large floods. Continuous simulation produces more accurate predictions, however, at the cost of massive computational time. The HCE method uses a short continuous simulation to provide inputs for a rainfall-runoff model running in an event-based fashion. A proof-of-concept pilot study showed that the HCE produces estimates of the flood frequency distribution with similar accuracy to continuous simulation, but with dramatically reduced computation time. Recent work incorporated seasonality into the HCE approach and evaluated it with a more realistic set of eight sites from a wide range of climate zones, typical of Australia, using a virtual catchment approach. The seasonal hybrid-CE provided accurate predictions of the FFD for all sites. Comparison with the existing non-seasonal hybrid-CE showed that for some sites the non-seasonal hybrid-CE significantly over-predicted the FFD. Analysis of the underlying cause of whether a site had a high, low or no need to use seasonality found that it was based on a combination of reasons that were difficult to predict a priori. Hence it is recommended
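
    The general idea of deriving an FFD from sampled causative events can be illustrated with the toy Monte Carlo sketch below; the runoff function, storm statistics and wetness distribution are invented stand-ins, not the paper's calibrated models, and the continuous-simulation step is reduced to a placeholder sampler.

        import numpy as np

        rng = np.random.default_rng(1)

        def event_runoff(rain_depth, wetness):
            # Toy event-based rainfall-runoff model (hypothetical):
            # peak runoff grows with event depth and antecedent wetness.
            return rain_depth * (0.2 + 0.6 * wetness)

        # A short continuous simulation would supply the antecedent wetness for
        # each causative event; here a Beta distribution stands in for it.
        n_years = 2000
        annual_peaks = []
        for _ in range(n_years):
            depths = rng.gamma(2.0, 20.0, size=rng.poisson(8) + 1)   # storm depths (mm)
            wetness = rng.beta(2.0, 2.0, size=depths.size)           # antecedent wetness
            annual_peaks.append(max(event_runoff(d, w) for d, w in zip(depths, wetness)))

        # Empirical flood frequency distribution: peak vs annual exceedance probability.
        peaks = np.sort(annual_peaks)[::-1]
        aep = (np.arange(peaks.size) + 1) / (peaks.size + 1)
        print(peaks[aep < 0.01][-1])   # rough 1-in-100-year flood estimate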

  11. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    PubMed

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where synchronization occurs within one group while there is no synchronization among different groups. In this paper, pinning cluster synchronization in an array of coupled neural networks is studied under an event-based mechanism. A new event-triggered sampled-data transmission strategy, in which only local and event-triggering states are utilized to update the broadcasting state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants; this reduces the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results. PMID:26829603
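
    As a rough illustration of the event-triggered transmission idea (not the paper's pinning controller or its stability analysis), the sketch below lets linearly coupled agents re-broadcast their states only when the local state drifts from the last broadcast value by more than a threshold; the coupling matrix, threshold and initial states are invented.

        import numpy as np

        def simulate(A, x0, delta=0.05, dt=0.01, steps=500):
            """Event-triggered broadcasting: each agent updates its broadcast
            state only when |x_i - x_hat_i| exceeds delta, instead of at
            every sampling instant."""
            x = x0.copy()
            x_hat = x0.copy()              # last broadcast states
            n_events = 0
            for _ in range(steps):
                x = x + dt * (A @ x_hat)   # agents couple through broadcast values
                trig = np.abs(x - x_hat) > delta
                x_hat[trig] = x[trig]      # event: re-broadcast the state
                n_events += trig.sum()
            return x, n_events

        # Two clusters of a 4-agent network (illustrative coupling matrix).
        A = np.array([[-2.,  2.,  0.,  0.],
                      [ 2., -2.,  0.,  0.],
                      [ 0.,  0., -2.,  2.],
                      [ 0.,  0.,  2., -2.]])
        x, n = simulate(A, np.array([1.0, -1.0, 3.0, 2.0]))
        print(x, n)   # states agree within clusters; far fewer than 4*500 events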

  12. Event-Based Study of the Effect of Execution Environments on Parallel Program Performance

    NASA Technical Reports Server (NTRS)

    Sarukkai, Sekhar R.; Yan, Jerry C.; Craw, James (Technical Monitor)

    1995-01-01

    In this paper we seek to demonstrate the importance of studying the effect of changes in execution environment parameters on parallel applications executed on state-of-the-art multiprocessors. A comprehensive methodology for event-based analysis of program behavior is introduced. This methodology is used to study the performance significance of various system parameters such as processor speed, message-buffer size, buffer copy speed, network bandwidth, communication latency, interrupt overheads and other system parameters. With the help of a few CFD examples, we illustrate the use of our technique in determining suitable parameter values of the execution environment for three applications. We also demonstrate how this approach can be used to predict performance across architectures and illustrate the use of visual and profile-like feedback to expose the effect of system parameter changes on the performance of specific application modules.

  13. Event based self-supervised temporal integration for multimodal sensor data.

    PubMed

    Barakova, Emilia I; Lourens, Tino

    2005-06-01

    A method for synergistic integration of multimodal sensor data is proposed in this paper. This method is based on two aspects of the integration process: (1) achieving synergistic integration of two or more sensory modalities, and (2) fusing the various information streams at particular moments during processing. Inspired by psychophysical experiments, we propose a self-supervised learning method for achieving synergy with combined representations. Evidence from temporal registration and binding experiments indicates that different cues are processed individually at specific time intervals. Therefore, an event-based temporal co-occurrence principle is proposed for the integration process. This integration method was applied to a mobile robot exploring unfamiliar environments. Simulations showed that integration enhanced recognition of routes with many perceptual similarities; moreover, they indicate that, within the perceptual hierarchy, knowledge about instantaneous movement contributes significantly to short-term navigation, but that visual perceptions have a bigger impact over longer intervals. PMID:15988800

  14. Comparing Event-Based Storage across Tropical Land-Cover Gradients in the Panama Canal Watershed

    NASA Astrophysics Data System (ADS)

    Litt, G.; Gardner, C.; Ogden, F. L.; Lyons, W. B.

    2015-12-01

    Quantifying watershed storage is a difficult undertaking, yet storage serves as a critical state variable for understanding catchment runoff processes and for modeling applications. Hydrograph recession rates have traditionally been used to characterize relative watershed storage characteristics, but may neglect the different flowpaths taken during recession periods. Separating event-based hydrograph recession into 'old' and 'new' water can help distinguish between different catchment storages (Stewart, 2015). By coupling geochemical tracer-based hydrograph separation with post-storm recession rates in humid tropical catchments (<185 ha), we test the hypothesis that both 'old' and 'new' water stores are greater in forested areas relative to disturbed land covers. Stewart, M. K. (2015), Promising new baseflow separation and recession analysis methods applied to streamflow at Glendhu Catchment, New Zealand. Hydrol. Earth Syst. Sci., 19(6), 2587-2603.

  15. Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Roychoudhury, Indranil

    2010-01-01

    We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition, in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced and lessons learned from implementing the approach.
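
    The deviation-matching idea can be illustrated with a small sketch; the fault signatures and measurement names below are hypothetical, not those of the competition testbed.

        # Sketch of qualitative event-based fault isolation: each fault predicts
        # a sequence of qualitative deviations (measurement, direction), and
        # candidates are pruned as observed deviation events arrive.
        FAULT_SIGNATURES = {                       # hypothetical fault models
            "battery_degraded": [("voltage", "-"), ("current", "-")],
            "wiring_fault":     [("voltage", "-"), ("load_power", "-")],
            "sensor_bias":      [("voltage", "+")],
        }

        def isolate(observed_events):
            """Keep faults whose predicted deviation sequence starts with the
            observed sequence of deviations."""
            k = len(observed_events)
            return [f for f, sig in FAULT_SIGNATURES.items()
                    if sig[:k] == observed_events]

        print(isolate([("voltage", "-")]))                    # two candidates remain
        print(isolate([("voltage", "-"), ("current", "-")]))  # fault isolated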

  16. Event-based surveillance in north-western Ethiopia: experience and lessons learnt in the field.

    PubMed

    Toyama, Yumi; Ota, Masaki; Beyene, Belay Bezabih

    2015-01-01

    This study piloted an event-based surveillance system at the health centre (HC) level in Ethiopia. The system collects rumours in the community and registers them in rumour logbooks to record events of disease outbreaks and public health emergencies. Descriptive analysis was conducted on the events captured at the 59 study HCs in the Amhara Region in north-western Ethiopia between October 2013 and November 2014. A total of 126 rumours were registered at two thirds of the HCs during the study period. The average event reporting time was 3.8 days; the response time of the HCs was 0.6 days, resulting in a total response time of 4.4 days. The most commonly reported rumours were measles-related (n = 90, 71%). These rumours followed a similar pattern to the measles cases reported in the routine surveillance system. The largest proportion of rumours was reported by community members (n = 38, 36%), followed by health post workers (n = 36, 29%), who were normally informed about the rumours by the community members. This surveillance system was established alongside an existing indicator-based surveillance system and was simple to implement. The implementation cost was minimal, requiring only printing and distribution of rumour logbooks to the HCs and brief orientations to focal persons. In countries where routine surveillance is still weak, an event-based surveillance system similar to this one should be considered as a supplementary tool for disease monitoring. PMID:26668763

  17. Improvement of hydrological flood forecasting through an event based output correction method

    NASA Astrophysics Data System (ADS)

    Klotz, Daniel; Nachtnebel, Hans Peter

    2014-05-01

    This contribution presents an output correction method for hydrological models. A conceptualisation of the method is presented and tested in an alpine basin in Salzburg, Austria. The aim is to develop a method which is not prone to the drawbacks of autoregressive models. Output correction methods are an attractive option for improving hydrological predictions. They are complementary to the main modelling process and do not interfere with the modelling process itself. In general, output correction models estimate the future error of a prediction and use the estimate to improve the given prediction. Different estimation techniques are available depending on the information utilized and the estimation procedure itself. Autoregressive error models are widely used for such corrections. Autoregressive models with exogenous inputs (ARX) allow the use of additional information for the error modelling, e.g. measurements from upper basins or predicted input signals. Autoregressive models do, however, exhibit deficiencies, since the errors of hydrological models do not generally behave in an autoregressive manner: the decay of the error usually differs from an autoregressive function, and the residuals exhibit different patterns under different circumstances. For example, one might consider the different error-propagation behaviours under high-flow, low-flow or snowmelt-driven conditions. This contribution presents a conceptualisation of an event-based correction model and focuses on flood events only. The correction model uses information about the history of the residuals and exogenous variables to give an error estimate. The structure and parameters of the correction models can be adapted to given event classes. An event class is a set of flood events that exhibit a similar pattern for the residuals or the hydrological conditions. In total, four different event classes have been identified in this study. Each of them represents a different
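
    For concreteness, the following sketch shows the kind of autoregressive output correction that this contribution takes as its starting point, here a plain AR(1) error model; the event-based refinement would instead fit a separate structure and parameter set per event class. The discharge values are invented.

        import numpy as np

        def ar1_corrected_forecast(model_preds, obs, phi=None):
            """Baseline autoregressive output correction: estimate the AR(1)
            coefficient of past errors and add the propagated error estimate
            to the next model prediction."""
            errors = obs - model_preds[: len(obs)]          # past residuals
            if phi is None:                                 # fit e_t ~ phi * e_{t-1}
                phi = np.sum(errors[1:] * errors[:-1]) / np.sum(errors[:-1] ** 2)
            e_hat = phi * errors[-1]                        # one-step error forecast
            return model_preds[len(obs)] + e_hat, phi

        # Hypothetical hourly discharge: the model underestimates persistently.
        preds = np.array([10., 12., 15., 20., 26.])
        obs   = np.array([11., 13.5, 17., 22.5])            # last step not yet observed
        corrected, phi = ar1_corrected_forecast(preds, obs)
        print(round(phi, 2), round(corrected, 1))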

  18. Fluence-based and microdosimetric event-based methods for radiation protection in space

    NASA Technical Reports Server (NTRS)

    Curtis, Stanley B.; Meinhold, C. B. (Principal Investigator)

    2002-01-01

    The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report #137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/LET method. It was concluded that, for the present, there is no compelling reason to switch to a new methodology. It is suggested, however, that because of certain drawbacks in the presently used conventional method, these alternative methodologies should be kept in mind; as new data become available and dosimetric techniques become more refined, the question should be revisited, and significant improvement might be realized in the future. In addition, such concepts as equivalent dose and organ dose equivalent are discussed and various problems regarding the measurement/estimation of these quantities are presented.
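
    As background, here is a minimal sketch of the conventional quality factor/LET method mentioned above, using the ICRP 60 Q(L) relationship; the LET components and absorbed doses are invented numbers, not measured spectra.

        import numpy as np

        def quality_factor(let):
            """ICRP 60 quality factor Q as a function of unrestricted LET
            (keV/um in water), the basis of the conventional method."""
            if let < 10:
                return 1.0
            if let <= 100:
                return 0.32 * let - 2.2
            return 300.0 / np.sqrt(let)

        # Dose equivalent H = sum_i Q(L_i) * D_i over LET components of the field.
        lets  = np.array([0.5, 25.0, 150.0])    # hypothetical GCR LET components
        doses = np.array([0.8, 0.15, 0.05])     # absorbed dose per component (mGy)
        H = sum(quality_factor(l) * d for l, d in zip(lets, doses))
        print(round(H, 2), "mSv")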

  19. On-Board Event-Based State Estimation for Trajectory Approaching and Tracking of a Vehicle

    PubMed Central

    Martínez-Rey, Miguel; Espinosa, Felipe; Gardel, Alfredo; Santos, Carlos

    2015-01-01

    For the problem of pose estimation of an autonomous vehicle using networked external sensors, the processing capacity and battery consumption of these sensors, as well as the communication channel load should be optimized. Here, we report an event-based state estimator (EBSE) consisting of an unscented Kalman filter that uses a triggering mechanism based on the estimation error covariance matrix to request measurements from the external sensors. This EBSE generates the events of the estimator module on-board the vehicle and, thus, allows the sensors to remain in stand-by mode until an event is generated. The proposed algorithm requests a measurement every time the estimation distance root mean squared error (DRMS) value, obtained from the estimator's covariance matrix, exceeds a threshold value. This triggering threshold can be adapted to the vehicle's working conditions rendering the estimator even more efficient. An example of the use of the proposed EBSE is given, where the autonomous vehicle must approach and follow a reference trajectory. By making the threshold a function of the distance to the reference location, the estimator can halve the use of the sensors with a negligible deterioration in the performance of the approaching maneuver. PMID:26102489
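
    A minimal sketch of the triggering rule described in the abstract follows; the state ordering and covariance values are assumptions for illustration.

        import numpy as np

        def needs_measurement(P, drms_threshold):
            """Event trigger of the EBSE as described above: request an
            external measurement when the distance RMS error derived from
            the position block of the covariance matrix P exceeds a threshold."""
            drms = np.sqrt(P[0, 0] + P[1, 1])    # sqrt of x/y position variances
            return drms > drms_threshold

        # During prediction-only operation the covariance grows over time.
        P = np.diag([0.04, 0.04, 0.01, 0.01])    # assumed [x, y, vx, vy] covariance
        print(needs_measurement(P, drms_threshold=0.5))   # False: sensors stay idle
        P[0, 0] = P[1, 1] = 0.2
        print(needs_measurement(P, drms_threshold=0.5))   # True: fire an event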

  20. Event-Based Surveillance During EXPO Milan 2015: Rationale, Tools, Procedures, and Initial Results

    PubMed Central

    Manso, Martina Del; Caporali, Maria Grazia; Napoli, Christian; Linge, Jens P.; Mantica, Eleonora; Verile, Marco; Piatti, Alessandra; Pompa, Maria Grazia; Vellucci, Loredana; Costanzo, Virgilio; Bastiampillai, Anan Judina; Gabrielli, Eugenia; Gramegna, Maria; Declich, Silvia

    2016-01-01

    More than 21 million participants attended EXPO Milan from May to October 2015, making it one of the largest protracted mass gathering events in Europe. Given the expected national and international population movement and health security issues associated with this event, Italy fully implemented, for the first time, an event-based surveillance (EBS) system focusing on naturally occurring infectious diseases and the monitoring of biological agents with potential for intentional release. The system started its pilot phase in March 2015 and was fully operational between April and November 2015. In order to set the specific objectives of the EBS system, and its complementary role to indicator-based surveillance, we defined a list of priority diseases and conditions. This list was designed on the basis of the probability and possible public health impact of infectious disease transmission, existing statutory surveillance systems in place, and any surveillance enhancements during the mass gathering event. This article reports the methodology used to design the EBS system for EXPO Milan and the results of 8 months of surveillance. PMID:27314656

  1. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    PubMed Central

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, only individual sensors are usually located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, making it desirable to reduce the number of commutations of the control signals from both safety and economic points of view. Therefore, and in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597
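
    One common event-based sampling scheme that fits this wireless-sensor setting is send-on-delta transmission, sketched below for a temperature node; the signal and the threshold delta are invented, and this is not the paper's specific control system.

        import numpy as np

        def send_on_delta(samples, delta):
            """Send-on-delta sampling: a node transmits only when the
            measurement moves more than delta away from the last
            transmitted value, saving radio traffic and battery."""
            last = samples[0]
            events = [(0, last)]                 # always send the first sample
            for k, y in enumerate(samples[1:], start=1):
                if abs(y - last) >= delta:
                    last = y
                    events.append((k, y))
            return events

        # Slowly varying greenhouse temperature: few transmissions are needed.
        t = np.linspace(0, 12, 720)                      # one sample per minute
        temp = 22 + 3 * np.sin(2 * np.pi * t / 24)       # hypothetical signal (degC)
        events = send_on_delta(temp, delta=0.25)
        print(len(events), "transmissions instead of", temp.size)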

  2. A Review of Evaluations of Electronic Event-Based Biosurveillance Systems

    PubMed Central

    Gajewski, Kimberly N.; Peterson, Amy E.; Chitale, Rohit A.; Pavlin, Julie A.; Russell, Kevin L.; Chretien, Jean-Paul

    2014-01-01

    Electronic event-based biosurveillance systems (EEBS’s) that use near real-time information from the internet are an increasingly important source of epidemiologic intelligence. However, there has not been a systematic assessment of EEBS evaluations, which could identify key uncertainties about current systems and guide EEBS development to most effectively exploit web-based information for biosurveillance. To conduct this assessment, we searched PubMed and Google Scholar to identify peer-reviewed evaluations of EEBS’s. We included EEBS’s that use publicly available internet information sources, cover events that are relevant to human health, and have global scope. To assess the publications using a common framework, we constructed a list of 17 EEBS attributes from published guidelines for evaluating health surveillance systems. We identified 11 EEBS’s and 20 evaluations of these EEBS’s. The number of published evaluations per EEBS ranged from 1 (Gen-Db, GODsN, MiTAP) to 8 (GPHIN, HealthMap). The median number of evaluation variables assessed per EEBS was 8 (range, 3–15). Ten published evaluations contained quantitative assessments of at least one key variable. No evaluations examined usefulness by identifying specific public health decisions, actions, or outcomes resulting from EEBS outputs. Future EEBS assessments should identify and discuss critical indicators of public health utility, especially the impact of EEBS’s on public health response. PMID:25329886

  3. On-Board Event-Based State Estimation for Trajectory Approaching and Tracking of a Vehicle.

    PubMed

    Martínez-Rey, Miguel; Espinosa, Felipe; Gardel, Alfredo; Santos, Carlos

    2015-01-01

    For the problem of pose estimation of an autonomous vehicle using networked external sensors, the processing capacity and battery consumption of these sensors, as well as the communication channel load should be optimized. Here, we report an event-based state estimator (EBSE) consisting of an unscented Kalman filter that uses a triggering mechanism based on the estimation error covariance matrix to request measurements from the external sensors. This EBSE generates the events of the estimator module on-board the vehicle and, thus, allows the sensors to remain in stand-by mode until an event is generated. The proposed algorithm requests a measurement every time the estimation distance root mean squared error (DRMS) value, obtained from the estimator's covariance matrix, exceeds a threshold value. This triggering threshold can be adapted to the vehicle's working conditions rendering the estimator even more efficient. An example of the use of the proposed EBSE is given, where the autonomous vehicle must approach and follow a reference trajectory. By making the threshold a function of the distance to the reference location, the estimator can halve the use of the sensors with a negligible deterioration in the performance of the approaching maneuver. PMID:26102489

  4. Assessment of Community Event-Based Surveillance for Ebola Virus Disease, Sierra Leone, 2015.

    PubMed

    Ratnayake, Ruwan; Crowe, Samuel J; Jasperse, Joseph; Privette, Grayson; Stone, Erin; Miller, Laura; Hertz, Darren; Fu, Clementine; Maenner, Matthew J; Jambai, Amara; Morgan, Oliver

    2016-08-01

    In 2015, community event-based surveillance (CEBS) was implemented in Sierra Leone to assist with the detection of Ebola virus disease (EVD) cases. We assessed the sensitivity of CEBS for finding EVD cases during a 7-month period, and in a 6-week subanalysis, we assessed the timeliness of reporting cases with no known epidemiologic links at time of detection. Of the 12,126 CEBS reports, 287 (2%) met the suspected case definition, and 16 were confirmed positive. CEBS detected 30% (16/53) of the EVD cases identified during the study period. During the subanalysis, CEBS staff identified 4 of 6 cases with no epidemiologic links. These CEBS-detected cases were identified more rapidly than those detected by the national surveillance system; however, too few cases were detected to determine system timeliness. Although CEBS detected EVD cases, it largely generated false alerts. Future versions of community-based surveillance could improve case detection through increased staff training and community engagement. PMID:27434608

  5. Selection of intense rainfall events based on intensity thresholds and lightning data in Switzerland

    NASA Astrophysics Data System (ADS)

    Gaal, L.; Molnar, P.; Szolgay, J.

    2014-01-01

    This paper presents a method to identify intense warm season storms of convective character based on intensity thresholds and lightning, and analyzes their statistical properties. Long records of precipitation and lightning data at 4 stations and 10 min resolution in different climatological regions in Switzerland are used. Our premise is that thunderstorms associated with lightning generate bursts of high rainfall intensity. We divided all storms into those accompanied by lightning and those without lightning and found the threshold I* that separates intense events based on peak 10 min intensity Ip ≥ I* for a chosen misclassification rate α. The performance and robustness of the selection method was tested by investigating the inter-annual variability of I* and its relation to the frequency of lightning strikes. The probability distributions of the main storm properties (rainfall depth R, event duration D, average storm intensity Ia and peak 10 min intensity Ip) for the intense storm subsets show that the event average and peak intensities are significantly different between the stations, and highest in Lugano in southern Switzerland. Non-parametric correlations between the main storm properties were estimated for the subsets of intense storms and all storms including stratiform rain. The differences in the correlations between storm subsets are greater than those between stations, which indicates that care must be exercised not to mix events when they are sampled for multivariate analysis, e.g. copula fitting to rainfall data.

  6. Selection of intense rainfall events based on intensity thresholds and lightning data in Switzerland

    NASA Astrophysics Data System (ADS)

    Gaál, L.; Molnar, P.; Szolgay, J.

    2014-05-01

    This paper presents a method to identify intense warm season storms with convective character based on intensity thresholds and the presence of lightning, and analyzes their statistical properties. Long records of precipitation and lightning data at 4 stations and 10 min resolution in different climatological regions in Switzerland are used. Our premise is that thunderstorms associated with lightning generate bursts of high rainfall intensity. We divided all recorded storms into those accompanied by lightning and those without lightning and found the threshold I* that separates intense events based on peak 10 min intensity Ip ≥ I* for a chosen misclassification rate α. The performance and robustness of the selection method was tested by investigating the inter-annual variability of I* and its relation to the frequency of lightning strikes. The probability distributions of the main storm properties (rainfall depth R, event duration D, average storm intensity Ia and peak 10 min intensity Ip) for the intense storm subsets show that the event average and peak intensities are significantly different between the stations. Non-parametric correlations between the main storm properties were estimated for intense storms and all storms including stratiform rain. The differences in the correlations between storm subsets are greater than those between stations, which indicates that care must be exercised not to mix events of different origin when they are sampled for multivariate analysis, for example, copula fitting to rainfall data.
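
    One plausible reading of the threshold selection rule is sketched below: choose I* as the alpha-quantile of the peak intensities of lightning-accompanied storms, so that a fraction alpha of them falls below the threshold (the misclassification rate). The intensity distributions are synthetic placeholders, not the Swiss station data.

        import numpy as np

        def intensity_threshold(ip_lightning, alpha=0.05):
            # I*: alpha-quantile of peak 10-min intensities of lightning storms;
            # storms with Ip >= I* are then flagged as intense.
            return np.quantile(ip_lightning, alpha)

        # Hypothetical peak 10-min intensities (mm/h) for the two storm groups.
        rng = np.random.default_rng(7)
        ip_lightning = rng.lognormal(mean=3.0, sigma=0.5, size=400)
        ip_no_lightning = rng.lognormal(mean=1.8, sigma=0.6, size=400)

        i_star = intensity_threshold(ip_lightning, alpha=0.05)
        intense = ip_no_lightning >= i_star      # non-lightning storms flagged intense
        print(round(i_star, 1), intense.mean())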

  7. Hydrologic Modeling in the Kenai River Watershed using Event Based Calibration

    NASA Astrophysics Data System (ADS)

    Wells, B.; Toniolo, H. A.; Stuefer, S. L.

    2015-12-01

    Understanding hydrologic changes is key for preparing for possible future scenarios. On the Kenai Peninsula in Alaska the yearly salmon runs provide a valuable stimulus to the economy. The runs are the focus of a large commercial fishing fleet, but also a prime tourist attraction. Modeling of anadromous waters provides a tool that assists in the prediction of future salmon run size. Beaver Creek, in Kenai, Alaska, is a lowland stream that has been modeled using the Army Corps of Engineers event-based modeling package HEC-HMS. With the use of historic precipitation and discharge data, the model was calibrated to observed discharge values. The hydrologic parameters were measured in the field or calculated, while soil parameters were estimated and adjusted during the calibration. With the calibrated parameters for HEC-HMS, discharge estimates can be used by other researchers studying the area and can help guide communities and officials to make better-educated decisions regarding the changing hydrology in the area and the tied economic drivers.

  8. Event-Based Surveillance During EXPO Milan 2015: Rationale, Tools, Procedures, and Initial Results.

    PubMed

    Riccardo, Flavia; Manso, Martina Del; Caporali, Maria Grazia; Napoli, Christian; Linge, Jens P; Mantica, Eleonora; Verile, Marco; Piatti, Alessandra; Pompa, Maria Grazia; Vellucci, Loredana; Costanzo, Virgilio; Bastiampillai, Anan Judina; Gabrielli, Eugenia; Gramegna, Maria; Declich, Silvia

    2016-01-01

    More than 21 million participants attended EXPO Milan from May to October 2015, making it one of the largest protracted mass gathering events in Europe. Given the expected national and international population movement and health security issues associated with this event, Italy fully implemented, for the first time, an event-based surveillance (EBS) system focusing on naturally occurring infectious diseases and the monitoring of biological agents with potential for intentional release. The system started its pilot phase in March 2015 and was fully operational between April and November 2015. In order to set the specific objectives of the EBS system, and its complementary role to indicator-based surveillance, we defined a list of priority diseases and conditions. This list was designed on the basis of the probability and possible public health impact of infectious disease transmission, existing statutory surveillance systems in place, and any surveillance enhancements during the mass gathering event. This article reports the methodology used to design the EBS system for EXPO Milan and the results of 8 months of surveillance. PMID:27314656

  9. Assessing the Continuum of Event-Based Biosurveillance Through an Operational Lens

    SciTech Connect

    Corley, Courtney D.; Lancaster, Mary J.; Brigantic, Robert T.; Chung, James S.; Walters, Ronald A.; Arthur, Ray; Bruckner-Lea, Cindy J.; Calapristi, Augustin J.; Dowling, Glenn; Hartley, David M.; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K.; McKenzie, Taylor K.; Nelson, Noele P.; Olsen, Jennifer; Pancerella, Carmen M.; Quitugua, Teresa N.; Reed, Jeremy T.; Thomas, Carla S.

    2012-03-28

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have significant impact in the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance (EBB) technologies is unclear. This manuscript frames the continuum of EBB methods, models, and constructs through an operational lens (i.e., aspects and attributes associated with operational considerations in the development, testing, and validation of the EBB methods and models and their use in an operational environment). A 2-day subject matter expert workshop was held to scientifically identify, develop, and vet a set of attributes for the broad range of such operational considerations. Workshop participants identified and described comprehensive attributes for the characterization of EBB. The identified attributes are: (1) event, (2) readiness, (3) operational aspects, (4) geographic coverage, (5) population coverage, (6) input data, (7) output, and (8) cost. Ultimately, the analyses herein discuss the broad scope, complexity, and relevant issues germane to EBB useful in an operational environment.

  10. Too exhausted to remember: ego depletion undermines subsequent event-based prospective memory.

    PubMed

    Li, Jian-Bin; Nie, Yan-Gang; Zeng, Min-Xia; Huntoon, Meghan; Smith, Jessi L

    2013-01-01

    Past research has consistently found that people are likely to do worse on high-level cognitive tasks after exerting self-control on previous actions. However, little is known about the extent to which ego depletion affects subsequent prospective memory. Drawing upon the self-control strength model and the relationship between self-control resources and executive control, this study proposes that initial acts of self-control may undermine subsequent event-based prospective memory (EBPM). Ego depletion was manipulated through watching a video requiring visual attention (Experiment 1) or completing an incongruent Stroop task (Experiment 2). Participants were then tested on EBPM embedded in an ongoing task. As predicted, the results showed that after ruling out possible intervening variables (e.g. mood, focal and nonfocal cues, and characteristics of the ongoing task and ego depletion task), participants in the high-depletion condition performed significantly worse on EBPM than those in the low-depletion condition. The results suggested that the effect of ego depletion on EBPM was mainly due to an impaired prospective component rather than to a retrospective component. PMID:23432682

  11. Can readers ignore implausibility? Evidence for nonstrategic monitoring of event-based plausibility in language comprehension.

    PubMed

    Isberner, Maj-Britt; Richter, Tobias

    2013-01-01

    We present evidence for a nonstrategic monitoring of event-based plausibility during language comprehension by showing that readers cannot ignore the implausibility of information even if it is detrimental to the task at hand. In two experiments using a Stroop-like paradigm, participants were required to provide positive and negative responses independent of plausibility in an orthographical task (Experiment 1) or a nonlinguistic color judgment task (Experiment 2) to target words that were either plausible or implausible in their context. We expected a nonstrategic assessment of plausibility to interfere with positive responses to implausible words. ANOVAs and linear mixed models analyses of the response latencies revealed a significant interaction of plausibility and required response that supported this prediction in both experiments, despite the use of two very different tasks. Moreover, it could be shown that the effect was not driven by the differential predictability of plausible and implausible words. These results suggest that plausibility monitoring is an inherent component of information processing. PMID:23165201

  12. An event-based approach for examining the effects of wildland fire decisions on communities.

    PubMed

    McCool, Stephen F; Burchfield, James A; Williams, Daniel R; Carroll, Matthew S

    2006-04-01

    Public concern over the consequences of forest fire to wildland interface communities has led to increased resources devoted to fire suppression, fuel treatment, and management of fire events. The social consequences of the decisions involved in these and other fire-related actions are largely unknown, except in an anecdotal sense, but do occur at a variety of temporal and social organizational scales. These consequences are not limited to the fire event itself. Preparation for the possibility of a fire, actions that suppression agencies take during a fire, and postfire decisions all have consequences, even if currently unknown. This article presents an "event-based" approach that can be useful for constructing a systematic discussion about the consequences of wildland fire to human communities. For each of the three major periods within this approach, agencies, communities, and individuals make decisions and take actions that have consequences. The article presents an integrated, temporally based process for examining these consequences, which is similar to others developed in the natural hazards and disaster management literature. PMID:16465562

  13. A coupled model of TiN inclusion growth in GCr15SiMn during solidification in the electroslag remelting process

    NASA Astrophysics Data System (ADS)

    Yang, Liang; Cheng, Guo-guang; Li, Shi-jian; Zhao, Min; Feng, Gui-ping; Li, Tao

    2015-12-01

    TiN inclusions observed in an ingot produced by electroslag remelting (ESR) are extremely harmful to GCr15SiMn steel. Therefore, accurate predictions of the growth size of these inclusions during steel solidification are significant for clean ESR ingot production. On the basis of our previous work, a coupled model of solute microsegregation and TiN inclusion growth during solidification has been established. The results demonstrate that, compared to a non-coupled model, the coupled model's predictions of the size of TiN inclusions are in good agreement with experimental results obtained using scanning electron microscopy with energy dispersive spectroscopy (SEM-EDS). Because of the high cooling rate, the sizes of TiN inclusions in the edge area of the ingots are relatively small compared to the sizes in the center area. During the ESR process, controlling the content of Ti in the steel is a feasible and effective method of decreasing the sizes of TiN inclusions.

  14. Advanced Science/Event-based Data Service Framework at GES DISC

    NASA Astrophysics Data System (ADS)

    Shie, C. L.; Shen, S.; Kempler, S. J.

    2014-12-01

    The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has provided Earth science data, information, and services to research communities and the general public for decades. Maintaining a high overall quality of service, including serving our users with advanced data services, has been our primary goal. We are developing an advanced science/event-based data service framework. The framework aims to provide users with a sophisticated, integrated data package via user-friendly discovery and selection of a system-preset science/event topic (e.g., hurricane, volcano, etc.) from the framework's in-development knowledge database. A data recipe page related to the Hurricane topic has been developed to demonstrate the concept. More showcases on subjects such as volcanoes, dust storms, and forest fires are also under development. This framework is being developed on top of existing data services at GES DISC, such as Mirador (data search engine), Giovanni (visualization), OPeNDAP, and data recipes. It also involves other data tools, such as Panoply, GrADS, IDL, etc. The Hurricane Sandy (Oct 22-31 2012) event is used here for a sample description. With Hurricane Sandy selected as a use case, a table is provided containing nine system-preset data variables (i.e., precipitation, winds, sea surface temperature, sea level pressure, air temperature, relative humidity, aerosols, soil moisture and surface runoff, and trace gases) linked to the respective data products with fine temporal and spatial resolutions from various in-house sources. The "bundled" variable data can thus be readily downloaded through Mirador. The in-house Giovanni is accessible for users to acquire quick views of Level 3 (gridded) variables. For Level 2 (swath) or Giovanni-unavailable Level 3 data, the system provides a link to data recipes that give a how-to guide for reading and visualizing the data using offline tools such as Panoply, GrADS, or IDL.

  15. A Neuromorphic Event-Based Neural Recording System for Smart Brain-Machine-Interfaces.

    PubMed

    Corradi, Federico; Indiveri, Giacomo

    2015-10-01

    Neural recording systems are a central component of Brain-Machine Interfaces (BMIs). In most of these systems the emphasis is on faithful reproduction and transmission of the recorded signal to remote systems for further processing or data analysis. Here we follow an alternative approach: we propose a neural recording system that can be directly interfaced locally to neuromorphic spiking neural processing circuits for compressing the large amounts of data recorded, carrying out signal processing and neural computation to extract relevant information, and transmitting only the low-bandwidth outcome of the processing to remote computing or actuating modules. The fabricated system includes a low-noise amplifier, a delta-modulator analog-to-digital converter, and a low-power band-pass filter. The bio-amplifier has a programmable gain of 45-54 dB, with a Root Mean Squared (RMS) input-referred noise level of 2.1 μV, and consumes 90 μW. The band-pass filter and delta-modulator circuits include asynchronous handshaking interface logic compatible with event-based communication protocols. We describe the properties of the neural recording circuits, validating them with experimental measurements, and present system-level application examples, by interfacing these circuits to a reconfigurable neuromorphic processor comprising an array of spiking neurons with plastic and dynamic synapses. The pool of neurons within the neuromorphic processor was configured to implement a recurrent neural network, and to process the events generated by the neural recording system in order to carry out pattern recognition. PMID:26513801
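
    The delta-modulation principle named in the abstract can be sketched in a few lines of Python; the synthetic trace, threshold and event format below are illustrative, not the fabricated circuit's actual behaviour.

        import numpy as np

        def delta_modulate(signal, delta):
            """Event-based delta modulation: emit an UP/DOWN event each time
            the input moves delta above/below the internally tracked
            reconstruction level (only events are transmitted, not samples)."""
            level = signal[0]
            events = []                          # (sample_index, +1 or -1)
            for k, v in enumerate(signal):
                while v - level >= delta:
                    level += delta
                    events.append((k, +1))
                while level - v >= delta:
                    level -= delta
                    events.append((k, -1))
            return events

        # A synthetic extracellular trace: a spike produces a burst of events.
        t = np.arange(0, 0.01, 1e-5)                          # 10 ms at 100 kHz
        trace = 20e-6 * np.exp(-((t - 0.005) / 4e-4) ** 2)    # 20 uV spike-like bump
        events = delta_modulate(trace, delta=2e-6)
        print(len(events), events[:3])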

  16. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    NASA Astrophysics Data System (ADS)

    Andrés-Doménech, I.; Múnera, J. C.; Francés, F.; Marco, J. B.

    2010-05-01

    Since the Water Framework Directive (WFD) was passed in year 2000, the protection of water bodies in the EU must be understood in a completely different way. Regarding combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be accepted because of their intrinsic features, but must be assessed for their impact on the receiving water bodies in agreement with specific environmental aims. Consequently, both the urban system and the receiving one must be jointly analysed to evaluate this impact. In this context, a coupled scheme is presented in this paper to assess the CSO impact on a river system in Torrelavega (Spain). First, an urban model is developed to statistically characterise CSO frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with synthetic storms built to adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess the river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to the hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentration of the analysed pollutants (the biochemical oxygen demand, BOD, and the total ammonium, NH4+) in the river just after the spills.

  17. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    NASA Astrophysics Data System (ADS)

    Andrés-Doménech, I.; Múnera, J. C.; Francés, F.; Marco, J. B.

    2010-10-01

    Since the Water Framework Directive (WFD) was passed in year 2000, the conservation of water bodies in the EU must be understood in a completely different way. Regarding combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be accepted because of their intrinsic features; they must be assessed for their impact on the receiving water bodies in agreement with specific environmental aims. Consequently, both the urban system and the receiving water body must be jointly analysed to evaluate the environmental impact generated on the latter. In this context, a coupled scheme is presented in this paper to assess the CSO impact on a river system in Torrelavega (Spain). First, an urban model is developed to statistically characterise CSO frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with synthetic storms built to adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess the river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to the hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentration of the analysed pollutants (biochemical oxygen demand, BOD, and total ammonium, NH4+) within the river just after the spills.

  18. On the Relationship Between Effort Toward an Ongoing Task and Cue Detection in Event-Based Prospective Memory

    ERIC Educational Resources Information Center

    Marsh, Richard L.; Hicks, Jason L.; Cook, Gabriel I.

    2005-01-01

    In recent theories of event-based prospective memory, researchers have debated what degree of resources are necessary to identify a cue as related to a previously established intention. In order to simulate natural variations in attention, the authors manipulated effort toward an ongoing cognitive task in which intention-related cues were embedded…

  19. Modulation of a Fronto-Parietal Network in Event-Based Prospective Memory: An rTMS Study

    ERIC Educational Resources Information Center

    Bisiacchi, P. S.; Cona, G.; Schiff, S.; Basso, D.

    2011-01-01

    Event-based prospective memory (PM) is a multi-component process that requires remembering the delayed execution of an intended action in response to a pre-specified PM cue, while being actively engaged in an ongoing task. Some neuroimaging studies have suggested that both prefrontal and parietal areas are involved in the maintenance and…

  20. Comparison of event-based landslide inventory maps obtained interpreting satellite images and aerial photographs

    NASA Astrophysics Data System (ADS)

    Fiorucci, Federica; Cardinali, Mauro; Carlà, Roberto; Mondini, Alessandro; Santurri, Leonardo; Guzzetti, Fausto

    2010-05-01

    Landslide inventory maps are a common type of map used for geomorphological investigations, land planning, and hazard and risk assessment. Landslide inventory maps covering medium to large areas are obtained primarily by exploiting traditional geomorphological techniques. These techniques combine the visual and heuristic interpretation of stereoscopic aerial photographs with more or less extensive field investigations. Aerial photographs most commonly used to prepare landslide inventory maps range in scale from about 1:10,000 to about 1:40,000. Interpretation of satellite images is a relatively recent, powerful tool for obtaining information on the Earth surface potentially useful for the production of landslide inventory maps. The usefulness of satellite information - and the associated technology - for the identification of landslides and the production of landslide inventory maps remains largely unexplored. In this context, it is of interest to investigate the type, quantity, and quality of the information that can be retrieved by analyzing images taken by the latest generation of high and very-high resolution satellite sensors, and to compare this information with the information obtained from the analysis of traditional stereoscopic aerial photographs, or in the field. In the framework of the Italian Space Agency's MORFEO project for the exploitation of Earth Observation data and technology for landslide identification and risk assessment, we have compared two event-based landslide inventory maps prepared exploiting two different techniques. The two maps portray the geographical distribution and types of landslides triggered by rainfall in the period from November 2004 to May 2005 in the Collazzone area, Umbria, central Italy. The first map was prepared through reconnaissance field surveys carried out mostly along roads. The second map was obtained through the combined visual interpretation of 1:10,000 scale, colour ortho-photo maps, and images taken by the IKONOS

  1. Event-based approach of downstream Rhône River flood regimes variability since 1982

    NASA Astrophysics Data System (ADS)

    Hénaff, Quentin; Arnaud-Fassetta, Gilles; Beltrando, Gérard

    2015-04-01

    Numerous downstream Rhône River floods have been recorded as catastrophic by French inter-ministerial order since the creation of natural disaster state recognition in 1982. Downstream Rhône River flood regimes, influenced by the Mediterranean climate, are fundamentally affected by the spatio-temporal variability of rainfall events, especially in the case of widespread flooding. Event-based analysis of cumulative rainfall data should allow us to characterise the variability of downstream Rhône River flood regimes by applying data mining methods to a spatio-temporal hydro-meteorological database. The first objective of this study is to determine whether extreme rainfall events can be considered geographical events, in other words whether rainfall distribution is related to spatial processes. The proposed method is based on measuring the spatial auto-correlation of the rainfall distribution through the calculation of (i) the Global Moran's index and (ii) the significance of that index with a z-score statistical test and its associated p-value. Secondly, cumulative rainfall data are integrated into a geo-event two-dimensional matrix: (i) cumulative rainfall per sub-catchment in rows (spatial base unit) and (ii) cumulative rainfall per catastrophic event in columns (temporal base unit). This matrix was co-clustered, which allows simultaneous clustering of the rows (sub-catchments) and columns (events) by hierarchical clustering on principal components (HCPC) using Ward's method with Euclidean distance as the similarity measure. Computing the Global Moran's index demonstrated a spatial aggregation tendency of the rainfall distribution, and the associated statistical test (z-score and p-value) indicated that a random spatial pattern is improbable. The spatial variability of rainfall distribution is the result of two factors: rainfall event structure and rainfall event scale. The co-clustering geo-event matrix provided two co-clustering maps on two different cumulative rainfall
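
    For readers unfamiliar with the statistic, a minimal sketch of the global Moran's index computation described above follows; the sub-catchment values and neighbour structure are invented for illustration, and the z-score significance test is omitted.

        import numpy as np

        def morans_i(x, w):
            """Global Moran's I for values x and spatial weight matrix w
            (w[i, j] > 0 when sub-catchments i and j are neighbours)."""
            x = np.asarray(x, dtype=float)
            z = x - x.mean()                       # deviations from the mean
            num = (w * np.outer(z, z)).sum()       # weighted cross-products
            return len(x) / w.sum() * num / (z ** 2).sum()

        # Four sub-catchments on a line; rainfall clustered on one side.
        w = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        print(round(morans_i([90, 80, 20, 10], w), 2))   # positive: spatial clustering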

  2. Scalable Parallel Execution of an Event-based Radio Signal Propagation Model for Cluttered 3D Terrains

    SciTech Connect

    Seal, Sudip K; Perumalla, Kalyan S

    2009-01-01

    Radio signal strength estimation is essential in many applications, including the design of military radio communications and industrial wireless installations. While classical approaches such as finite difference methods are well-known, new event-based models of radio signal propagation have been recently shown to deliver such estimates faster (via serial execution) than other methods. For scenarios with large or richly-featured geographical volumes, however, parallel processing is required to meet the memory and computation time demands. Here, we present a scalable and efficient parallel execution of a recently-developed event-based radio signal propagation model. We demonstrate its scalability to thousands of processors, with parallel speedups over 1000x. The speed and scale achieved by our parallel execution enable larger scenarios and faster execution than has ever been reported before.

  3. Incorporating seasonality into event-based joint probability methods for predicting flood frequency: A hybrid causative event approach

    NASA Astrophysics Data System (ADS)

    Li, Jing; Thyer, Mark; Lambert, Martin; Kuczera, George; Metcalfe, Andrew

    2016-02-01

    Flood extremes are driven by highly variable and complex climatic and hydrological processes. Observational evidence has identified that seasonality of climate variables has a major impact on flood peaks. However, event-based joint probability approaches for predicting the flood frequency distribution (FFD), which are commonly used in practice, do not commonly incorporate climate seasonality. This study presents an advance in event-based joint probability approaches by incorporating seasonality using the hybrid causative events (HCE) approach. The HCE was chosen because it uses the true causative events of the floods of interest and is able to combine the accuracy of continuous simulation with the computational efficiency of event-based approaches. The incorporation of seasonality is evaluated using a virtual catchment approach at eight sites over a wide range of Australian climate zones, including tropical, temperate, Mediterranean and desert climates (virtual catchment data for the eight sites is freely available via digital repository). The seasonal HCE provided accurate predictions of the FFD at all sites. In contrast, the non-seasonal HCE significantly over-predicted the FFD at some sites. The need to include seasonality was influenced by the magnitude of the seasonal variation in soil moisture and its coherence with the seasonal variation in extreme rainfall. For sites with a low seasonal variation in soil moisture the non-seasonal HCE provided reliable estimates of the FFD. For the remaining sites, it was found difficult to predict a priori whether ignoring seasonality provided a reliable estimate of the FFD, hence it is recommended that the seasonal HCE always be used. The practical implications of this study are that the HCE approach with seasonality is an accurate and efficient event-based joint probability approach to derive the flood frequency distribution across a wide range of climatologies.

  4. Effective utilization of flue gases in raceway reactor with event-based pH control for microalgae culture.

    PubMed

    Pawlowski, A; Mendoza, J L; Guzmán, J L; Berenguel, M; Acién, F G; Dormido, S

    2014-10-01

    This work addresses the effective utilization of flue gases through proper pH control in raceway reactors. The pH control problem has been addressed with an event-based control approach using a Generalized Predictive Controller (GPC) with an actuator deadband. Applying this control strategy, it is possible to reduce the control effort while at the same time saving control resources. In the pH process case, the event-based controller with actuator deadband can be tuned to supply only the necessary amount of CO2 to keep the pH close to its optimal value. On the other hand, the evaluated control algorithm significantly improves the pH control accuracy, which has a direct influence on biomass production. In order to test the performance of the event-based GPC controller, several experiments have been performed on a real raceway reactor. Additionally, several control performance indexes have been used to compare the analyzed technique with a commonly used on/off controller. PMID:25113401
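
    To illustrate the actuator-deadband idea in isolation, the sketch below uses a simple PI law standing in for the paper's GPC; the gains, deadband and pH samples are invented.

        def ph_control(ph_meas, setpoint=8.0, kp=40.0, ki=2.0, deadband=3.0, dt=1.0):
            """Event-based actuator deadband: the CO2 valve command is changed
            only when the newly computed move differs from the applied one by
            more than the deadband, saving CO2 and valve commutations."""
            u_applied, integ = 0.0, 0.0
            applied_sequence = []
            for ph in ph_meas:
                err = ph - setpoint                 # pH above setpoint -> inject CO2
                integ += err * dt
                u = max(0.0, kp * err + ki * integ) # candidate CO2 flow command
                if abs(u - u_applied) > deadband:   # event: update the actuator
                    u_applied = u
                applied_sequence.append(u_applied)
            return applied_sequence

        seq = ph_control([8.4, 8.3, 8.25, 8.22, 8.2, 8.1, 8.05, 8.0])
        print(seq)   # the applied command changes on only 4 of the 8 samples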

  5. Breaking the millisecond barrier on SpiNNaker: implementing asynchronous event-based plastic models with microsecond resolution.

    PubMed

    Lagorce, Xavier; Stromatias, Evangelos; Galluppi, Francesco; Plana, Luis A; Liu, Shih-Chii; Furber, Steve B; Benosman, Ryad B

    2015-01-01

    Spike-based neuromorphic sensors such as retinas and cochleas change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms toward those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution in the order of microseconds. This high temporal resolution is a crucial factor in learning tasks. It is also widely used in the field of biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears which, in the barn owl, reaches a temporal resolution of 5 μs. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution in the order of milliseconds that is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented. PMID:26106288
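
    As a rough illustration of what a "completely asynchronous and event-based scheme with microsecond resolution" means, the plain-Python sketch below (not SpiNNaker code) advances a tiny network purely by popping timestamped spike events from a priority queue; the thresholds, delays, and network are invented for the example.

        # Hedged sketch of purely event-driven spiking with microsecond timestamps.
        import heapq

        THRESHOLD = 3          # hypothetical: input spikes needed to fire
        DELAY_US = 150         # hypothetical synaptic delay in microseconds
        fanout = {0: [1, 2], 1: [2], 2: []}   # tiny feed-forward network
        counts = {n: 0 for n in fanout}

        # The event queue holds (time_us, target_neuron); no global tick exists.
        events = [(0, 0), (40, 0), (90, 0)]   # three input spikes to neuron 0
        heapq.heapify(events)

        while events:
            t_us, n = heapq.heappop(events)
            counts[n] += 1
            if counts[n] >= THRESHOLD:        # neuron n fires
                counts[n] = 0
                print(f"neuron {n} spiked at t = {t_us} us")
                for m in fanout[n]:           # deliver spikes after the delay
                    heapq.heappush(events, (t_us + DELAY_US, m))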

  6. Breaking the millisecond barrier on SpiNNaker: implementing asynchronous event-based plastic models with microsecond resolution

    PubMed Central

    Lagorce, Xavier; Stromatias, Evangelos; Galluppi, Francesco; Plana, Luis A.; Liu, Shih-Chii; Furber, Steve B.; Benosman, Ryad B.

    2015-01-01

    Spike-based neuromorphic sensors such as retinas and cochleas change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms toward those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution in the order of microseconds. This high temporal resolution is a crucial factor in learning tasks. It is also widely used in the field of biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears which, in the barn owl, reaches a temporal resolution of 5 μs. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution in the order of milliseconds that is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented. PMID:26106288

  7. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    PubMed Central

    2011-01-01

    Background: No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods: A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results: Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments

  8. Substance use and other risk factors for unprotected sex: results from an event-based study of homeless youth.

    PubMed

    Tucker, Joan S; Ryan, Gery W; Golinelli, Daniela; Ewing, Brett; Wenzel, Suzanne L; Kennedy, David P; Green, Harold D; Zhou, Annie

    2012-08-01

    This study used an event-based approach to understand condom use in a probability sample of 309 homeless youth recruited from service and street sites in Los Angeles County. Condom use was significantly less likely when hard drug use preceded sex, the relationship was serious, the partners talked about "pulling out", or sex occurred in a non-private place (and marginally less likely when heavier drinking preceded sex, or the partnership was monogamous or abusive). Condom use was significantly more likely when the youth held positive condom attitudes or were concerned about pregnancy, the partners talked about condom use, and the partners met up by chance. This study extends previous work by simultaneously examining a broad range of individual, relationship, and contextual factors that may play a role in condom use. Results identify a number of actionable targets for programs aimed at reducing HIV/STI transmission and pregnancy risk among homeless youth. PMID:21932093

  9. Flexible readout and integration sensor (FRIS): a bio-inspired, system-on-chip, event-based readout architecture

    NASA Astrophysics Data System (ADS)

    Lin, Joseph H.; Pouliquen, Philippe O.; Andreou, Andreas G.; Goldberg, Arnold C.; Rizk, Charbel G.

    2012-06-01

    We present a bio-inspired system-on-chip focal plane readout architecture which, at the system level, relies on an event-based sampling scheme where only pixels within a programmable range of photon flux rates are output. At the pixel level, a one-bit oversampled analog-to-digital converter together with a decimator allows for the quantization of signals up to 26 bits. Furthermore, digital non-uniformity correction of both gain and offset errors is applied at the pixel level prior to readout. We report test results for a prototype array fabricated in a standard 90 nm CMOS process. Tests performed at room and cryogenic temperatures demonstrate the capability to operate at a temporal noise ratio as low as 1.5, an electron well capacity over 100 Ge-, and an ADC LSB down to 1 e-.
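
    The pixel-level scheme of a one-bit oversampled converter followed by a decimator can be illustrated with a first-order sigma-delta model. The sketch below is a generic textbook version, not the FRIS circuit; the input value and sample count are arbitrary.

        # First-order 1-bit sigma-delta modulator plus a counting decimator.
        def sigma_delta_decimate(x, n_samples):
            """Quantize a constant input x in [0, 1) with a 1-bit modulator,
            then decimate by counting ones over n_samples."""
            integrator, ones = 0.0, 0
            for _ in range(n_samples):
                integrator += x
                bit = 1 if integrator >= 1.0 else 0
                integrator -= bit              # 1-bit feedback DAC
                ones += bit
            return ones / n_samples            # decimated (averaged) estimate

        print(sigma_delta_decimate(0.37, 4096))   # ~0.37; resolution grows with n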

  10. Substance Use and Other Risk Factors for Unprotected Sex: Results From An Event-Based Study of Homeless Youth

    PubMed Central

    Tucker, Joan S.; Ryan, Gery W.; Golinelli, Daniela; Munjas, Brett; Wenzel, Suzanne L.; Kennedy, David P.; Green, Harold D.; Zhou, Annie

    2011-01-01

    This study used an event-based approach to understand condom use in a probability sample of 309 homeless youth recruited from service and street sites in Los Angeles County. Condom use was significantly less likely when hard drug use preceded sex, the relationship was serious, the partners talked about “pulling out”, or sex occurred in a non-private place (and marginally less likely when heavier drinking preceded sex, or the partnership was monogamous or abusive). Condom use was significantly more likely when the youth held positive condom attitudes or were concerned about pregnancy, the partners talked about condom use, and the partners met up by chance. This study extends previous work by simultaneously examining a broad range of individual, relationship, and contextual factors that may play a role in condom use. Results identify a number of actionable targets for programs aimed at reducing HIV/STI transmission and pregnancy risk among homeless youth. PMID:21932093

  11. Allowing Brief Delays in Responding Improves Event-Based Prospective Memory for Young Adults Living with HIV Disease

    PubMed Central

    Loft, Shayne; Doyle, Katie L.; Naar-King, Sylvie; Outlaw, Angulique Y.; Nichols, Sharon L.; Weber, Erica; Blackstone, Kaitlin; Woods, Steven Paul

    2014-01-01

    Event-based prospective memory (PM) tasks require individuals to remember to perform an action when they encounter a specific cue in the environment, and have clear relevance for daily functioning for individuals with HIV. In many everyday tasks, the individual must not only maintain the intent to perform the PM task, but the PM task response also competes with the alternative and more habitual task response. The current study examined whether event-based PM can be improved by slowing down the pace of the task environment. Fifty-seven young adults living with HIV performed an ongoing lexical decision task while simultaneously performing a PM task of monitoring for a specific word (which was focal to the ongoing task of making lexical decisions) or a syllable contained in a word (which was non-focal). Participants were instructed to refrain from making task responses until after a tone was presented, which occurred at varying onsets (0–1600 ms) after each stimulus appeared. Improvements in focal and non-focal PM accuracy were observed with response delays of 600 ms. Furthermore, the difference in PM accuracy between the low-demand focal PM task and the resource-demanding non-focal PM task was reduced by half across increasingly longer delays, falling from 31% at 0 ms delay to only 14% at 1600 ms delay. The degree of ongoing task response slowing for the PM conditions, relative to a control condition that did not have a PM task and made lexical decisions only, also decreased with increased delay. Overall, the evidence indicates that delaying the task responses of younger HIV-infected adults increased the probability that the PM-relevant features of task stimuli were adequately assessed prior to the ongoing task response, and by implication that younger HIV-infected adults can more adequately achieve PM goals when the pace of the task environment is slowed down. PMID:25116075

  12. Deficits in cue detection underlie event-based prospective memory impairment in major depression: an eye tracking study.

    PubMed

    Chen, Siyi; Zhou, Renlai; Cui, Hong; Chen, Xinyin

    2013-10-30

    This study examined cue detection in the non-focal event-based prospective memory (PM) of individuals with and without a major depressive disorder using behavioural and eye-tracking assessments. The participants were instructed to search on each trial for a different target stimulus that could be present or absent and to make prospective responses to the cue object. PM tasks included cue only and target plus cue, whereas ongoing tasks included target only and distracter only. The results showed that a) participants with depression performed more poorly than those without depression in PM; b) participants with depression showed more fixations and longer total and average fixation durations in both ongoing and PM conditions; and c) participants with depression had lower accuracy in target-plus-cue trials than in cue-only trials and had a higher gaze rate on targets for hits and misses in target-plus-cue trials than did those without depression. The results indicate that the state of depression may impair top-down cognitive control function, which in turn results in particular deficits in the engagement of monitoring for PM cues. PMID:23477903

  13. Infectious diseases prioritisation for event-based surveillance at the European Union level for the 2012 Olympic and Paralympic Games.

    PubMed

    Economopoulou, A; Kinross, P; Domanovic, D; Coulombier, D

    2014-01-01

    In 2012, London hosted the Olympic and Paralympic Games (the Games), with events occurring throughout the United Kingdom (UK) between 27 July and 9 September 2012. Public health surveillance was performed by the Health Protection Agency (HPA). Collaboration between the HPA and the European Centre for Disease Prevention and Control (ECDC) was established for the detection and assessment of significant infectious disease events (SIDEs) occurring outside the UK during the time of the Games. Additionally, ECDC undertook an internal prioritisation exercise to facilitate ECDC’s decisions on which SIDEs should have preferentially enhanced monitoring through epidemic intelligence activities for detection and reporting in daily surveillance in the European Union (EU). A team of ECDC experts evaluated potential public health risks to the Games, selecting and prioritising SIDEs for event-based surveillance with regard to their potential for importation to the Games, occurrence during the Games or export to the EU/European Economic Area from the Games. The team opted for a multilevel approach including comprehensive disease selection, development and use of a qualitative matrix scoring system and a Delphi method for disease prioritisation. The experts selected 71 infectious diseases to enter the prioritisation exercise of which 27 were considered as priority for epidemic intelligence activities by ECDC for the EU for the Games. PMID:24762663

  14. Evaluation of the Health Protection Event-Based Surveillance for the London 2012 Olympic and Paralympic Games.

    PubMed

    Severi, E; Kitching, A; Crook, P

    2014-01-01

    The Health Protection Agency (HPA) (currently Public Health England) implemented the Health Protection Event-Based Surveillance (EBS) to provide additional national epidemic intelligence for the 2012 London Olympic and Paralympic Games (the Games). We describe EBS and evaluate the system attributes. EBS aimed at identifying, assessing and reporting to the HPA Olympic Coordination Centre (OCC) possible national infectious disease threats that may significantly impact the Games. EBS reported events in England from 2 July to 12 September 2012. EBS sourced events from reports from local health protection units and from screening an electronic application 'HPZone Dashboard' (DB). During this period, 147 new events were reported to EBS, mostly food-borne and vaccine-preventable diseases: 79 from regional units, 144 from DB (76 from both). EBS reported 61 events to the OCC: 21 of these were reported onwards. EBS sensitivity was 95.2%; positive predictive value was 32.8%; reports were timely (median one day; 10th percentile: 0 days - same day; 90th percentile: 3.6 days); completeness was 99.7%; stability was 100%; EBS simplicity was assessed as good; the daily time per regional or national unit dedicated to EBS was approximately 4 hours (weekdays) and 3 hours (weekends). OCC directors judged EBS as efficient, fast and responsive. EBS provided reliable, reassuring, timely, simple and stable national epidemic intelligence for the Games. PMID:24970374

  15. Event-based minimum-time control of oscillatory neuron models: phase randomization, maximal spike rate increase, and desynchronization.

    PubMed

    Danzl, Per; Hespanha, João; Moehlis, Jeff

    2009-12-01

    We present an event-based feedback control method for randomizing the asymptotic phase of oscillatory neurons. Phase randomization is achieved by driving the neuron's state to its phaseless set, a point at which its phase is undefined and is extremely sensitive to background noise. We consider the biologically relevant case of a fixed magnitude constraint on the stimulus signal, and show how the control objective can be accomplished in minimum time. The control synthesis problem is addressed using the minimum-time-optimal Hamilton-Jacobi-Bellman framework, which is quite general and can be applied to any spiking neuron model in the conductance-based Hodgkin-Huxley formalism. We also use this methodology to compute a feedback control protocol for optimal spike rate increase. This framework provides a straightforward means of visualizing isochrons, without actually calculating them in the traditional way. Finally, we present an extension of the phase randomizing control scheme that is applied at the population level, to a network of globally coupled neurons that are firing in synchrony. The applied control signal desynchronizes the population in a demand-controlled way. PMID:19911192

  16. Event-based washload transport and sedimentation in and around flood bypasses: Case study from the Sacramento Valley, California

    NASA Astrophysics Data System (ADS)

    Singer, M. B.; Aalto, R. A.

    2005-05-01

    In large river systems, suspended sediment transport and deposition patterns are often affected by channel constraints engineered for flood conveyance or navigation. Such managed channels typically have a limited number of overflow loci through which suspended sediment enters the river's floodplain. Engineered flood bypasses are narrow relic floodplains that are supplied by overflow diversion weirs along managed river channels, and support agriculture and complex aquatic and riparian habitats that are sensitive to the delivery of floods, fine sediment, and adsorbed contaminants. They function as wide, shallow conveyance channels parallel to the main river, and therefore present an opportunity to assess the applicability of existing theory for delivery to and settling of suspended sediment within floodplains. This study is an investigation of hydrograph characteristics, sediment delivery, and sedimentation within the upstream reaches of flood bypasses closest to the weir. We present analysis of hydrologic and sediment records and modeling in the Sacramento River basin. The effects of a single large flood in 1964-1965 were analyzed by documenting hydrograph characteristics, computing event-based sediment discharges and reach erosion/deposition through the bypass system, modeling bypass deposition, and comparing modeled results near the weirs with dated sediment cores. The rapidly rising, slowly declining 1964 flood was generated by storm runoff in the Sierra Nevada. The modeling results indicate: washload discharge through the lower valley 0.5 to 1.7 times long-term annual averages; mainstem reach erosion/deposition 0.5 to 1.25 times annual averages; and centimeter scale deposition in flood bypasses. The results are corroborated by a set of sediment cores extracted from Sacramento Valley bypasses, which were dated with 210Pb geochronology and analyzed for grain size. The modeling and data suggest net sediment accumulation between the channel and flood weirs and in

  17. Investigating the variation and non-stationarity in precipitation extremes based on the concept of event-based extreme precipitation

    NASA Astrophysics Data System (ADS)

    She, Dunxian; Shao, Quanxi; Xia, Jun; Taylor, John A.; Zhang, Yongyong; Zhang, Liping; Zhang, Xiang; Zou, Lei

    2015-11-01

    Extreme precipitation (EP) can induce a series of social, environmental and ecological problems. Traditional EP analysis usually investigates EP characteristics on a fixed time scale and therefore ignores the continuity of EP occurrence. As a result, a comprehensive assessment of the influence and consequences of EP occurring over consecutive time periods is largely precluded. On the other hand, the characteristics of EP, including variables such as frequency, intensity and extreme volume, are commonly defined without sufficient consideration of the local tolerance capacity (which can be represented by a threshold level of EP) and can therefore be inappropriate. In this study, we propose a concept of event-based extreme precipitation (EEP) that considers the continuity of EP, and define statistical variables for the characteristics of an EEP event that take account of local tolerance capacity. An EEP is identified as a collection of precipitation data over a consecutive time period in which all precipitation amounts are above a pre-defined threshold; EEP events are separated by at least one time step (e.g., day or hour) with a precipitation amount below the threshold. As a case study, which in fact motivated our proposal, we investigated the changes and variations of EEP, with consideration of potential non-stationarity, in the Hanjiang River Basin of China (HJRB) during 1960-2013. Results showed that the concept of EEP, which reflects the impact of the continuity of EP occurrence and mirrors differences in local tolerance capacity, was more appropriate than the traditional method for EP analysis.
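
    The EEP definition above translates directly into a run-detection routine. The sketch below applies it to an invented daily series with a hypothetical threshold and reports duration, volume, and peak intensity per event.

        # Sketch of the EEP definition: consecutive steps above a threshold form
        # one event; a single below-threshold step separates events.
        def extract_eep_events(precip, threshold):
            events, current = [], []
            for p in precip:
                if p > threshold:
                    current.append(p)
                elif current:                  # a below-threshold step ends the event
                    events.append(current)
                    current = []
            if current:
                events.append(current)
            # Per-event characteristics: duration, total volume, peak intensity.
            return [(len(e), sum(e), max(e)) for e in events]

        daily_mm = [2, 55, 61, 48, 3, 0, 70, 52, 1, 90]   # made-up data
        print(extract_eep_events(daily_mm, threshold=50))
        # -> [(2, 116, 61), (2, 122, 70), (1, 90, 90)]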

  18. Continuous and event-based time series analysis of observed floodplain groundwater flow under contrasting land-use types.

    PubMed

    Kellner, Elliott; Hubbart, Jason A

    2016-10-01

    There is an ongoing need to improve quantitative understanding of land-use impacts on floodplain groundwater flow regimes. A study was implemented in Hinkson Creek Watershed, Missouri, USA, including equidistant grids of nine piezometers, equipped with pressure transducers, which were installed at two floodplain study sites: a remnant bottomland hardwood forest (BHF) and a historical agricultural field (Ag). Data were logged at thirty-minute intervals for the duration of the 2011, 2012, 2013, and 2014 water years (October 1, 2010-September 30, 2014). Results show significant (p<0.001) differences between Darcy-estimated groundwater flow at the two study sites. Although median flow values at the two sites were similar (0.009 and 0.010 m day-1 for the Ag and BHF sites, respectively), the BHF displayed a more dynamic flow regime compared to the Ag site. Maximum flow values were 0.020 and 0.049 m day-1 for the Ag and BHF sites, respectively. Minimum flow values were -0.018 and -0.029 m day-1 for the Ag and BHF sites, respectively. The BHF showed greater magnitude, longer duration, and more frequent negative flows, relative to the Ag site. Event-based analyses indicated a more seasonally responsive flow regime at the BHF, with higher flows than the Ag site during the wet season and lower flows than the Ag site during the dry season. Notably, the seasonal pattern of relative site flow differences was consistent across a wide range of precipitation event magnitudes (i.e., 8-45 mm). Results are by majority attributable to greater rates of plant water use by woody vegetation and preferential subsurface flow at the BHF site. Collectively, results suggest greater flood attenuation capacity and streamwater buffering potential by the BHF floodplain, relative to the Ag site, and highlight the value of floodplain forests as a land and water resource management tool. PMID:27232970
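
    The Darcy-estimated flows quoted above come from head gradients between piezometers. A hedged back-of-the-envelope example follows; the conductivity, heads, and spacing are all assumed, chosen only to land near the reported 0.009 m day-1 median.

        # Darcy flux between two piezometers (all values hypothetical).
        K = 0.5                  # saturated hydraulic conductivity, m/day (assumed)
        h1, h2 = 12.95, 12.05    # hydraulic heads at two piezometers, m (assumed)
        L = 50.0                 # piezometer spacing, m (assumed)

        q = -K * (h2 - h1) / L   # Darcy flux (specific discharge), m/day
        print(f"q = {q:.4f} m/day")   # positive: flow from piezometer 1 toward 2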

  19. Assessing the impact of utilizing event based water erosion versus static average water erosion (RUSLE) in carbon turnover modeling

    NASA Astrophysics Data System (ADS)

    Wilken, Florian; Fiener, Peter

    2015-04-01

    The discrepancy between the time scales at which soil redistribution processes and SOC turnover occur is an unresolved issue in erosion-related carbon turnover modeling. The use of a static average erosion rate (e.g., the revised universal soil loss equation, RUSLE) ignores the event-dynamic processes of (i) SOC enrichment during erosion, transport and deposition, (ii) event-specific C release to the atmosphere during erosion processes, and (iii) event-specific depth of SOC burial. We hypothesize that event-driven SOC enrichment and SOC burial are of fundamental importance for inter-annual carbon turnover. The study was carried out in an arable watershed (3.7 ha) with no-till management located in the loess-dominated Tertiary hills 40 km north of Munich, Germany. To assess the importance of event-dynamic SOC redistribution processes, we implemented two different water erosion modelling approaches in the coupled erosion and turnover model SPEROS-C. The first, RUSLE-based approach, as already implemented in SPEROS-C, represents long-term mean erosion, while the second is based on the high-resolution, event-based and sediment-size-class-selective Multi-Class Sediment Transport model (MCST). In both cases bulk sediment delivery, and in the case of MCST sediment-size-specific delivery, is tested and partly calibrated against an eight-year monitoring data set. First results indicate that SOC enrichment during erosion, transport and deposition, in particular, should be included when estimating the effects of soil redistribution processes on watershed C balances. The modelling with MCST also indicates that the interpretation of SOC patterns in eroding landscapes might be biased if the selective nature of SOC erosion and deposition is ignored.

  20. Time-Based and Event-Based Prospective Memory in Autism Spectrum Disorder: The Roles of Executive Function and Theory of Mind, and Time-Estimation

    ERIC Educational Resources Information Center

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-01-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21…

  1. The Cognitive Processes Underlying Event-Based Prospective Memory in School-Age Children and Young Adults: A Formal Model-Based Study

    ERIC Educational Resources Information Center

    Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…

  2. An event-based approach to understanding the hydrological impacts of different land uses in semi-arid catchments

    NASA Astrophysics Data System (ADS)

    Wang, Shengping; Zhang, Zhiqiang; McVicar, Tim R.; Zhang, Jianjun; Zhu, Jinzhao; Guo, Junting

    2012-01-01

    In semi-arid catchments around the world re-vegetation is often implemented to reduce quick surface runoff, combat severe soil erosion, restore degraded ecosystem functionality, and, ultimately, improve ecosystem productivity. However, to date, in these water-stressed regions, the event-based hydrological impact of the different land uses induced by re-vegetation activities is not fully understood at the watershed scale. Traditional hillslope plot experiments and paired watershed experiments have proved difficult to up-scale to the watershed level. In 2006 and 2007, we used broad-crested weirs to measure event streamflow from six catchments within the Caijiachuan watershed (area = 40.1 km2), located in the Loess Plateau, a semi-arid region in China. The six catchments have different land use compositions with functional combinations of crop, grassland, shrubland, secondary forest, and plantations. Over the same period, event rainfall was measured by a network of rainfall gauges deployed over the study site. We examined the differences in hydrological properties between the catchments using the non-parametric Friedman test, and differentiated the role of each land use in governing watershed hydrology using a numerical analysis technique. Our results showed important differences between the six catchments with respect to event runoff coefficients, normalized peak flow, and event duration. Each land use played a different role in catchment hydrology, as shown by the different mean runoff coefficients (rc) and mean representative surface flow velocities (V). Compared to secondary forest (rc = 0.017 and V = 0.07 m s-1), plantations (rc = 0.001 and V = 0.18 m s-1) provide a greater potential for increasing shearing force and had a larger impact on runoff reduction. Although shrubland (rc = 0.096 and V = 0.20 m s-1) and grassland (rc = 0.127 and V = 0.02 m s-1) have similar magnitudes of mean runoff coefficients, grassland seems more capable of trapping sediment

  3. Time-based and event-based prospective memory in autism spectrum disorder: the roles of executive function and theory of mind, and time-estimation.

    PubMed

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-07-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21 intellectually high-functioning children with ASD, and 21 age- and IQ-matched neurotypical comparison children. We found impaired time-based, but undiminished event-based, prospective memory among children with ASD. In the ASD group, time-based prospective memory performance was associated significantly with diminished theory of mind, but not with diminished cognitive flexibility. There was no evidence that time-estimation ability contributed to time-based prospective memory impairment in ASD. PMID:23179340

  4. Influence of intra-event-based flood regime on sediment flow behavior from a typical agro-catchment of the Chinese Loess Plateau

    NASA Astrophysics Data System (ADS)

    Zhang, Le-Tao; Li, Zhan-Bin; Wang, He; Xiao, Jun-Bo

    2016-07-01

    The pluvial erosion process is significantly affected by the spatio-temporal patterns of flood flows. However, despite their importance, only a few studies have investigated the sediment flow behavior driven by different flood regimes. This study aims to investigate the effect of intra-event-based flood regimes on the dynamics of sediment export at Tuanshangou catchment, a typical unmanaged agricultural catchment in the hilly loess region of the Chinese Loess Plateau. Measurements of 193 flood events and 158 sediment-producing events were collected from Tuanshangou station between 1961 and 1969. The combined methods of a hierarchical clustering approach, discriminant analysis, and one-way ANOVA were used to classify the flood events in terms of their event-based flood characteristics, including flood duration, peak discharge, and event flood runoff depth. The 193 flood events were classified into five regimes, and the mean statistical features of each regime differed significantly. Regime A includes flood events with the shortest duration (76 min), minimum flood crest (0.045 m3 s-1), smallest runoff depth (0.2 mm), and highest frequency. Regime B includes flood events with a medium duration (274 min), medium flood crest (0.206 m3 s-1), and minor runoff depth (0.7 mm). Regime C includes flood events with the longest duration (822 min), medium flood crest (0.236 m3 s-1), and medium runoff depth (1.7 mm). Regime D includes flood events with a medium duration (239 min), large flood crest (4.21 m3 s-1), and large runoff depth (10 mm). Regime E includes flood events with a medium duration (304 min), maximum flood crest (8.62 m3 s-1), and the largest runoff depth (25.9 mm). The sediment yield by the different flood regimes is ranked as follows: Regime E > Regime D > Regime B > Regime C > Regime A. In terms of event-based average and maximum suspended sediment concentration, these regimes are ordered as follows: Regime E > Regime D > Regime C > Regime B > Regime A. Regimes D and E
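
    The classification step, hierarchical clustering of events on their duration, flood crest, and runoff depth, can be sketched as follows. The six events and the choice of Ward linkage with three clusters are illustrative stand-ins for the authors' actual procedure.

        # Hierarchical clustering of flood events on standardized characteristics.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # columns: duration (min), peak discharge (m3/s), runoff depth (mm)
        events = np.array([
            [76, 0.05, 0.2], [80, 0.04, 0.3], [274, 0.21, 0.7],
            [822, 0.24, 1.7], [239, 4.2, 10.0], [304, 8.6, 25.9],
        ])
        z = (events - events.mean(axis=0)) / events.std(axis=0)  # standardize
        labels = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
        print(labels)   # regime label per event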

  5. Issues for Simulation of Galactic Cosmic Ray Exposures for Radiobiological Research at Ground-Based Accelerators

    PubMed Central

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis A.

    2015-01-01

    For radiobiology research on the health risks of galactic cosmic rays (GCR), ground-based accelerators have been used with mono-energetic beams of single high charge, Z, and energy, E (HZE) particles. In this paper, we consider the pros and cons of a GCR reference field at a particle accelerator. At the NASA Space Radiation Laboratory (NSRL), we have proposed a GCR simulator, which implements a new rapid switching mode and higher-energy beam extraction to 1.5 GeV/u, in order to integrate multiple ions into a single simulation within hours or longer for chronic exposures. After considering the GCR environment and energy limitations of NSRL, we performed extensive simulation studies using the stochastic transport code, GERMcode (GCR Event Risk Model), to define a GCR reference field using 9 HZE particle beam–energy combinations, each with a unique absorber thickness to provide fragmentation, and 10 or more energies of proton and 4He beams. The reference field is shown to represent well the charge dependence of GCR dose in several energy bins behind shielding compared to a simulated GCR environment. However, a more significant challenge for space radiobiology research is to consider chronic GCR exposure of up to 3 years in relation to simulations with animal models of human risks. We discuss issues in approaches to map important biological time scales in experimental models using ground-based simulation, with extended exposure of up to a few weeks using chronic or fractionated exposures. A kinetics model of HZE particle hit probabilities suggests that experimental simulations of several weeks will be needed to avoid high fluence rate artifacts, which places limitations on the experiments to be performed. Ultimately, risk estimates are limited by theoretical understanding, and a focus on improving knowledge of mechanisms and development of experimental models to improve this understanding should remain the highest priority for space radiobiology research. PMID:26090339
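
    The hit-probability argument can be made concrete with the standard Poisson model: for fluence F and target area A, the chance of k hits is P(k) = (FA)^k e^(-FA) / k!. A quick numerical example follows, with both the fluence and the nucleus area assumed round numbers, not values from the paper.

        # Poisson hit statistics for a cell nucleus at a given particle fluence.
        from math import exp, factorial

        fluence = 0.002        # particles per um^2 (assumed)
        area = 100.0           # nucleus cross-sectional area, um^2 (assumed)
        m = fluence * area     # Poisson mean number of hits = 0.2

        for k in range(4):
            p = m**k * exp(-m) / factorial(k)
            print(f"P({k} hits) = {p:.4f}")
        # P(0) ~ 0.82: most nuclei receive no hit at all at this fluence, which is
        # why week-scale exposures are needed to avoid high fluence rate artifacts.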

  6. Issues for Simulation of Galactic Cosmic Ray Exposures for Radiobiological Research at Ground-Based Accelerators.

    PubMed

    Kim, Myung-Hee Y; Rusek, Adam; Cucinotta, Francis A

    2015-01-01

    For radiobiology research on the health risks of galactic cosmic rays (GCR), ground-based accelerators have been used with mono-energetic beams of single high charge, Z, and energy, E (HZE) particles. In this paper, we consider the pros and cons of a GCR reference field at a particle accelerator. At the NASA Space Radiation Laboratory (NSRL), we have proposed a GCR simulator, which implements a new rapid switching mode and higher-energy beam extraction to 1.5 GeV/u, in order to integrate multiple ions into a single simulation within hours or longer for chronic exposures. After considering the GCR environment and energy limitations of NSRL, we performed extensive simulation studies using the stochastic transport code, GERMcode (GCR Event Risk Model), to define a GCR reference field using 9 HZE particle beam-energy combinations, each with a unique absorber thickness to provide fragmentation, and 10 or more energies of proton and (4)He beams. The reference field is shown to represent well the charge dependence of GCR dose in several energy bins behind shielding compared to a simulated GCR environment. However, a more significant challenge for space radiobiology research is to consider chronic GCR exposure of up to 3 years in relation to simulations with animal models of human risks. We discuss issues in approaches to map important biological time scales in experimental models using ground-based simulation, with extended exposure of up to a few weeks using chronic or fractionated exposures. A kinetics model of HZE particle hit probabilities suggests that experimental simulations of several weeks will be needed to avoid high fluence rate artifacts, which places limitations on the experiments to be performed. Ultimately, risk estimates are limited by theoretical understanding, and a focus on improving knowledge of mechanisms and development of experimental models to improve this understanding should remain the highest priority for space radiobiology research. PMID:26090339

  7. Event-Based Runoff Across Changing Land Covers in the Panama Canal Watershed: A Synthesis of Hydrophysical Measurements and Hydrochemical Tracers Using Hydrograph Separation

    NASA Astrophysics Data System (ADS)

    Litt, G.; Gardner, C.; Ogden, F. L.; Lyons, W. B.

    2014-12-01

    Tropical hydrology is understudied relative to its temperate counterparts and thus presents challenges for understanding runoff behavior in catchments undergoing land use change. Combining hydrometric and hydrochemical observations can shed light on potential differences in runoff processes under changing land covers. We compare event-based dual-member hydrograph separations across humid tropical lowland forest (142 ha), mixed land use (176 ha) and pasture (36 ha) catchments following two years of monitoring during the seasonal dry-to-wet-season transition. Stable water isotope and electrical conductivity tracer estimates of the event water fraction agree well during small runoff events, but differ during a large runoff event with a greater runoff coefficient. Geochemical tracers exhibit event water fraction maxima during hydrograph recessions and a seasonal transition in runoff behavior among all land uses. From these results we identify potential runoff mechanisms in these steep humid tropical catchments under varying land uses.
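
    A dual-member separation reduces to a two-end-member mass balance. The sketch below shows the standard formula with invented electrical conductivity values, not the study's data.

        # Two-component hydrograph separation via an EC mass balance:
        # Q_t*C_t = Q_e*C_e + Q_p*C_p with Q_t = Q_e + Q_p.
        def event_water_fraction(c_stream, c_pre, c_event):
            """Fraction of streamflow that is event (new) water."""
            return (c_stream - c_pre) / (c_event - c_pre)

        c_pre = 180.0    # pre-event (baseflow) EC, uS/cm (assumed)
        c_event = 25.0   # event (rain) water EC, uS/cm (assumed)
        c_stream = 120.0 # observed storm-peak stream EC, uS/cm (assumed)
        f = event_water_fraction(c_stream, c_pre, c_event)
        print(f"event water fraction = {f:.2f}")   # -> 0.39 in this example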

  8. Preserved Intention Maintenance and Impaired Execution of Prospective Memory Responses in Schizophrenia: Evidence from an Event-based Prospective Memory Study

    PubMed Central

    Demeter, Gyula; Szendi, István; Domján, Nóra; Juhász, Marianna; Greminger, Nóra; Szőllősi, Ágnes; Racsmány, Mihály

    2016-01-01

    Executive system dysfunction and impaired prospective memory (PM) are widely documented in schizophrenia. However, it is not yet clarified which components of PM function are impaired in this disorder. Two plausible target components are the maintenance of delayed intentions and the execution of PM responses. Furthermore, it is debated whether the impaired performance on frequently used executive tasks is associated with deficits in PM functions. The aim of our study was twofold. First, we aimed to investigate the specific processes involved in event-based PM function, focusing mainly on the difference between maintenance of intention and execution of PM responses. Second, we aimed to unfold the possible connections between executive functions, clinical symptoms, and PM performance. An event-based PM paradigm was applied with three main conditions: baseline (with no expectation of PM stimuli, and without PM stimuli), expectation condition (participants were told that PM stimuli might occur, though none actually did), and execution condition (participants were told that PM stimuli might occur, and PM stimuli did occur). This procedure allowed us to separately investigate performances associated with intention maintenance and execution of PM responses. We assessed working memory and set-shifting executive functions by memory span tasks and by the Wisconsin Card Sorting Test (WCST), respectively. Twenty patients diagnosed with schizophrenia and 20 healthy control subjects (matched according to age and education) took part in the study. It was hypothesized that patients would manifest different levels of performance in the expectation and execution conditions of the PM task. Our results confirmed that the difference between baseline performance and performance in the execution condition (execution cost) was significantly larger for participants diagnosed with schizophrenia in comparison with the matched healthy control group. However, this difference was not observed in the

  9. Task Importance Affects Event-based Prospective Memory Performance in Adults with HIV-Associated Neurocognitive Disorders and HIV-infected Young Adults with Problematic Substance Use

    PubMed Central

    Woods, Steven Paul; Doyle, Katie L.; Morgan, Erin E.; Naar-King, Sylvie; Outlaw, Angulique Y.; Nichols, Sharon L.; Loft, Shayne

    2014-01-01

    Objective: Two experiments were conducted to examine the effects of task importance on event-based prospective memory (PM) in separate samples of adults with HIV-associated Neurocognitive Disorders (HAND) and HIV-infected young adults with Substance Use Disorders (SUD). Method: All participants completed three conditions of an ongoing lexical decision task: 1) without PM task requirements; 2) with PM task requirements that emphasized the importance of the ongoing task; and 3) with PM task requirements that emphasized the importance of the PM task. Results: In both experiments, all HIV+ groups showed the expected increase in response costs to the ongoing task when the PM task’s importance was emphasized. In Experiment 1, individuals with HAND showed significantly lower PM accuracy as compared to HIV+ subjects without HAND when the importance of the ongoing task was emphasized, but improved significantly and no longer differed from HIV+ subjects without HAND when the PM task was emphasized. A similar pattern of findings emerged in Experiment 2, whereby HIV+ young adults with SUD (especially cannabis) showed significant improvements in PM accuracy when the PM task was emphasized. Conclusions: Findings suggest that both HAND and SUD may increase the amount of cognitive attentional resources that need to be allocated to support PM performance in persons living with HIV infection. PMID:24834469

  10. Comment on "Event-based soil loss models for construction sites" by Trenouth and Gharabaghi, J. Hydrol. doi: 10.1016/j.jhydrol.2015.03.010

    NASA Astrophysics Data System (ADS)

    Kinnell, P. I. A.

    2015-09-01

    Trenouth and Gharabaghi (2015) present two models which replace the EI30 index, used as the event erosivity index in the USLE/RUSLE, with ones that include runoff and values of EI30 raised to powers that differ from 1.0 as the event erosivity factor in modelling soil loss for construction sites. Their analysis of the application of these models focused on data from 5 locations as a whole but did not show how the models worked at each location. Practically, the ability to predict sediment yields at a specific location is more relevant than the capacity of a model to predict sediment yields globally. Also, the mathematical structure of their proposed models shows little regard for the physical processes involved in causing erosion and sediment yield. There is still a need to develop event-based empirical models for construction sites that are robust because they give proper consideration to the erosion processes involved, and take account of the fact that sediment yield is usually determined from measurements of suspended load, whereas soil loss at the scale for which the USLE/RUSLE model was developed includes both suspended load and bed load.
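
    For readers unfamiliar with the EI30 index under debate, a minimal worked computation is sketched below, using the common Brown-Foster unit-energy relation e = 0.29(1 - 0.72 exp(-0.05 i)) MJ/ha/mm and a made-up storm; the comment's proposed runoff-augmented indices are not reproduced here.

        # Illustrative EI30 computation: storm kinetic energy times max 30-min intensity.
        from math import exp

        # (duration_min, intensity_mm_per_h) for consecutive storm increments (made up)
        increments = [(15, 12.0), (30, 40.0), (15, 8.0)]

        energy = 0.0
        for minutes, i in increments:
            depth = i * minutes / 60.0                    # rainfall depth, mm
            e_unit = 0.29 * (1.0 - 0.72 * exp(-0.05 * i)) # MJ per ha per mm
            energy += e_unit * depth                      # storm energy, MJ/ha

        i30_max = 40.0   # maximum 30-min intensity, mm/h (the middle burst)
        print(f"EI30 = {energy * i30_max:.1f} MJ mm / (ha h)")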

  11. Comparison of event-based analysis of glaucoma progression assessed subjectively on visual fields and retinal nerve fibre layer attenuation measured by optical coherence tomography.

    PubMed

    Kaushik, Sushmita; Mulkutkar, Samyak; Pandav, Surinder Singh; Verma, Neelam; Gupta, Amod

    2015-02-01

    The purpose is to study the ability of an event-based analysis of retinal nerve fibre layer (RNFL) attenuation, measured by Stratus® optical coherence tomography (OCT), to detect progression across the spectrum of glaucoma. Adult glaucoma suspects, ocular hypertensives and glaucoma patients who had undergone baseline RNFL thickness measurement on Stratus OCT and reliable automated visual field examination by the Humphrey visual field analyser prior to March 2007, and who had 5-year follow-up data, were recruited. Progression on OCT was defined by two criteria: a decrease in average RNFL thickness from baseline by at least 10 and 20 µm, respectively. Visual field progression was defined by the modified Hodapp-Parrish-Anderson criteria. Absolute and percentage change in RNFL thickness from baseline was compared in progressors and non-progressors on visual fields. Concordance between structural and functional progression was analysed. 318 eyes of 162 patients were analysed. 35 eyes (11%) progressed by visual fields, 8 (2.5%) progressed using the 20 µm loss criterion, while 30 eyes (9.4%) progressed using the 10 µm loss criterion. In glaucoma suspects, mean absolute RNFL attenuation was 8.6 µm (12.1% of baseline) in those who progressed to glaucoma by visual fields. OCT was more useful for detecting progression in early glaucoma, but performed poorly in advanced glaucoma. The 10 µm criterion appears to be closer to visual field progression. However, the ability to detect progression varies considerably between functional and structural tools depending upon the severity of the disease. PMID:25502985

  12. Developing a Framework for Testing Distributed Hydrologic Models at the Catchment Scale - Examples of Test Model Runs for Event Based and Long Term Simulations

    NASA Astrophysics Data System (ADS)

    Cristea, N. C.; Kampf, S. K.; Mirus, B. B.; Loague, K.; Burges, S. J.

    2008-12-01

    We develop a testing framework for distributed hydrologic models at the catchment scale using the hypothetical-reality concept. The hypothetical reality, considered an error-free hydrologic response modeled after the 10.5-ha Tarrawarra catchment in Australia, is generated using the sophisticated Integrated Hydrology Model (InHM), representing fully coupled 3D variably saturated subsurface flow and 2D surface flow with finite-element discretization. The hypothetical reality consists of a data set composed of two subsets designed to test the long-term and event-based behavior of the test model. The first subset, a long-term data set, comprises an 11-year variable-time-step time series of hydrograph output and head and saturation levels at 55 observation nodes, as well as daily snapshots of head and saturation levels at all nodes in the domain. The second subset, a short-term data set, has the same type of output, but the domain snapshots are at a much finer time scale, every half hour for the selected rain events. Both data sets were obtained with the same InHM configuration but different output time steps. We use MODHMS (HydroGeoLogic, Inc.), a MODFLOW-based code that solves the Richards equation for 3D variably saturated subsurface flow and the diffusive-wave approximation for 2D overland flow with finite differences in a coupled approach, as the test model. We present examples of model testing scenarios with variations in spatial discretization, initial conditions, and representation of hydrologic processes. Future work includes testing of model parameters and soil characteristics, with the test model running first without any calibration and then with calibration against the hypothetical reality.

  13. Flood-event based metal distribution patterns in water as approach for source apportionment of pollution on catchment scale: Examples from the River Elbe

    NASA Astrophysics Data System (ADS)

    Baborowski, Martina; Einax, Jürgen W.

    2016-04-01

    With the implementation of the European Water Framework Directive (EU-WFD), the pollution sources in the River Elbe were assessed by the River Basin Community Elbe (RBC Elbe). Contaminated old sediments played the most significant role for inorganic and organic pollution. In terms of further improvement of the water quality in the river system, a prioritization of the known pollution sources is necessary with respect to the expected effect of their remediation. This requires information on the mobility of contaminated sediments. To create a tool that allows the assessment of pollution trends in the catchment area, event-based flood investigations were carried out at a sampling site in the Middle Elbe. The investigations were based on a comparable, discharge-related sampling strategy. Four campaigns were performed between 1995 and 2006. The majority (>80%) of the 16 elements studied more intensively in 2006 reached their maximum concentrations during the first five days of the event. Only the concentrations of B, Cl-, and U declined with increasing discharge during the flood. The aim of the study was to verify that each flood event is characterized by an internal structure of the water quality. This structure is formed by the appearance of maximum values of water quality parameters at different times during the event. It could be detected by descriptive and multivariate statistical methods. As a result, the internal structure of the water quality during the flood was influenced primarily by the source of the metals in the catchment area and its distance from the sampling point. The transport of metals in dissolved, colloidal or particulate form, and changes of their ratios during the flood, were, however, not decisive for the formation of the structure. Our results show that the comparison of the structures obtained from events in different years is indicative of the pollution trend in the catchment area. As an example, the trend of the metal pollution in the

  14. Modeller subjectivity and calibration impacts on hydrological model applications: an event-based comparison for a road-adjacent catchment in south-east Norway.

    PubMed

    Kalantari, Zahra; Lyon, Steve W; Jansson, Per-Erik; Stolte, Jannes; French, Helen K; Folkeson, Lennart; Sassner, Mona

    2015-01-01

    Identifying a 'best' performing hydrologic model in a practical sense is difficult due to the potential influences of modeller subjectivity on, for example, calibration procedure and parameter selection. This is especially true for model applications at the event scale, where the prevailing catchment conditions can have a strong impact on apparent model performance and suitability. In this study, two lumped models (CoupModel and HBV) and two physically based distributed models (LISEM and MIKE SHE) were applied to a small catchment upstream of a road in south-eastern Norway. All models were calibrated to a single event representing typical winter conditions in the region and then applied to various other winter events to investigate the potential impact of calibration period and methodology on model performance. Peak flow and event-based hydrographs were simulated differently by all models, leading to differences in apparent model performance under this application. In this case study, the lumped models appeared better suited for hydrological events that differed from the calibration event (i.e., events when runoff was generated from rain on non-frozen soils rather than from rain and snowmelt on frozen soil), while the more physically based approaches appeared better suited during snowmelt and frozen soil conditions more consistent with the event-specific calibration. This was due to the combination of variations in subsurface conditions over the eight events considered, the subsequent ability of the models to represent the impact of the conditions (particularly when subsurface conditions varied greatly from the calibration event), and the different approaches adopted to calibrate the models. These results indicate that hydrologic models may not only need to be selected on a case-by-case basis but also have their performance evaluated on an application-by-application basis, since how a model is applied can be as important as inherent model structure. PMID

  15. Event-based Recession Analysis across Scales

    NASA Astrophysics Data System (ADS)

    Chen, B.; Krajewski, W. F.

    2012-12-01

    Hydrograph recessions have long been a window through which to investigate hydrological processes and their interactions. The authors conducted an exploratory analysis of about 1000 individual hydrograph recessions over a period of around 15 years (1995-2010) from time series of hourly discharge (USGS IDA stream flow data set) at 27 USGS gauges located in the Iowa and Cedar River basins, with drainage areas ranging from 6.7 to around 17000 km2. They calculated recession exponents with the same recession length but different time lags from the hydrograph peak, ranging from ~0 to 96 hours, and then plotted them against time lag to construct the evolution of the recession exponent. The result shows that, as recession continues, the recession exponent first increases quickly, then decreases quickly, and finally stays constant. Occasionally, and for different reasons, the decreasing portion is missing due to a negligible contribution from soil water storage. The increasing part of the evolution can be related to fast responses to rainfall, including overland flow and quick subsurface flow through macropores (or tiles), and the decreasing portion can be connected to the delayed soil water response. Lastly, the constant segment can be attributed to groundwater storage with the slowest response. The points where the recession exponent reaches its maximum and begins to plateau are the times at which the fast response and the soil water response end, respectively. The authors conducted further theoretical analysis, combining mathematical derivation and literature results, to explain the observed evolution path of the recession exponent. Their results have a direct application in hydrograph separation and important implications for dynamic basin storage-discharge relation analysis and hydrological process understanding across scales.
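
    A recession exponent of the kind analyzed here is conventionally the b in -dQ/dt = aQ^b. The sketch below fits it by log-log regression on a synthetic recession limb generated with b = 2, as a sanity check of the method rather than a reproduction of the study.

        # Estimate the recession exponent b from an hourly recession limb.
        import numpy as np

        q = [100.0]
        for _ in range(95):                     # synthetic recession, dt = 1 h
            q.append(q[-1] - 1e-3 * q[-1] ** 2) # -dQ/dt = a*Q^b with a=1e-3, b=2
        q = np.array(q)

        dqdt = -(q[1:] - q[:-1])                # finite-difference -dQ/dt
        qm = 0.5 * (q[1:] + q[:-1])             # midpoint discharge
        b, log_a = np.polyfit(np.log(qm), np.log(dqdt), 1)
        print(f"fitted b = {b:.2f}")            # ~2, recovering the true exponent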

  16. GCR-Induced Photon Luminescence of the Moon

    NASA Technical Reports Server (NTRS)

    Lee, K. T.; Wilson, T. L.

    2008-01-01

    It is shown that the Moon has a ubiquitous photon luminescence induced by Galactic cosmic rays (GCRs), using the Monte Carlo particle-physics program FLUKA. Both the fluence and the flux of the radiation can be determined by this method, but only the fluence will be presented here. This is in addition to thermal radiation emitted due to the Moon's internal temperature and radioactivity. This study is a follow-up to an earlier discussion [1] that addressed several misconceptions regarding Moonshine in the Earth-Moon system (Figure 1) and predicted this effect. There also exists a related x-ray fluorescence induced by solar energetic particles (SEPs, <350 MeV) and solar photons at lower x-ray energies, although this latter fluorescence was studied on Apollo 15 and 16 [2-5], Lunar Prospector [6], and even EGRET [7].

  17. Production of neutrons from interactions of GCR-like particles

    NASA Technical Reports Server (NTRS)

    Heilbronn, L.; Frankel, K.; Holabird, K.; Zeitlin, C.; McMahan, M. A.; Rathbun, W.; Cronqvist, M.; Gong, W.; Madey, R.; Htun, M.; Elaasar, M.; Anderson, B. D.; Baldwin, A. R.; Jiang, J.; Keane, D.; Scott, A.; Shao, Y.; Watson, J. W.; Zhang, W. M.; Galonsky, A.; Ronningen, R.; Zecher, P.; Kruse, J.; Wang, J.; Miller, J. (Principal Investigator)

    1998-01-01

    In order to help assess the risk to astronauts due to long-term exposure to the natural radiation environment in space, an understanding of how the primary radiation field is changed when passing through shielding and tissue materials must be obtained. One important aspect of the change in the primary radiation field after passing through shielding materials is the production of secondary particles from the breakup of the primary. Neutrons are an important component of the secondary particle field due to their relatively high biological weighting factors and their relative abundance, especially behind thick shielding. Because of the complexity of the problem, the estimation of the risk from exposure to the secondary neutron field must be handled using calculational techniques. However, those calculations will need an extensive set of neutron cross section and thick-target neutron yield data in order to make an accurate assessment of the risk. In this paper we briefly survey the existing neutron-production data sets that are applicable to the space radiation transport problem, and we point out how neutron production from protons differs from neutron production from heavy ions. We also make comparisons of one of the heavy-ion data sets with Boltzmann-Uehling-Uhlenbeck (BUU) calculations.

  18. Characterizing intra- and inter-annual variability of storm events based on very-high-frequency monitoring of hydrological and chemical variables: what can we learn about hot spots and hot moments from continuous hydro-chemical sensors?

    NASA Astrophysics Data System (ADS)

    Fovet, O.; Thelusma, G.; Humbert, G.; Dupas, R.; Jaffrezic, A.; Grimaldi, C.; Faucheux, M.; Gilliet, N.; Hamon, Y.; Gruau, G.

    2015-12-01

    storm events based on the descriptors.

  19. Event-based stormwater management pond runoff temperature model

    NASA Astrophysics Data System (ADS)

    Sabouri, F.; Gharabaghi, B.; Sattar, A. M. A.; Thompson, A. M.

    2016-09-01

    Stormwater management wet ponds are generally very shallow and hence can significantly increase (about 5.4 °C on average in this study) runoff temperatures in summer months, which adversely affects receiving urban stream ecosystems. This study uses gene expression programming (GEP) and artificial neural network (ANN) modeling techniques to advance our knowledge of the key factors governing the thermal enrichment effects of stormwater ponds. The models developed in this study build upon and complement the ANN model developed by Sabouri et al. (2013) that predicts the catchment event mean runoff temperature entering the pond as a function of event climatic and catchment characteristic parameters. The key factors that control pond outlet runoff temperature include: (1) Upland Catchment Parameters (catchment drainage area and event mean runoff temperature inflow to the pond); (2) Climatic Parameters (rainfall depth, event mean air temperature, and pond initial water temperature); and (3) Pond Design Parameters (pond length-to-width ratio, pond surface area, pond average depth, and pond outlet depth). We used monitoring data for three summers from 2009 to 2011 in four stormwater management ponds, located in the cities of Guelph and Kitchener, Ontario, Canada, to develop the models. The prediction uncertainties of the developed ANN and GEP models for the case study sites are around 0.4% and 1.7% of the median value, respectively. Sensitivity analysis of the trained models indicates that the thermal enrichment of the pond outlet runoff is inversely proportional to pond length-to-width ratio and pond outlet depth, and directly proportional to event runoff volume, event mean pond inflow runoff temperature, and pond initial water temperature.
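
    As a hedged illustration of the ANN component only (the authors' exact network architecture, software, and data encoding are not given here), a minimal scikit-learn regressor over the nine listed inputs might look like the sketch below, with placeholder data standing in for the monitored events.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      # Nine inputs per event: drainage area, inflow runoff temp, rainfall depth,
      # mean air temp, initial pond temp, L/W ratio, surface area, mean depth,
      # outlet depth. Target: pond outlet runoff temperature (deg C).
      rng = np.random.default_rng(0)
      X = rng.random((200, 9))                          # placeholder event records
      y = 20 + 5 * X[:, 1] + rng.normal(0, 0.5, 200)    # placeholder target

      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(10,),
                                         max_iter=5000, random_state=0))
      model.fit(X, y)
      print(model.predict(X[:3]))                       # predicted outlet temps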

  20. Address-event-based platform for bioinspired spiking systems

    NASA Astrophysics Data System (ADS)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA, and serial links, to make the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on Address-Event Representation and, at the same time, maps and transforms the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA to allow the platform to implement event-based algorithms that interact with the AER system, such as control algorithms, network connectivity, USB support, etc. The LVDS transceiver allows a bandwidth of up to 1.32 Gbps, around 66 Mega events per second (Mevps).
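
    The two quoted link figures are mutually consistent if each address event occupies roughly 20 bits on the wire; the arithmetic below is just a sanity check of that relationship, and the 16-bit-address-plus-4-framing-bits split is an assumption, not a documented detail of the platform.

      # 1.32 Gbps / 20 bits per event ~ 66 Mevps, matching the abstract's figures.
      LINK_BPS = 1.32e9
      BITS_PER_EVENT = 20     # assumed: 16-bit AER address + 4 framing bits

      events_per_second = LINK_BPS / BITS_PER_EVENT
      print(f"{events_per_second / 1e6:.0f} Mevps")   # -> 66 Mevps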

  1. Event-based total suspended sediment particle size distribution model

    NASA Astrophysics Data System (ADS)

    Thompson, Jennifer; Sattar, Ahmed M. A.; Gharabaghi, Bahram; Warner, Richard C.

    2016-05-01

    One of the most challenging modelling tasks in hydrology is prediction of the total suspended sediment particle size distribution (TSS-PSD) in stormwater runoff generated from exposed soil surfaces at active construction sites and surface mining operations. The main objective of this study is to employ gene expression programming (GEP) and artificial neural networks (ANN) to develop a new model with the ability to more accurately predict the TSS-PSD by taking advantage of both event-specific and site-specific factors in the model. To compile the data for this study, laboratory-scale experiments using rainfall simulators were conducted on fourteen different soils to obtain TSS-PSD. These data are supplemented with field data from three construction sites in Ontario over a period of two years to capture the effect of transport and deposition within the site. The combined data sets provide a wide range of key overlooked site-specific and storm-event-specific factors. Both parent soil and TSS-PSD in runoff are quantified by fitting each to a lognormal distribution. Compared to existing regression models, the developed model more accurately predicted the TSS-PSD using a more comprehensive list of key model input parameters. Employment of the new model will increase the efficiency of deployment of required best management practices, designed based on TSS-PSD, to minimize potential adverse effects of construction site runoff on aquatic life in the receiving watercourses.
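
    A minimal sketch of the lognormal quantification step using SciPy; the synthetic diameters below stand in for the measured PSDs, which are not reproduced here.

      import numpy as np
      from scipy import stats

      # Synthetic particle diameters (um) drawn from a lognormal for illustration.
      diameters_um = stats.lognorm.rvs(s=0.8, scale=30.0, size=500, random_state=1)

      # Fit a lognormal with the location pinned at zero, as is usual for PSDs.
      shape, loc, scale = stats.lognorm.fit(diameters_um, floc=0)
      median_d = scale     # for a lognormal, scale = exp(mu) = the median
      print(f"geometric std ~ {np.exp(shape):.2f}, median ~ {median_d:.1f} um")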

  2. Event-based text mining for biology and functional genomics.

    PubMed

    Ananiadou, Sophia; Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B

    2015-05-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of 'events', i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  3. An event-based architecture for solving constraint satisfaction problems

    NASA Astrophysics Data System (ADS)

    Mostafa, Hesham; Müller, Lorenz K.; Indiveri, Giacomo

    2015-12-01

    Constraint satisfaction problems are ubiquitous in many domains. They are typically solved using conventional digital computing architectures that do not reflect the distributed nature of many of these problems, and are thus ill-suited for solving them. Here we present a parallel analogue/digital hardware architecture specifically designed to solve such problems. We cast constraint satisfaction problems as networks of stereotyped nodes that communicate using digital pulses, or events. Each node contains an oscillator implemented using analogue circuits. The non-repeating phase relations among the oscillators drive the exploration of the solution space. We show that this hardware architecture can yield state-of-the-art performance on random SAT problems under reasonable assumptions on the implementation. We present measurements from a prototype electronic chip to demonstrate that a physical implementation of the proposed architecture is robust to practical non-idealities and to validate the theory proposed.
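
    The architecture above is analogue and oscillator-based and is not reproduced here; purely as a software illustration of the idea of event-driven exploration of a solution space, the toy below runs a classic random-walk SAT search in which each "event" is a violated constraint that triggers a local move. This is a stand-in technique, not the paper's method.

      import random

      def random_walk_sat(clauses, n_vars, max_flips=10000, seed=0):
          """Clauses are lists of signed ints (e.g. -2 means 'not x2')."""
          rng = random.Random(seed)
          assign = [rng.random() < 0.5 for _ in range(n_vars + 1)]
          def satisfied(c):
              return any(assign[abs(l)] == (l > 0) for l in c)
          for _ in range(max_flips):
              unsat = [c for c in clauses if not satisfied(c)]
              if not unsat:
                  return assign[1:]              # satisfying assignment found
              clause = rng.choice(unsat)         # an "event": a violated constraint
              var = abs(rng.choice(clause))
              assign[var] = not assign[var]      # local move driven by the event
          return None

      # (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
      print(random_walk_sat([[1, 2], [-1, 3], [-2, -3]], 3))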

  4. MCD for detection of event-based landslides

    NASA Astrophysics Data System (ADS)

    Mondini, A. C.; Chang, K.; Guzzetti, F.

    2011-12-01

    Landslides play an important role in the landscape evolution of mountainous terrain. They also present a socioeconomic problem in terms of risk to people and property. Landslide inventory maps are not available for many areas affected by slope instabilities, resulting in a lack of primary information for understanding the phenomenon, evaluating landslide statistics, and conducting civil protection operations on large scales. Traditional methods for the preparation of landslide inventory maps are based on the geomorphological interpretation of stereoscopic aerial photography and field surveys. These methods are expensive and time consuming. The exploitation of new remote sensing data, in particular very high resolution (VHR) satellite images, and new dedicated methods present an alternative to the traditional methods and are at the forefront of modern landslide research. Recent studies have shown the possibility of producing accurate landslide maps, reducing the time and resources required for their compilation and systematic update. This paper presents the Multiple Change Detection (MCD) technique, a new method that has shown promising results in landslide mapping. Through supervised or unsupervised classifiers, MCD combines different change detection metrics, such as change in the Normalized Difference Vegetation Index, spectral angle, principal component analysis, and independent component analysis, and applies them to a multi-temporal set of VHR satellite images to distinguish new landslides from stable areas. MCD has been applied with success in different geographical areas and with different satellite images, suggesting it is a reliable and robust technique. The technique can distinguish old from new landslides and capture runout features. Results of these case studies will be presented at the conference. Also to be presented are new developments of MCD involving the introduction of a priori information on landslide susceptibility within a Bayesian framework.
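
    A hedged sketch of the change-metric idea: derive per-pixel change features from a pre/post image pair and let an unsupervised classifier separate candidate landslide pixels from stable ones. The two-band layout, the choice of only two metrics, and the two-cluster split are illustrative assumptions; the actual MCD combines more metrics and supervised options.

      import numpy as np
      from sklearn.cluster import KMeans

      def ndvi(img):
          red, nir = img[..., 0], img[..., 1]          # assumed band order
          return (nir - red) / (nir + red + 1e-9)

      def spectral_angle(img1, img2):
          dot = (img1 * img2).sum(-1)
          norms = (np.linalg.norm(img1, axis=-1) *
                   np.linalg.norm(img2, axis=-1) + 1e-9)
          return np.arccos(np.clip(dot / norms, -1, 1))

      rng = np.random.default_rng(0)
      pre, post = rng.random((64, 64, 2)), rng.random((64, 64, 2))  # stand-ins
      features = np.stack([ndvi(post) - ndvi(pre),
                           spectral_angle(pre, post)], axis=-1).reshape(-1, 2)
      labels = KMeans(n_clusters=2, n_init=10).fit_predict(features)
      change_mask = labels.reshape(64, 64)    # one cluster ~ candidate landslides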

  5. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays in astronaut tissues during space travel, or of heavy ion beams in patients undergoing cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density of events to correlate DNA and oxidative damage with non-targeted effects (signaling, bystander effects, etc.). These are ignored, or impossible to treat, in the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, characterized by kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of hits for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the first option, properties of monoenergetic beams are treated. In the second option, the transport of beams in different materials is treated; similar biophysical properties as in the first option are evaluated for the primary ion and its secondary particles, and additional properties related to the nuclear fragmentation of the beam are evaluated. The GERM code is a computationally efficient Monte-Carlo heavy-ion-beam model. It includes accurate models of LET, range, residual energy, and straggling, and the quantum multiple scattering fragmentation (QMSFRG) nuclear database.
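
    One of the listed biophysical outputs lends itself to a short worked example: for a fluence F (particles per unit area) and a cellular cross-sectional area A, the number of ion hits is Poisson-distributed with mean FA. The fluence and area values below are illustrative only, not NSRL beam settings.

      from math import exp, factorial

      fluence_per_cm2 = 1.0e6        # particles/cm^2 (assumed for illustration)
      cell_area_cm2 = 100e-8         # a 100 um^2 nucleus expressed in cm^2
      mean_hits = fluence_per_cm2 * cell_area_cm2   # = 1.0 hit on average

      p = lambda k: mean_hits**k * exp(-mean_hits) / factorial(k)
      print(f"P(0)={p(0):.3f}, P(1)={p(1):.3f}, P(>=2)={1 - p(0) - p(1):.3f}")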

  6. Visualization of Sedentary Behavior Using an Event-Based Approach

    ERIC Educational Resources Information Center

    Loudon, David; Granat, Malcolm H.

    2015-01-01

    Visualization is commonly used in the interpretation of physical behavior (PB) data, either in conjunction with or as precursor to formal analysis. Effective representations of the data can enable the identification of patterns of behavior, and how they relate to the temporal context in a single day, or across multiple days. An understanding of…

  7. An event-based architecture for solving constraint satisfaction problems

    PubMed Central

    Mostafa, Hesham; Müller, Lorenz K.; Indiveri, Giacomo

    2015-01-01

    Constraint satisfaction problems are ubiquitous in many domains. They are typically solved using conventional digital computing architectures that do not reflect the distributed nature of many of these problems, and are thus ill-suited for solving them. Here we present a parallel analogue/digital hardware architecture specifically designed to solve such problems. We cast constraint satisfaction problems as networks of stereotyped nodes that communicate using digital pulses, or events. Each node contains an oscillator implemented using analogue circuits. The non-repeating phase relations among the oscillators drive the exploration of the solution space. We show that this hardware architecture can yield state-of-the-art performance on random SAT problems under reasonable assumptions on the implementation. We present measurements from a prototype electronic chip to demonstrate that a physical implementation of the proposed architecture is robust to practical non-idealities and to validate the theory proposed. PMID:26642827

  8. Visual tracking using neuromorphic asynchronous event-based cameras.

    PubMed

    Ni, Zhenjiang; Ieng, Sio-Hoi; Posch, Christoph; Régnier, Stéphane; Benosman, Ryad

    2015-04-01

    This letter presents a novel, computationally efficient, and robust pattern tracking method based on time-encoded, frame-free visual data. Recent interdisciplinary developments, combining inputs from engineering and biology, have yielded a novel type of camera that encodes visual information into a continuous stream of asynchronous, temporal events. These events encode temporal contrast and intensity locally in space and time. We show that the sparse yet accurately timed information is well suited as a computational input for object tracking. In this letter, visual data processing is performed for each incoming event at the time it arrives. The method provides a continuous and iterative estimation of the geometric transformation between the model and the events representing the tracked object. It can handle isometries, similarities, and affine distortions and allows for unprecedented real-time performance at equivalent frame rates in the kilohertz range on a standard PC. Furthermore, by using the dimension of time that is currently underexploited by most artificial vision systems, the method we present is able to solve ambiguous cases of object occlusion that classical frame-based techniques handle poorly. PMID:25710087
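
    The letter's method estimates full isometric, similarity, and affine transformations; the sketch below is deliberately stripped down to a per-event update of a 2-D translation only, to show the event-driven (rather than frame-driven) update pattern. The model points, learning rate, and events are all assumptions.

      import numpy as np

      class EventTranslationTracker:
          def __init__(self, model_points):
              self.model = np.asarray(model_points, float)  # object shape model
              self.t = np.zeros(2)                          # translation estimate

          def update(self, event_xy, lr=0.05):
              event_xy = np.asarray(event_xy, float)
              shifted = self.model + self.t                 # model under estimate
              nearest = shifted[np.argmin(((shifted - event_xy) ** 2).sum(axis=1))]
              self.t += lr * (event_xy - nearest)           # nudge toward the event
              return self.t

      tracker = EventTranslationTracker([[0, 0], [1, 0], [0, 1]])
      for ev in [(0.2, 0.1), (1.3, 0.2), (0.1, 1.2)]:       # incoming events
          tracker.update(ev)
      print(tracker.t)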

  9. Event-based text mining for biology and functional genomics

    PubMed Central

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  10. Microstructure and corrosion behavior of hot-rolled GCr15 bearing steel

    NASA Astrophysics Data System (ADS)

    Fu, Junwei

    2016-04-01

    The microstructure, corrosion behavior, and microstructural evolution of hot-rolled high-carbon-chromium bearing steel were investigated using scanning electron microscopy and energy-dispersive spectroscopy (EDS). The results show that corrosion initiates adjacent to the network carbide, which marks the initial austenite grain boundary. As corrosion time increases further, the corroded fraction increases and extends into the grains. Finally, the whole grain near the network carbide is etched off and the grain boundary is detached from the sample, forming corroded holes. Based on the EDS analyses, it is confirmed that this corrosion behavior results from the depletion of Cr as a solid solute at the grain boundary. The depletion of Cr is the result of the formation of Cr carbide near the grain boundary.

  11. SCR and GCR exposure ages of plagioclase grains from lunar soil

    NASA Technical Reports Server (NTRS)

    Etique, P.; Baur, H.; Signer, P.; Wieler, R.

    1986-01-01

    The concentrations of solar-wind-implanted Ar-36 in mineral grains extracted from lunar soils show that they were exposed to the solar wind on the lunar surface for an integrated time of 10^4 to 10^5 years. From the bulk soil 61501, plagioclase separates of 8 grain-size ranges were prepared. The depletion of the implanted gases was achieved by etching aliquot samples of 4 grain sizes to various degrees. The experimental results pertinent to the present discussion are as follows. The spallogenic Ne is, as in most plagioclases from lunar soils, affected by diffusive losses and of no use. The Ar-36 of solar wind origin amounts to (2030 ± 100) × 10^-8 ccSTP/g in the 150 to 200 µm size fraction and shows that these grains were exposed to the solar wind for at least 10,000 years. The Ne-21/Ne-22 ratio of the spallogenic Ne is 0.75 ± 0.01, in very good agreement with the value of this ratio in a plagioclase separate from rock 76535. This rock has had a simple exposure history, and its plagioclases have a chemical composition quite similar to those studied. In addition to the noble gases, the heavy-particle tracks in an aliquot of the 150 to 200 µm plagioclase separate were investigated; 92% of the grains were found to contain more than 10^8 tracks/cm^2. This corresponds to a mean track density of (5 ± 1) × 10^8 tracks/cm^2. The exposure histories of the plagioclase separates from soil 61501 do not contradict the model of regolith dynamics, but they also fail to prove it.

  12. No-migration variance petition: Draft. Volume 4, Appendices DIF, GAS, GCR (Volume 1)

    SciTech Connect

    1995-05-31

    The Department of Energy is responsible for the disposition of transuranic (TRU) waste generated by national defense-related activities. Approximately 2.6 million cubic feet of these wastes have been generated and are stored at various facilities across the country. The Waste Isolation Pilot Plant (WIPP) was sited and constructed to meet stringent disposal requirements. In order to permanently dispose of TRU waste, the DOE has elected to petition the US EPA for a variance from the Land Disposal Restrictions of RCRA. This document fulfills the reporting requirements for the petition. This report is volume 4 of the petition, which presents details about the transport characteristics across drum filter vents and polymer bags; gas generation reactions and rates during long-term WIPP operation; and geological characterization of the WIPP site.

  13. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 7: Appendix GCR Volume 2

    SciTech Connect

    1995-03-31

    This report contains the second part of the geological characterization report for the Waste Isolation Pilot Plant. Both hydrology and geochemistry are evaluated. The following aspects of hydrology are discussed: surface hydrology; ground water hydrology; and hydrology drilling and testing. Hydrologic studies at the site and adjacent site areas have concentrated on defining the hydrogeology and associated salt dissolution phenomena. The geochemical aspects include a description of chemical properties of geologic media presently found in the surface and subsurface environments of southeastern New Mexico in general, and of the proposed WIPP withdrawal area in particular. The characterization does not consider any aspect of artificially-introduced material, temperature, pressure, or any other physico-chemical condition not native to the rocks of southeastern New Mexico.

  14. Event-Based Plausibility Immediately Influences On-Line Language Comprehension

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional…

  15. The cognitive cost of event-based prospective memory in children.

    PubMed

    Leigh, Janet; Marcovitch, Stuart

    2014-11-01

    Prospective memory is the act of remembering to perform an action in the future, often after the presentation of a cue. However, the processes involved in remembering a future intention might hinder performance on activities leading up to and surrounding the event in which the intention must be carried out. The current study was designed to assess whether young children who were asked to engage in prospective memory do so at a cost to current cognitive processing. Participants (4-, 5-, and 6-year-olds) either performed a simple ongoing selection task only (control condition) or performed the selection task with an embedded prospective memory task (experimental condition). Results revealed that children in the experimental condition were slower in the execution of the ongoing task relative to children in the control condition, lending support to the theory that children as young as 4 years selectively allocate resources in an effort to succeed in multiple tasks. PMID:24853249

  16. Probing the possible trigger mechanisms of an equatorial plasma bubble event based on multistation optical data

    NASA Astrophysics Data System (ADS)

    Taori, A.; Parihar, N.; Ghodpage, R.; Dashora, N.; Sripathi, S.; Kherani, E. A.; Patil, P. T.

    2015-10-01

    We analyze an equatorial plasma bubble (EPB) event observed in optical 630 nm image data simultaneously from Gadanki (13.5°N, 79.2°E) and Kolhapur (16.8°N, 74.2°E), India. The total electron content data from Gadanki, together with ionosonde data from an equatorial Indian station, Tirunelveli (8.7°N, 77.8°E), confirmed the association of the observed EPB event with equatorial spread F (ESF). The optical 630 nm images from a low-latitude Indian station farther north, Ranchi (23.3°N, 85.3°E), show clear signatures of tilted east-west wave structures propagating toward the equator. Further, the upward wave energy noted in mesospheric airglow data was found to be negligible. These data suggest that the off-equatorial tilted east-west structures possibly triggered the observed EPB/ESF event.

  17. Prediction problem for target events based on the inter-event waiting time

    NASA Astrophysics Data System (ADS)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events that we focus on. In the series of earthquakes, as McCann et al. show [W.R. McCann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. On the other hand, there are usually no regular gaps in the series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound on prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one. We show that this algorithm outperforms random prediction. The efficiency of our algorithm sets a lower bound on the efficiency of effective prediction of stock market crashes.
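
    A hedged sketch of the interval-alarm idea, not the paper's algorithm: after each target event, declare an alarm window for the next one using quantiles of the empirical inter-event gap distribution. The lognormal gap sample and the 25%/75% quantile choice are assumptions for illustration.

      import numpy as np

      def alarm_window(gaps, lo_q=0.25, hi_q=0.75):
          """Return (t_lo, t_hi): alarm is active between these elapsed times
          after the last event, chosen from observed gap quantiles."""
          return np.quantile(gaps, lo_q), np.quantile(gaps, hi_q)

      gaps = np.random.default_rng(2).lognormal(3.0, 0.4, 200)  # synthetic gaps
      t_lo, t_hi = alarm_window(gaps)
      hit_rate = np.mean((gaps >= t_lo) & (gaps <= t_hi))  # ~50% by construction
      print(f"alarm on [{t_lo:.1f}, {t_hi:.1f}], catches {hit_rate:.0%} of events")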

  18. Event-based aquifer-to-atmosphere modeling over the European CORDEX domain

    NASA Astrophysics Data System (ADS)

    Keune, J.; Goergen, K.; Sulis, M.; Shrestha, P.; Springer, A.; Kusche, J.; Ohlwein, C.; Kollet, S. J.

    2014-12-01

    Although recent studies focus on the impact of soil moisture on climate and especially land-energy feedbacks, groundwater dynamics are often neglected or conceptual groundwater flow models are used. In particular, in the context of climate change and the occurrence of droughts and floods, a better understanding and an improved simulation of the physical processes involving groundwater on continental scales are necessary. This requires the implementation of a physically consistent terrestrial modeling system that explicitly incorporates groundwater dynamics and the connection with shallow soil moisture. Such a physics-based system enables simulations and monitoring of groundwater storage and enhanced representations of the terrestrial energy and hydrologic cycles over long time periods. On shorter timescales, predictions of groundwater-related extremes, such as floods and droughts, are expected to improve because of the improved simulation of components of the hydrological cycle. In this study, we present a fully coupled aquifer-to-atmosphere modeling system over the European CORDEX domain. The integrated Terrestrial Systems Modeling Platform, TerrSysMP, consisting of the three-dimensional subsurface model ParFlow, the Community Land Model CLM3.5, and the numerical weather prediction model COSMO of the German Weather Service, is used. The system is set up with a spatial resolution of 0.11° (12.5 km) and closes the terrestrial water and energy cycles from aquifers into the atmosphere. Here, simulations with the fully coupled system are performed over events, such as the 2013 flood in Central Europe and the 2003 European heat wave, and over extended time periods on the order of 10 years. State and flux variables of the terrestrial hydrologic and energy cycles are analyzed and compared to both in situ (e.g., stream and water level gauge networks, FLUXNET) and remotely sensed observations (e.g., GRACE, ESA CCI ECV soil moisture, and SMOS). Additionally, the presented modeling system may be useful in the assessment of groundwater-related uncertainties in virtual reality and scenario simulations.

  19. Tsunami Source Identification on the 1867 Tsunami Event Based on the Impact Intensity

    NASA Astrophysics Data System (ADS)

    Wu, T. R.

    2014-12-01

    The 1867 Keelung tsunami event has drawn significant attention in Taiwan, not only because the location was very close to three nuclear power plants that are only about 20 km from Taipei city, but also because of the ambiguity of the tsunami sources. This event is unique in many respects. First, it was documented in many sources, in several languages, with similar descriptions. Second, the tsunami deposit was discovered recently. Based on the literature, an earthquake, a 7-meter tsunami height, volcanic smoke, and oceanic smoke were observed. Previous studies concluded that this tsunami was generated by an earthquake with a magnitude around Mw 7.0 along the Shanchiao Fault. However, numerical results showed that even a Mw 8.0 earthquake was not able to generate a 7-meter tsunami. Considering the steep bathymetry and intense volcanic activity along the Keelung coast, one reasonable hypothesis is that different types of tsunami sources existed, such as a submarine landslide or volcanic eruption. In order to examine this scenario, last year we proposed the Tsunami Reverse Tracing Method (TRTM) to find the possible locations of the tsunami sources. This method helped us rule out impossible far-field tsunami sources. However, the near-field sources remained unclear. This year, we further developed a new method named 'Impact Intensity Analysis' (IIA). In the IIA method, the study area is divided into a sequence of tsunami sources, and numerical simulations of each source are conducted with the COMCOT (Cornell Multi-grid Coupled Tsunami Model) tsunami model. After that, the resulting wave height from each source to the study site is collected and plotted. This method successfully helped us identify the impact factor of the near-field potential sources. The IIA result (Fig. 1) shows that the 1867 tsunami event was a multi-source event: a mild tsunami was triggered by a Mw 7.0 earthquake and then followed by submarine landslide or volcanic events. A near-field submarine landslide and a landslide at Mien-Hwa Canyon were the most plausible scenarios. As for the volcano scenarios, a volcanic eruption located about 10 km from Keelung with 2.5×10^8 m^3 of disturbed water volume might be a candidate. The detailed scenario results will be presented in the full paper.

  20. Conduction Velocity of the Uterine Contraction in Serial Magnetomyogram (MMG) Data: Event Based Simulation and Validation

    PubMed Central

    Preissl, Hubert; Lowery, Curtis L.; Eswaran, Hari; Govindan, Rathinaswamy B.

    2012-01-01

    We propose a novel approach to calculate the conduction velocity (CV) of uterine contraction bursts in magnetomyogram (MMG) signals measured using a multichannel SQUID array. For this purpose, we partition the sensor coordinates into four quadrants and identify the contractile bursts using a previously proposed Hilbert-wavelet transform approach. If a contractile burst is identified in more than one quadrant, we calculate the center of gravity (CoG) in each quadrant for each time point as the sum of the product of the sensor coordinates with the Hilbert amplitude of the MMG signals, normalized by the sum of the Hilbert amplitude of the signals over all sensors. Following this, we compute the delay between the CoGs of all (six) possible quadrant-pair combinations. As a first step, we validate this approach by simulating a stochastic model based on independent second-order autoregressive (AR2) processes; we divide them into 30-second disjoint windows, insert burst activity at specific time instances in preselected sensors, and introduce a lag of 5 ± 1 seconds between different quadrants. Using our approach we calculate the CoG of the signals in each quadrant. We then compute the delay between CoGs obtained from different quadrants and show that our approach reliably captures the delay incorporated in the model. We apply the proposed approach to 19 serial MMG data sets obtained from two subjects and show an increase in the CV as the subjects approached labor. PMID:22255713
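
    A sketch of the described centre-of-gravity computation: per time point, the CoG of a quadrant is the amplitude-weighted mean of the sensor coordinates, with the inter-quadrant delay then taken from cross-correlation of two CoG traces. The demo data, the use of only the x-coordinate for the delay, and the correlation-based delay estimate are assumptions; the paper's exact delay estimator is not specified here.

      import numpy as np
      from scipy.signal import hilbert

      def cog_trace(signals, coords):
          """signals: (n_sensors, n_samples); coords: (n_sensors, 2)."""
          amp = np.abs(hilbert(signals, axis=1))        # Hilbert amplitude
          w = amp / amp.sum(axis=0, keepdims=True)      # normalise over sensors
          return w.T @ coords                           # (n_samples, 2) CoG path

      def delay_samples(trace_a, trace_b):
          a = trace_a[:, 0] - trace_a[:, 0].mean()
          b = trace_b[:, 0] - trace_b[:, 0].mean()
          lags = np.arange(-len(a) + 1, len(a))
          return lags[np.argmax(np.correlate(a, b, mode="full"))]

      rng = np.random.default_rng(0)
      sig = rng.standard_normal((4, 300))     # 4 sensors in one quadrant (demo)
      xy = rng.random((4, 2))                 # sensor coordinates (demo)
      path = cog_trace(sig, xy)               # CoG trajectory over time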

  1. A mobile robots experimental environment with event-based wireless communication.

    PubMed

    Guinaldo, María; Fábregas, Ernesto; Farias, Gonzalo; Dormido-Canto, Sebastián; Chaos, Dictino; Sánchez, José; Dormido, Sebastián

    2013-01-01

    An experimental platform to communicate between a set of mobile robots through a wireless network has been developed. The mobile robots get their position through a camera, which acts as a sensor. The video images are processed in a PC, and a Waspmote card sends the corresponding position to each robot using the ZigBee standard. A distributed control algorithm based on event-triggered communications has been designed and implemented to bring the robots into the desired formation. Each robot communicates with its neighbors only at event times. Furthermore, a simulation tool has been developed to design and perform experiments with the system. An example of usage is presented. PMID:23881139

  2. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network.

    PubMed

    Hao, Xiaoqing; An, Haizhong; Zhang, Lijia; Li, Huajiao; Wei, Guannan

    2015-01-01

    To study the sentiment diffusion of online public opinions about hot events, we collected people's posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiment: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed the sentiment mode complex network of online public opinions (SMCOP), with modes as nodes and the conversion relations, in chronological order, between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient, and betweenness centrality of the SMCOP. The results show that the strength distribution obeys a power law. Most posts' sentiments are weakly positive and neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups, with ppppp and ooooo as the core modes, respectively. Few modes have large betweenness centrality values, and most modes convert to each other with these higher-betweenness-centrality modes as mediums. Therefore, relevant persons or institutions can take measures to guide people's sentiments regarding online hot events according to the sentiment diffusion mechanism. PMID:26462230
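
    A hedged sketch of the SMCOP construction: coarse-grain a chronological sentiment-symbol sequence into fixed-length modes and add a weighted directed edge between consecutive modes. The five-symbol alphabet follows the paper's P/p/o/n/N scheme; the window length of five and the toy input are assumptions.

      import networkx as nx

      def build_smcop(symbols, window=5):
          modes = ["".join(symbols[i:i + window])
                   for i in range(len(symbols) - window + 1)]
          g = nx.DiGraph()
          for a, b in zip(modes, modes[1:]):
              w = g.get_edge_data(a, b, {"weight": 0})["weight"]
              g.add_edge(a, b, weight=w + 1)   # conversion in chronological order
          return g

      g = build_smcop(list("pppoopppNnpooop"))   # toy post-sentiment sequence
      print(nx.betweenness_centrality(g))        # mediating modes stand out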

  3. An event-based neural network architecture with an asynchronous programmable synaptic memory.

    PubMed

    Moradi, Saber; Indiveri, Giacomo

    2014-02-01

    We present a hybrid analog/digital very large scale integration (VLSI) implementation of a spiking neural network with programmable synaptic weights. The synaptic weight values are stored in an asynchronous Static Random Access Memory (SRAM) module, which is interfaced to a fast current-mode event-driven DAC that produces synaptic currents with the appropriate amplitudes. These currents are further integrated by current-mode integrator synapses to produce biophysically realistic temporal dynamics. The synapse output currents are then integrated by compact and efficient integrate-and-fire silicon neuron circuits with spike-frequency adaptation and adjustable refractory period and spike-reset voltage settings. The fabricated chip comprises a total of 32 × 32 SRAM cells, 4 × 32 synapse circuits and 32 × 1 silicon neurons. It acts as a transceiver, receiving asynchronous events as input, performing neural computation on the input spikes with hybrid analog/digital circuits, and eventually producing digital asynchronous events as output. Input, output, and synaptic weight values are transmitted to/from the chip using a common communication protocol based on the Address Event Representation (AER). Using this representation it is possible to interface the device to a workstation or a micro-controller and explore the effect of different types of Spike-Timing Dependent Plasticity (STDP) learning algorithms for updating the synaptic weight values in the SRAM module. We present experimental results demonstrating the correct operation of all the circuits present on the chip. PMID:24681923
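
    A software analogue of the chip's dataflow (not the VLSI circuits themselves): an input AER event addresses a weight table standing in for the SRAM, the weight drives a synaptic kick, and a leaky integrate-and-fire neuron emits an output event on threshold crossing. Time constant, threshold, and weights are assumed values.

      import numpy as np

      class LIFNeuron:
          def __init__(self, tau=0.02, v_thresh=1.0):
              self.tau, self.v_thresh = tau, v_thresh
              self.v, self.t = 0.0, 0.0

          def receive(self, t_event, weight):
              self.v *= np.exp(-(t_event - self.t) / self.tau)  # leak since last event
              self.t = t_event
              self.v += weight                                  # synaptic kick
              if self.v >= self.v_thresh:
                  self.v = 0.0                                  # reset
                  return t_event                                # output spike event
              return None

      weights = np.full(32, 0.3)           # "SRAM" row of synaptic weights
      neuron = LIFNeuron()
      for t, addr in [(0.001, 4), (0.002, 7), (0.003, 4), (0.004, 9)]:
          out = neuron.receive(t, weights[addr])
          if out is not None:
              print(f"output event at t={out:.3f}s")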

  4. Event based analysis of chlorothalonil concentrations following application to managed turf.

    PubMed

    King, Kevin W; Balogh, James C

    2013-03-01

    Chlorothalonil concentrations exceeding acute toxicity levels for certain organisms have been measured in surface water discharge events from managed turf watersheds. The duration of exceedance and the timing of these events relative to precipitation/runoff and time since application, however, have not been explored. Chlorothalonil concentrations were measured in discharge waters draining a managed turf watershed in Duluth, Minnesota, USA, between 2003 and 2009. The median chlorothalonil concentration was 0.58 µg/L. Approximately 2% of all measured concentrations exceeded the 7.6 µg/L median lethal concentration (LC50) acute toxicity level for rainbow trout. One-twentieth of the LC50 concentration, equivalent to the level of concern (0.38 µg/L) for endangered species, was exceeded 31% of the time during the present study. The concentrations that exceeded the LC50 threshold were associated with eight rainfall/runoff events. Low-dose exposures are a more important biological concern than acute occurrences. Exceedance concentrations associated with acute effects were significantly (p < 0.05) correlated with time since application and were measured only in the fall, following extensive application. A conflict exists between the transportability of chlorothalonil as suggested by its chemical properties and the data collected in the present study. For course-wide golf course applications, delaying application until after the major autumn rainfall period but before the first snow cover is recommended to reduce the occurrence of chlorothalonil concentrations exceeding toxic levels associated with acute and chronic levels of concern. PMID:23233324

  5. Event-based prediction of stream turbidity using a combined cluster analysis and classification tree approach

    NASA Astrophysics Data System (ADS)

    Mather, Amanda L.; Johnson, Richard L.

    2015-11-01

    Stream turbidity typically increases during streamflow events; however, similar event hydrographs can produce markedly different event turbidity behaviors because many factors influence turbidity in addition to streamflow, including antecedent moisture conditions, season, and supply of turbidity-causing materials. Modeling of sub-hourly turbidity as a function of streamflow shows that event model parameters vary on an event-by-event basis. Here we examine the extent to which stream turbidity can be predicted through the prediction of event model parameters. Using three mid-sized streams from the Mid-Atlantic region of the U.S., we show the model parameter set for each event can be predicted based on the event characteristics (e.g., hydrologic, meteorologic and antecedent moisture conditions) using a combined cluster analysis and classification tree approach. The results suggest that the ratio of beginning event discharge to peak event discharge (an estimate of the event baseflow index), as well as catchment antecedent moisture, are important factors in the prediction of event turbidity. Indicators of antecedent moisture, particularly those derived from antecedent discharge, account for the majority of the splitting nodes in the classification trees for all three streams. For this study, prediction of turbidity during streamflow events is based upon observed data (e.g., measured streamflow, precipitation and air temperature). However, the results also suggest that the methods presented here can, in future work, be used in conjunction with forecasts of streamflow, precipitation and air temperature to forecast stream turbidity.
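
    A hedged sketch of the combined approach: cluster streamflow events by their fitted turbidity-model parameters, then learn a classification tree that maps observable event characteristics to the parameter cluster. The feature names, cluster count, and tree depth are illustrative assumptions, not the study's exact configuration.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(3)
      event_params = rng.random((120, 3))    # fitted turbidity-model parameters
      event_chars = rng.random((120, 4))     # e.g. event baseflow index,
                                             # antecedent discharge, rain depth,
                                             # air temperature (assumed features)

      clusters = KMeans(n_clusters=4, n_init=10).fit_predict(event_params)
      tree = DecisionTreeClassifier(max_depth=3).fit(event_chars, clusters)

      new_event = rng.random((1, 4))
      predicted_cluster = tree.predict(new_event)   # -> use that cluster's
      print(predicted_cluster)                      #    parameter set for the event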

  6. Topology and signatures of a model for flux transfer events based on vortex-induced reconnection

    SciTech Connect

    Liu, Z.X.; Zhu, Z.W.; Li, F. ); Pu, Z.Y. )

    1992-12-01

    A model of the disturbed magnetic field and disturbed velocity of flux transfer events (FTEs) is deduced on the basis of the vortex-induced reconnection theory. The topology and signatures of FTEs are calculated and discussed. The authors propose that the observed forms of FTE signatures depend on the direction of motion of the FTE tube, the position of the spacecraft relative to the passing FTE tube, and which part of the FTE tube (the magnetosphere part, the magnetopause part, or the magnetosheath part) the spacecraft is passing through. It is found that when an FTE tube moves from south to north along a straight line in the northern hemisphere, positive FTEs appear for most passages; however, reverse FTEs are also observed occasionally, with the B_Z (B_L) signatures appearing as a single peak, and irregular FTEs always correspond to oblique line motions of the FTE tube. The velocity signatures are similar to those of the magnetic field, but in the northern hemisphere their directions are all just opposite to those of the magnetic field. The calculated results for the magnetic field are compared with 61 observed FTEs. The observed signatures (B_N and B_L) of 52 FTEs are consistent with the calculations. The results indicate that a majority of observed FTEs correspond to passages of spacecraft through the edges of FTE tubes.

  7. Extraction of spatio-temporal information of earthquake event based on semantic technology

    NASA Astrophysics Data System (ADS)

    Fan, Hong; Guo, Dan; Li, Huaiyuan

    2015-12-01

    In this paper, a web information extraction method is presented that identifies a variety of thematic events using an event knowledge framework derived from text training, and then uses syntactic analysis to extract the key event information. The method, which combines textual semantic information with domain knowledge of the event, makes the extraction of information of interest more accurate. In this paper, web-based earthquake news extraction is taken as an example. The paper first outlines the overall approach and then details the key algorithms and experiments for seismic event extraction. Finally, the paper presents accuracy analysis and evaluation experiments, which demonstrate that the proposed method is a promising way of mining hot events.
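
    Not the paper's system: a toy illustration of the key-information step only, pulling magnitude and time out of earthquake news text with simple patterns. The sample sentence and patterns are assumptions; the paper uses a trained event knowledge framework plus syntactic analysis rather than fixed regexes.

      import re

      TEXT = "A magnitude 6.3 earthquake struck near the city at 04:17 local time."

      magnitude = re.search(r"magnitude\s+(\d+(?:\.\d+)?)", TEXT, re.I)
      when = re.search(r"at\s+(\d{1,2}:\d{2})", TEXT)
      event = {"type": "earthquake",
               "magnitude": float(magnitude.group(1)) if magnitude else None,
               "time": when.group(1) if when else None}
      print(event)   # {'type': 'earthquake', 'magnitude': 6.3, 'time': '04:17'}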

  8. Classification of cyclogenesis events based on a comprehensive set of potential precursors

    NASA Astrophysics Data System (ADS)

    Graf, M.; Sprenger, M.; Wernli, H.

    2012-04-01

    Case studies indicate a large variability in the atmospheric structures and physical processes responsible for cyclogenesis. Historical classifications focus on the relative importance of low-level baroclinicity and upper-level disturbances, and a more recent threefold classification also considers the role of diabatically produced low-tropospheric potential vorticity. In this study, a large set of potential precursors for cyclogenesis will be systematically investigated on a statistical basis. Cyclones are objectively identified during 2010 in the operational analyses and deterministic forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) and then tracked along their life cycle. The starting points of these tracks are considered the locations of cyclogenesis. In the environment of these locations, a set of about 20 precursors is determined. The set includes the following parameters: (a) temperature and heat fluxes at the surface; (b) characteristic conditions in the troposphere (e.g., integrated water vapor, amplitude of low-level potential vorticity); (c) measures of baroclinic and convective stability (e.g., Eady growth rate and CAPE); and (d) flow patterns at and forcing from the tropopause level (e.g., jet streams and streaks, potential vorticity anomalies, height of the tropopause). In addition to these relatively simple Eulerian characteristics, more advanced diagnostic approaches will be applied, including a Lagrangian moisture source diagnostic and quasi-geostrophic omega forcing. These parameters will be determined for a multitude of cyclones and will form the basis for an in-depth statistical analysis in this precursor phase space. A clustering approach will be applied to this precursor phase space in order to determine the main categories of cyclogenesis and their geographical and seasonal variability.

  9. Earthquake source inversion for moderate magnitude seismic events based on GPS simulated high-rate data

    NASA Astrophysics Data System (ADS)

    Psimoulis, Panos; Dalguer, Luis; Houlie, Nicolas; Zhang, Youbing; Clinton, John; Rothacher, Markus; Giardini, Domenico

    2013-04-01

    The development of GNSS technology, with the potential for high-rate (up to 100 Hz) GNSS (GPS, GLONASS, Galileo, Compass) records, allows the monitoring of seismic ground motions. In this study we show the potential of estimating the earthquake magnitude (Mw) and the fault geometry parameters (slip, depth, length, rake, dip, strike) during the propagation of seismic waves, based on high-rate GPS network data and using a non-linear inversion algorithm. The examined area is the Valais (South-West Switzerland), where a permanent GPS network of 15 stations (the COGEAR and AGNES GPS networks) is operational and where the occurrence of an earthquake of Mw ≈ 6 is possible every 80 years. We test our methodology using synthetic events of magnitude 6.0-6.5 corresponding to a normal fault, in accordance with most of the fault mechanisms of the area, for both surface and buried rupture. The epicentres are located in the Valais, close to the epicentres of previous historical earthquakes. For each earthquake, synthetic seismic data (velocity records) were produced for 15 sites corresponding to the current GPS network sites in Valais. The synthetic seismic data were integrated into displacement time-series. By jointly using these time-series with the Bernese GNSS Software 5.1 (modified), 10 Hz sampling rate GPS records were generated, assuming noise with peak-to-peak amplitudes of ±1 cm and ±3 cm for the horizontal and vertical components, respectively. The GPS records were processed into kinematic time series, from which the seismic displacements were derived and inverted for the magnitude and the fault geometry parameters. The inversion results indicate that it is possible to estimate both the earthquake magnitude and the fault geometry parameters in real time (~10 seconds after the fault rupture). The accuracy of the results depends on the geometry of the GPS network and on the position of the earthquake epicentre.

  10. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    NASA Technical Reports Server (NTRS)

    Adams, M.; Gallagher, D. L.; Whitt, A.; Six, N. Frank (Technical Monitor)

    2002-01-01

    For the past four years, the Science Directorate at Marshall Space Flight Center has carried out a diverse program of science communication through web resources on the Internet. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. We give here examples of events, problems, and lessons learned from these activities.

  11. A study of preservice elementary teachers enrolled in a discrepant-event-based physical science class

    NASA Astrophysics Data System (ADS)

    Lilly, James Edward

    This research evaluated the POWERFUL IDEAS IN PHYSICAL SCIENCE (PIiPS) curriculum model used to develop a physical science course taken by preservice elementary teachers. The focus was on the evaluation of discrepant events used to induce conceptual change in relation to students' ideas concerning heat, temperature, and specific heat. Both quantitative and qualitative methodologies were used for the analysis. Data were collected during the 1998 Fall semester using two classes of physical science for elementary school teachers. The traditionally taught class served as the control group, and the class using the PIiPS curriculum model was the experimental group. The PIiPS curriculum model was evaluated quantitatively for its influence on students' attitude toward science, anxiety toward teaching science, self-efficacy toward teaching science, and content knowledge. An analysis of covariance was performed on the quantitative data to test for significant differences between the means of the posttests for the control and experimental groups while controlling for pretest scores. It was found that there were no significant differences between the means of the control and experimental groups with respect to changes in attitude toward science, anxiety toward teaching science, and self-efficacy toward teaching science. A significant difference between the means of the content examination was found (F(1,28) = 14.202, p = 0.001); however, the result is questionable. The heat and energy module was the target for qualitative scrutiny. Coding for discrepant events was adapted from Appleton's 1996 work on students' responses to discrepant-event science lessons. The following qualitative questions were posed for the investigation: (1) what were the ideas of the preservice elementary students regarding heat and energy prior to entering the classroom, (2) how effective were the discrepant events as presented in the PIiPS heat and energy module, and (3) how much does the "risk-taking factor" associated with not telling the students the answer right away affect the learning of the material. It was found that preservice elementary teachers harbor preconceptions similar to those of the general population according to the literature. The discrepant events used in this module of the PIiPS curriculum model met with varied results. It appeared that those students who had not successfully confronted their preconceptions were less likely to accept the new concepts that were to be developed using the discrepant events. Lastly, students showed great improvement in content understanding and developed the ability to ask deep and probing questions.

  12. Event-based Modeling of Fecal Coliform Concentrations in Runoff from Manured Fields

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative evaluation of the effect of field manure application on bacterial concentrations in creeks adjacent to the field requires developing microbial transport models. Reliable testing of such models with bacteria monitoring data requires a better understanding and estimation of the uncertaint...

  13. Event-Based Monitoring of Sediment Flux Following Removal of Oregon's Marmot Dam

    NASA Astrophysics Data System (ADS)

    Major, J. J.; O'Connor, J. E.; Spicer, K. R.; Bragg, H. M.; Wallick, J. R.; Kittleson, R. L.; Lee, K. K.; Cushman, D.; Piatt, D.; Tanner, D. Q.; Hale, T.; Uhrich, M. A.; Rhode, A.

    2008-12-01

    Breaching of Oregon's Marmot Dam in October 2007 allowed the 80-km-long Sandy River to flow freely from Mount Hood to the Columbia River for the first time in nearly 100 years. When breached, the dam was brimful with sediment. As part of an analysis examining the redistribution of ~730,000 m3 of stored sediment following the dam removal, we measured suspended-sediment load and bedload at sites 10 km upstream and 0.5 to 18 km downstream of the dam before, during and after breaching, and during five subsequent high-water events. Prior to breaching of the dam, suspended-sediment and bedload mass fluxes along the Sandy River both upstream and downstream of the dam were of the order of a few to a few tens of kg/s. Suspended sediment upstream was composed chiefly of sand, in contrast to the mostly silt and clay passing measurement sites 0.5 and 18 km below the dam. In all reaches bedload consisted chiefly (>90%) of sand. Breaching of the dam released a pulse of turbid water having an instantaneous suspended-sediment flux of 5200 kg/s. The initial sediment pulse consisted predominantly of silt and clay, presumably eroded from thin, fine-grained topset beds at the downstream end of the reservoir. However, the suspended load coarsened rapidly as the Sandy River incised into the stored sand and gravel that filled the former reservoir. Following the initial peak value, median fluxes of sandy suspended sediment 0.5 km below the dam site hovered around several tens to hundreds of kg/s for at least 24 hours, whereas the median suspended-sediment flux remained about 30 kg/s both 10 km upstream and 18 km downstream. Bedload transport also increased following breaching, but its response was slower than that of suspended sediment. Bedload flux 0.5 km below the dam site increased from ~1 kg/s before breaching to 60 kg/s by 6 hours and to about 70 kg/s by 18 hours after breaching, in contrast to the steady, low (<10 kg/s) flux of sandy bedload passing upstream and farther downstream before and after breaching. Initially, the near-field bedload consisted predominantly of sand transported in large dunes. Significant gravel transport did not begin until 18 to 20 hours after breaching, in conjunction with rapid bed aggradation and downstream propagation of mid-channel gravel bars. This enhanced sediment transport occurred under a median flow just 30% greater than the river's mean annual flow at Marmot Dam. Within 3 months of breaching, the near-field high-flow-driven bedload flux remained significantly elevated above both upstream and downstream fluxes, but the suspended-sediment flux had declined substantially. Near-field bedload flux was persistently 10 to 100 times greater than that upstream and farther downstream, and remained gravel-rich compared to the sandy bedload passing stations upstream and 18 km distant. In contrast, near-field suspended-sediment concentrations declined approximately logarithmically, and by January 2008 the associated sandy suspended-sediment flux was comparable in both composition and magnitude to the suspended-sediment flux 18 km distant. The newly energetic Sandy River thus rapidly flushed sandy suspended sediment downstream. Gravel-rich bedload continues to disperse downstream, but has yet to reach distal reaches of the river system. The majority of gravel transported thus far is stored chiefly along the 2-km-long channel reach below the dam site and within the Sandy River gorge 2-8 km downstream from the dam site.

  14. Folk Theorems on the Correspondence between State-Based and Event-Based Systems

    NASA Astrophysics Data System (ADS)

    Reniers, Michel A.; Willemse, Tim A. C.

    Kripke Structures and Labelled Transition Systems are the two most prominent semantic models used in concurrency theory. Both models are commonly believed to be equi-expressive. One can find many ad-hoc embeddings of one of these models into the other. We build upon the seminal work of De Nicola and Vaandrager that firmly established the correspondence between stuttering equivalence in Kripke Structures and divergence-sensitive branching bisimulation in Labelled Transition Systems. We show that their embeddings can also be used for a range of other equivalences of interest, such as strong bisimilarity, simulation equivalence, and trace equivalence. Furthermore, we extend the results by De Nicola and Vaandrager by showing that there are additional translations that allow one to use minimisation techniques in one semantic domain to obtain minimal representatives in the other semantic domain for these equivalences.
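
    The paper's exact translations are not reproduced in this record; the sketch below illustrates, under that caveat, one common embedding of a Kripke Structure into a Labelled Transition System, in which each transition is labelled with the atomic propositions of its target state. The function and example data are ours, not the authors'.

    ```python
    from typing import Dict, FrozenSet, List, Set, Tuple

    # Hypothetical sketch (not the paper's exact construction): embed a
    # Kripke Structure into an LTS by labelling each transition s -> s'
    # with the set of atomic propositions that hold in the target state s'.

    def kripke_to_lts(transitions: Set[Tuple[str, str]],
                      labelling: Dict[str, Set[str]]
                      ) -> List[Tuple[str, FrozenSet[str], str]]:
        """Return LTS transitions (source, action, target); the 'action'
        is the label set of the target state."""
        return [(s, frozenset(labelling[t]), t) for (s, t) in sorted(transitions)]

    # A two-state Kripke Structure: s0 -> s1, s1 -> s0, s1 -> s1.
    transitions = {("s0", "s1"), ("s1", "s0"), ("s1", "s1")}
    labelling = {"s0": {"p"}, "s1": {"q"}}
    for edge in kripke_to_lts(transitions, labelling):
        print(edge)
    ```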

  15. Team-Teaching a Current Events-Based Biology Course for Nonmajors

    ERIC Educational Resources Information Center

    Bondos, Sarah E.; Phillips, Dereth

    2008-01-01

    Rice University has created a team-taught interactive biology course for nonmajors with a focus on cutting-edge biology in the news--advances in biotechnology, medicine, and science policy, along with the biological principles and methodology upon which these advances are based. The challenges inherent to teaching current topics were minimized by…

  16. A Mobile Robots Experimental Environment with Event-Based Wireless Communication

    PubMed Central

    Guinaldo, María; Fábregas, Ernesto; Farias, Gonzalo; Dormido-Canto, Sebastián; Chaos, Dictino; Sánchez, José; Dormido, Sebastián

    2013-01-01

    An experimental platform has been developed in which a set of mobile robots communicates over a wireless network. The mobile robots obtain their positions from a camera that acts as the sensor. The video images are processed in a PC, and a Waspmote card sends the corresponding position to each robot using the ZigBee standard. A distributed control algorithm based on event-triggered communications has been designed and implemented to bring the robots into the desired formation. Each robot communicates with its neighbors only at event times. Furthermore, a simulation tool has been developed to design and perform experiments with the system. An example of usage is presented. PMID:23881139
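
    The paper's trigger condition is not given in this record; the following is a minimal sketch of a typical event-triggered rule of this kind, in which a robot broadcasts its camera-derived position only when it has drifted more than a threshold from the last broadcast value. The class name and threshold are illustrative.

    ```python
    import math

    # Hypothetical sketch of an event-triggered broadcast rule: transmit
    # the position to neighbours only when it deviates more than `delta`
    # from the last transmitted value, saving wireless bandwidth.

    class EventTriggeredRobot:
        def __init__(self, x: float, y: float, delta: float = 0.05):
            self.last_sent = (x, y)        # last broadcast position
            self.delta = delta             # trigger threshold [m]

        def update(self, x: float, y: float):
            """Called when the vision system reports a new position.
            Returns the position to broadcast, or None (no event)."""
            err = math.hypot(x - self.last_sent[0], y - self.last_sent[1])
            if err > self.delta:           # event condition met -> transmit
                self.last_sent = (x, y)
                return (x, y)
            return None                    # stay silent between events

    robot = EventTriggeredRobot(0.0, 0.0)
    for step, pos in enumerate([(0.01, 0.0), (0.04, 0.03), (0.08, 0.06)]):
        print(step, "broadcast" if robot.update(*pos) else "silent")
    ```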

  17. Co-design of H∞ jump observers for event-based measurements over networks

    NASA Astrophysics Data System (ADS)

    Peñarrocha, Ignacio; Dolz, Daniel; Romero, Julio Ariel; Sanchis, Roberto

    2016-01-01

    This work presents a strategy to minimise the network usage and the energy consumption of wireless battery-powered sensors in the observer problem over networks. The sensor nodes implement a periodic send-on-delta approach, sending a new measurement only when it deviates considerably from the previously sent one. The estimator node implements a jump observer whose gains are computed offline and depend on the combination of available new measurements. We bound the estimator performance as a function of the sending policies and then state the design of the observer under fixed sending thresholds as a semidefinite programming problem. We address this problem first in a deterministic way and then, to reduce conservativeness, in a stochastic way, obtaining bounds on the probabilities of receiving new measurements and solving a robust optimisation problem over the possible probabilities using a sum-of-squares decomposition. We relate the network usage to the sending thresholds and propose an iterative procedure for designing those thresholds, minimising the network usage while guaranteeing a prescribed estimation performance. Simulation results and experimental analysis show the validity of the proposal and the reduction in network resources that can be achieved with the stochastic approach.
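
    A minimal sketch of the periodic send-on-delta policy described above (the threshold value and function names are illustrative, not the paper's): the sensor is sampled every period, but a sample is transmitted only if it deviates from the last sent sample by more than the threshold.

    ```python
    # Sketch of periodic send-on-delta sampling at the sensor node.
    def send_on_delta(samples, delta):
        """Yield (k, y_k) for the samples that would actually be sent."""
        last_sent = None
        for k, y in enumerate(samples):
            if last_sent is None or abs(y - last_sent) > delta:
                last_sent = y
                yield k, y

    measurements = [1.00, 1.01, 1.02, 1.30, 1.31, 1.05]
    for k, y in send_on_delta(measurements, delta=0.1):
        print(f"sample {k} sent: {y}")
    # Only samples 0, 3 and 5 are transmitted; the jump observer at the
    # estimator must then cope with the resulting measurement pattern.
    ```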

  18. An events based algorithm for distributing concurrent tasks on multi-core architectures

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Williams, John R.; Tilke, Peter

    2010-02-01

    In this paper, a programming model is presented which enables scalable parallel performance on multi-core shared memory architectures. The model has been developed for application to a wide range of numerical simulation problems. Such problems involve time-stepping or iteration algorithms where synchronization of multiple threads of execution is required. It is shown that traditional approaches to parallelism, including message passing and scatter-gather, can be improved upon in terms of speed-up and memory management. Using spatial decomposition to create orthogonal computational tasks, a new task management algorithm called H-Dispatch is developed. This algorithm makes efficient use of memory resources by limiting the need for garbage collection and takes optimal advantage of multiple cores by employing a "hungry" pull strategy. The technique is demonstrated on a simple finite difference solver and results are compared to traditional MPI and scatter-gather approaches. The H-Dispatch approach achieves near-linear speed-up, with an efficiency of 85% on a 24-core machine. It is noted that the H-Dispatch algorithm is quite general and can be applied to a wide class of computational tasks on heterogeneous architectures involving multi-core and GPGPU hardware.
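
    H-Dispatch itself is not reproduced here; the sketch below only illustrates the general "hungry" pull idea under stated assumptions: idle workers pull the next spatial task from a shared queue as soon as they finish, so faster cores naturally claim more work, with no central scatter step.

    ```python
    import queue
    import threading

    # Illustrative "hungry" pull strategy (not the authors' H-Dispatch code).
    def worker(tasks: queue.Queue, results: list, lock: threading.Lock):
        while True:
            try:
                cell = tasks.get_nowait()   # pull when hungry
            except queue.Empty:
                return
            value = sum(i * i for i in range(cell * 1000))  # stand-in for a stencil update
            with lock:
                results.append((cell, value))

    tasks = queue.Queue()
    for cell in range(64):                  # 64 orthogonal spatial sub-domains
        tasks.put(cell)

    results, lock = [], threading.Lock()
    threads = [threading.Thread(target=worker, args=(tasks, results, lock))
               for _ in range(8)]           # e.g. 8 cores
    for t in threads: t.start()
    for t in threads: t.join()
    print(len(results), "cells processed")
    ```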

  19. Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform

    PubMed Central

    Giulioni, Massimiliano; Lagorce, Xavier; Galluppi, Francesco; Benosman, Ryad B.

    2016-01-01

    Estimating the speed and direction of moving objects is a crucial component of agents behaving in a dynamic world. Biological organisms perform this task by means of the neural connections originating from their retinal ganglion cells. In artificial systems the optic flow is usually extracted by comparing the activity of two or more frames captured with a vision sensor. Designing artificial motion flow detectors that are as fast, robust, and efficient as the ones found in biological systems is, however, a challenging task. Inspired by the architecture proposed by Barlow and Levick in 1965 to explain the spiking activity of the direction-selective ganglion cells in the rabbit's retina, we introduce an architecture for robust optical flow extraction with an analog neuromorphic multi-chip system. The task is performed by a feed-forward network of analog integrate-and-fire neurons whose inputs are provided by contrast-sensitive photoreceptors. Computation is supported by the precise time of spike emission, and the extraction of the optical flow is based on the time lag in the activation of nearby retinal neurons. Mimicking ganglion cells, our neuromorphic detectors encode the amplitude and the direction of the apparent visual motion in their output spiking pattern. Here we describe the architectural aspects, discuss the system's latency, scalability, and robustness properties, and demonstrate that a network of mismatched delicate analog elements can reliably extract the optical flow from a simple visual scene. This work shows how the precise time of spike emission used as a computational basis, biological inspiration, and neuromorphic systems can be combined for solving specific tasks. PMID:26909015
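
    As an illustration of the timing principle only (not the analog hardware), the sketch below recovers speed and direction from the lag between the spike times of two adjacent photoreceptors, v = dx/dt; all names and numbers are ours.

    ```python
    # Toy sketch of Barlow-Levick-style timing: an edge crossing two
    # photoreceptors a distance dx apart produces spikes separated by dt;
    # speed is |dx/dt| and the sign of dt gives the direction.
    def motion_from_spikes(t_left: float, t_right: float, dx: float = 1.0):
        """Return (speed, direction) from spike times of adjacent pixels."""
        dt = t_right - t_left
        if dt == 0:
            return None                     # no measurable lag
        direction = "rightward" if dt > 0 else "leftward"
        return abs(dx / dt), direction

    # Edge hits the left pixel at t = 10.0 ms, the right pixel at 12.5 ms:
    print(motion_from_spikes(10.0e-3, 12.5e-3, dx=1.0))  # ~400 px/s, rightward
    ```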

  20. Event-Based Modeling of Driver Yielding Behavior at Unsignalized Crosswalks

    PubMed Central

    Schroeder, Bastian J.; Rouphail, Nagui M.

    2011-01-01

    This research explores factors associated with driver yielding behavior at unsignalized pedestrian crossings and develops predictive models for yielding using logistic regression. It considers the effect of variables describing driver attributes, pedestrian characteristics and concurrent conditions at the crosswalk on the yield response. Special consideration is given to ‘vehicle dynamics constraints’ that form a threshold for the potential to yield. Similarities are identified to driver reaction in response to the ‘amber’ indication at a signalized intersection. The logit models were developed from data collected at two unsignalized mid-block crosswalks in North Carolina. The data include ‘before’ and ‘after’ observations of two pedestrian safety treatments, an in-street pedestrian crossing sign and pedestrian-actuated in-roadway warning lights. The analysis suggests that drivers are more likely to yield to assertive pedestrians who walk briskly in their approach to the crosswalk. In turn, the yield probability is reduced with higher speeds, deceleration rates and if vehicles are traveling in platoons. The treatment effects proved to be significant and increased the propensity of drivers to yield, but their effectiveness may be dependent on whether the pedestrian activates the treatment. The results of this research provide new insights on the complex interaction of pedestrians and vehicles at unsignalized intersections and have implications for future work towards predictive models for driver yielding behavior. The developed logit models can provide the basis for representing driver yielding behavior in a microsimulation modeling environment. PMID:21852892
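
    The paper's fitted coefficients are not reproduced in this record; the sketch below shows, with hypothetical data and features, how a yield logit model of this kind can be estimated and queried.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Illustrative logit model of driver yielding; the features and data
    # are hypothetical, not the paper's. Columns: approach speed (mph),
    # pedestrian assertive (0/1), vehicle in platoon (0/1).
    X = np.array([[25, 1, 0], [35, 0, 1], [20, 1, 0], [40, 0, 1],
                  [30, 1, 1], [22, 1, 0], [38, 0, 0], [28, 0, 1]])
    y = np.array([1, 0, 1, 0, 0, 1, 0, 0])   # 1 = driver yielded

    model = LogisticRegression().fit(X, y)

    # P(yield) for a 30 mph driver meeting an assertive pedestrian, no platoon:
    print(model.predict_proba([[30, 1, 0]])[0, 1])
    ```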

  1. Event based analysis of Chlorothalonil concentrations following application to managed turf

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Chlorothalonil concentrations exceeding acute toxicity levels for certain organisms have been measured in surface water discharge events from managed turf watersheds. However, the duration of exceedance and the timing of those events with respect to precipitation/runoff and time since application ha...

  2. Bonsai: an event-based framework for processing and controlling data streams.

    PubMed

    Lopes, Gonçalo; Bonacchi, Niccolò; Frazão, João; Neto, Joana P; Atallah, Bassam V; Soares, Sofia; Moreira, Luís; Matias, Sara; Itskov, Pavel M; Correia, Patrícia A; Medina, Roberto E; Calcaterra, Lorenza; Dreosti, Elena; Paton, Joseph J; Kampff, Adam R

    2015-01-01

    The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation. PMID:25904861
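
    Bonsai itself is a visual language, so the sketch below is not Bonsai code; it only illustrates, in plain Python generators, the dataflow idea of chaining stream-processing nodes from a source to a closed-loop sink. All names are ours.

    ```python
    # Toy dataflow pipeline in the spirit of a reactive stream framework:
    # each stage consumes the stream of events produced by the previous one.

    def camera_frames():                    # stand-in source stream
        for i in range(5):
            yield {"frame": i, "brightness": 10 * i}

    def threshold(stream, level):           # stand-in processing node
        for event in stream:
            if event["brightness"] > level:
                yield event

    def trigger_stimulus(stream):           # stand-in closed-loop sink
        for event in stream:
            print("stimulate on frame", event["frame"])

    trigger_stimulus(threshold(camera_frames(), level=25))
    ```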

  4. Iterative and Event-Based Frameworks for University and School District Technology Professional Development Partnerships

    ERIC Educational Resources Information Center

    Winslow, Joseph; Dickerson, Jeremy; Weaver, Carmen; Josey, Fair

    2016-01-01

    Forming technology partnerships between universities and public schools in an era of competition and economic difficulty is a challenge. However, when these partnerships are formed and sustained, the benefits for both are extremely valuable. For a university instructional technology graduate program and school partnership to be successful, the…

  5. Prediction of consequences of meteor events based on atmospheric trajectory analysis

    NASA Astrophysics Data System (ADS)

    Kuznetsova, D.; Gritsevich, M.; Christou, A.

    2014-07-01

    In this study, we develop a model that describes how meteoroids enter the atmosphere of a planet, and categorize the different consequences of collisions of cosmic bodies with the atmosphere and the surface of a planet. We focus on two types of possible outcomes: (1) meteorite fall, when a fragment of the meteoroid can be found on the surface, and (2) full ablation, when the meteoroid does not reach the ground. The model is based on the analytical solution of the classical equations of meteor-body deceleration [1,2]. The dimensionless solution for the mass-velocity dependence and the height-velocity dependence can be expressed using two main dimensionless parameters: the ballistic coefficient, which gives the ratio between the mass of the atmospheric column along the trajectory and the body's pre-entry mass, and the mass-loss parameter, which is proportional to the ratio between the initial kinetic energy of the body and the energy required to ensure total mass loss of the body through ablation and fragmentation. Thus, every meteoroid case is described by a pair of these parameters. To distinguish the two possible impact consequences (meteorite fall or full ablation) we use the meteorite-fall condition: the terminal mass of the meteoroid exceeds or equals a certain chosen value. This condition can be written using the parameters introduced above. We thus obtain a boundary curve in the parameter plane and associate different events with the location of a point relative to this curve. This theory is applied to the classification of collisions of cosmic bodies with the Earth's atmosphere and surface. Observational data are used to calculate the values of the parameters used in the current study; these values are plotted in the parameter plane and their locations compared against the boundary curve in each case. The obtained results show good agreement with the known consequences for the observed fireballs, including ones registered by the Canadian, Prairie, and European Fireball Networks [3,4]. As an extension of this theory, we model meteoroid entry into the Martian atmosphere using the introduced parameters. A number of investigations by different authors show increasing interest in this subject, e.g. [5-9]. To apply our theory, we take two meteoroid types as examples: a chondrite with an entry velocity of 10 km/s, and an iron meteoroid with an entry velocity of 15 km/s. For each type, we take several pre-entry mass values and show the impact consequences by constructing the boundary curve in the parameter plane and the point corresponding to the meteoroid. These results are also compared with meteoroid entries into the terrestrial atmosphere with the same pre-entry characteristics. It is shown that, for some pre-entry mass range, a meteoroid would be fully ablated in the case of Earth, but a fraction of it would reach the surface in the case of Mars.
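
    As a hedged illustration of the analytic solution referenced above: for the special case mu = 0 (no shape change), commonly quoted forms of the mass-velocity and height-velocity relations can be evaluated as below; treat the exact expressions as an assumption to be checked against [1,2], and the parameter values as ours.

    ```python
    import numpy as np
    from scipy.special import expi

    # Assumed forms (mu = 0, exponential atmosphere): alpha is the
    # ballistic coefficient, beta the mass-loss parameter; m, v, y are
    # mass, velocity, and height normalised by their entry/scale values.

    def normalized_mass(v, beta):
        """m = M/M_e as a function of normalized velocity v = V/V_e."""
        return np.exp(-beta * (1.0 - v**2))

    def normalized_height(v, alpha, beta):
        """y = h/h_0 along the trajectory (via the exponential integral)."""
        return np.log(2.0 * alpha) + beta - np.log(expi(beta) - expi(beta * v**2))

    alpha, beta = 10.0, 1.5                 # illustrative parameter pair
    v = np.linspace(0.2, 0.99, 5)
    print(normalized_mass(v, beta))
    print(normalized_height(v, alpha, beta))

    # Meteorite-fall test: terminal mass above a chosen fraction of M_e
    # (here 0.001) places (alpha, beta) on the "fall" side of the boundary.
    print(normalized_mass(0.1, beta) >= 1e-3)
    ```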

  6. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network

    PubMed Central

    Hao, Xiaoqing; An, Haizhong; Zhang, Lijia; Li, Huajiao; Wei, Guannan

    2015-01-01

    To study the sentiment diffusion of online public opinions about hot events, we collected people’s posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiments: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed sentiment mode complex network of online public opinions (SMCOP) with modes as nodes and the conversion relation in chronological order between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys power law. Most posts’ sentiments are weakly positive and neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups with ppppp and ooooo as the core mode, respectively. Few modes have larger betweenness centrality values and most modes convert to each other with these higher betweenness centrality modes as mediums. Therefore, the relevant person or institutes can take measures to lead people’s sentiments regarding online hot events according to the sentiment diffusion mechanism. PMID:26462230
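
    A minimal sketch of the SMCOP construction described above; the window length of 5 is inferred from core modes such as ppppp, and the data are illustrative.

    ```python
    import networkx as nx

    # Coarse-grain a chronological sequence of post sentiments (P, p, o,
    # n, N) into 5-symbol modes; consecutive modes become directed edges.
    labels = list("pppppoooppnppppooooo")   # toy sentiment sequence
    window = 5
    modes = ["".join(labels[i:i + window]) for i in range(len(labels) - window + 1)]

    G = nx.DiGraph()
    for a, b in zip(modes, modes[1:]):
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

    # Node strength (weighted degree) and betweenness, as analysed above:
    strength = dict(G.degree(weight="weight"))
    bc = nx.betweenness_centrality(G)
    print(max(strength, key=strength.get), max(bc, key=bc.get))
    ```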

  7. Motion Entropy Feature and Its Applications to Event-Based Segmentation of Sports Video

    NASA Astrophysics Data System (ADS)

    Chen, Chen-Yu; Wang, Jia-Ching; Wang, Jhing-Fa; Hu, Yu-Hen

    2008-12-01

    An entropy-based criterion is proposed to characterize the pattern and intensity of object motion in a video sequence as a function of time. By applying a homoscedastic error model-based time series change point detection algorithm to this motion entropy curve, one is able to segment the corresponding video sequence into individual sections, each consisting of a semantically relevant event. The proposed method is tested on six hours of sports videos including basketball, soccer, and tennis. Excellent experimental results are observed.
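
    The paper's estimator and change-point test are not reproduced here; the sketch below computes a per-frame Shannon entropy of motion-vector directions, the quantity whose change points mark event boundaries. The data are synthetic.

    ```python
    import numpy as np

    def motion_entropy(angles, bins=16):
        """Shannon entropy (bits) of a frame's motion-direction histogram."""
        hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
        p = hist / max(hist.sum(), 1)
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(0)
    coherent = rng.normal(0.0, 0.1, 200)          # players moving one way
    scramble = rng.uniform(-np.pi, np.pi, 200)    # chaotic motion
    print(motion_entropy(coherent), motion_entropy(scramble))  # low vs high
    ```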

  8. Probabilistic forecasting of extreme weather events based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Van De Vyver, Hans; Van Schaeybroeck, Bert

    2016-04-01

    Extreme events in weather and climate, such as high wind gusts, heavy precipitation or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X, Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance, wind speed), and study the combined events in which Y exceeds a high threshold y while the corresponding forecast X also exceeds a high forecast threshold. More specifically, two problems are addressed: (1) given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}; (2) given a probabilistic model for problem (1), what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X, Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). References: Coles, S. (2001) An Introduction to Statistical Modelling of Extreme Values. Springer-Verlag. Ferro, C. A. T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter 117, 8-13. Ramos, A. and Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B 71, 219-241.
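
    The Ramos-Ledford tail model is beyond a short sketch; as a baseline illustration of problem (1) only, the following estimates Pr(Y > y | X near x_0) empirically from synthetic forecast-observation pairs. All numbers are ours.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.gumbel(20.0, 4.0, 10_000)            # synthetic forecast gusts (m/s)
    Y = 0.8 * X + rng.gumbel(2.0, 3.0, 10_000)   # synthetic verifying obs

    def cond_exceedance(X, Y, x0, y, tol=1.0):
        """P(Y > y | |X - x0| < tol), estimated by counting."""
        near = np.abs(X - x0) < tol
        if near.sum() == 0:
            return float("nan")
        return float((Y[near] > y).mean())

    print(cond_exceedance(X, Y, x0=35.0, y=30.0))
    ```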

  9. Quantifying Future Changes in Extreme Precipitation Events Based on Resolved Synoptic Atmospheric Patterns

    NASA Astrophysics Data System (ADS)

    Gao, X.; Schlosser, C. A.; Monier, E.; Entekhabi, D.

    2012-12-01

    An important question for climate change science concerns possible shifts in the extremes of the regional water cycle, especially changes in the patterns, intensity, and/or frequency of extreme precipitation events. In this study, an analogue method is developed to help detect extreme precipitation events and their potential changes under future climate regimes without relying on the highly uncertain modeled precipitation. Our approach is based on the use of composite maps to identify the distinct synoptic and large-scale atmospheric conditions that lead to extreme precipitation events at local scales. The analysis of extreme daily precipitation events, exemplified in the south-central United States, is carried out using 62 years (1948-2010) of CPC gridded station data and NASA's Modern Era Retrospective-analysis for Research and Applications (MERRA). Various aspects of the daily extremes are examined, including their historical ranking, associated common circulation features at upper and lower levels of the atmosphere, and moisture plumes. The scheme is first evaluated on multiple climate model simulations of the 20th century from the Coupled Model Intercomparison Project Phase 5 (CMIP5) archive to determine whether the statistical nature of modeled precipitation events (i.e., the number of occurrences in each season) corresponds well to that of the observations. The approach will then be applied to the CMIP5 multi-model projections under various climate change scenarios (i.e., Representative Concentration Pathway (RCP) scenarios) for the next century to assess potential changes in the probability of extreme precipitation events. The results of this study should be of particular significance in helping society develop adaptive strategies and prevent catastrophic losses.
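
    A minimal sketch of the analogue step under stated assumptions (the field, threshold, and extreme-day indices are illustrative): form the composite circulation anomaly of observed extreme days, then flag days whose pattern correlation with the composite exceeds a chosen threshold.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    fields = rng.normal(size=(365, 20, 30))       # daily circulation anomalies
    extreme_days = [12, 40, 41, 200, 330]         # days with observed extremes

    composite = fields[extreme_days].mean(axis=0)

    def pattern_corr(a, b):
        """Centered pattern correlation between two 2-D anomaly maps."""
        a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    analogues = [d for d in range(365)
                 if pattern_corr(fields[d], composite) > 0.5]
    print(len(analogues), "candidate extreme-event days")
    ```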

  10. Event-based stormwater quality and quantity loadings from elevated urban infrastructure affected by transportation.

    PubMed

    Sansalone, John J; Hird, Jonathan P; Cartledge, Frank K; Tittlebaum, Marty E

    2005-01-01

    Urban rainfall runoff affected by transportation is a complex matrix of a very wide gradation of particulate matter (< 1 to > 10 000 microm) and dissolved inorganic and organic constituents. Particulate matter transported by rainfall runoff can be a significant vector for many reactive particulate-bound constituents, particularly metal elements. The water quality and hydrology of nine events from a representative elevated section of Interstate 10 (I-10) (eastbound average daily traffic load of 70 400 vehicles) in Baton Rouge, Louisiana, were characterized and compared with respect to the passage of each hydrograph. Residence time on the paved concrete surface was less than 30 minutes for all events. Results indicate that event-mean concentrations (EMCs) of particulate matter as total suspended solids (TSS) (138 to 561 mg/L) and chemical oxygen demand (COD) (128 to 1440 mg/L) were greater than those found in untreated municipal wastewater from the same service area. Particulate-matter dissolution and COD partitioned as a function of pH, pavement residence time, and organic content. In general, delivery of mass for aggregate indices, such as particulate matter (measured as TSS) and COD mass, was driven by the hydrology of the event, while concentrations of aggregate-constituent measurements, such as total dissolved solids (TDS), exhibited an exponential-type decline during the rising limb of the hydrograph. Despite the short residence times, wide solids gradation, partitioning, and complexity of the rainfall-runoff chemistry, conductivity and dissolved solids were strongly correlated. Characterization of the transport and loads of constituents in urban rainfall runoff, as a function of hydrology, is a necessary first step when considering treatability, structural or nonstructural controls, and mass trading for discharges from paved infrastructure. PMID:16121503
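
    A worked example of the event-mean concentration (EMC) statistic used above: EMC is the flow-weighted average concentration over the hydrograph, EMC = sum(C_i * Q_i * dt_i) / sum(Q_i * dt_i). The numbers below are illustrative, not the paper's data.

    ```python
    import numpy as np

    t = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)   # minutes
    Q = np.array([2, 8, 15, 12, 7, 4, 2], dtype=float)      # runoff (L/s)
    C = np.array([900, 600, 400, 300, 250, 220, 200.0])     # TSS (mg/L)

    dt = np.gradient(t) * 60.0                # seconds per sample
    emc = (C * Q * dt).sum() / (Q * dt).sum()
    print(f"EMC = {emc:.0f} mg/L")            # ~385 mg/L: a first-flush-weighted
                                              # average, within the TSS EMC range
                                              # reported above
    ```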