Science.gov

Sample records for boson mapping approach

  1. Composite boson mapping for lattice boson systems.

    PubMed

    Huerga, Daniel; Dukelsky, Jorge; Scuseria, Gustavo E

    2013-07-26

    We present a canonical mapping transforming physical boson operators into quadratic products of cluster composite bosons that preserves matrix elements of operators when a physical constraint is enforced. We map the 2D lattice Bose-Hubbard Hamiltonian into 2×2 composite bosons and solve it within a generalized Hartree-Bogoliubov approximation. The resulting Mott insulator-superfluid phase diagram reproduces quantum Monte Carlo results well. The Higgs boson behavior in the superfluid phase along the unit-density line is unraveled and is in remarkable agreement with experiments. Results for the properties of the ground and excited states are competitive with other state-of-the-art approaches, but at a fraction of their computational cost. The composite boson mapping introduced here can be readily applied to frustrated many-body systems, where most methodologies face significant hurdles. PMID:23931383
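
    The Bose-Hubbard model this record targets is standard enough to sketch directly. Below is a minimal exact-diagonalization of H = -t Σ (b_i† b_j + h.c.) + (U/2) Σ n_i(n_i - 1) on a tiny cluster — this is not the composite-boson method of the paper, only a baseline sketch of the model it approximates; the site count, filling, and parameter values are illustrative.

```python
import numpy as np

def fock_basis(L, N):
    """All occupation-number tuples of L sites holding N bosons in total."""
    if L == 1:
        return [(N,)]
    states = []
    for n in range(N + 1):
        for rest in fock_basis(L - 1, N - n):
            states.append((n,) + rest)
    return states

def bose_hubbard(L, N, t, U, bonds):
    """Dense Bose-Hubbard Hamiltonian in the fixed-N Fock basis."""
    basis = fock_basis(L, N)
    index = {s: i for i, s in enumerate(basis)}
    H = np.zeros((len(basis), len(basis)))
    for s in basis:
        i = index[s]
        # on-site interaction (U/2) * n * (n - 1)
        H[i, i] += 0.5 * U * sum(n * (n - 1) for n in s)
        # hopping -t (b_a† b_b + h.c.): move one boson across each bond
        for (a, b) in bonds:
            for (src, dst) in ((b, a), (a, b)):
                if s[src] > 0:
                    amp = np.sqrt(s[src] * (s[dst] + 1))
                    ns = list(s); ns[src] -= 1; ns[dst] += 1
                    H[index[tuple(ns)], i] += -t * amp
    return H

# two sites, two bosons, U = 0: the ground-state energy is exactly -2t
H = bose_hubbard(L=2, N=2, t=1.0, U=0.0, bonds=[(0, 1)])
E0 = np.linalg.eigvalsh(H)[0]
```

    The non-interacting two-site case gives E0 = -2t, a quick sanity check on the matrix construction before turning on U.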

  2. Dyson boson mapping of effective bi-fermion Hamiltonians

    SciTech Connect

    Civitarese, O.; Geyer, H.B.; Reboiro, M.

    2006-03-15

    Implementation of the Dyson boson mapping is discussed in connection with effective Hamiltonians. A feature of the mapping technique, when implemented in an ideal boson basis, is the possible appearance of spurious states. These spurious states typically signal the overcompleteness of the basis. Without truncation, no contamination of the physical states and spectrum takes place. However, in practice one may be required to select from the ideal boson basis the dominant components for a given interaction. It is shown that the correspondence between a perturbative expansion, à la Bloch-Horowitz, and the Dyson boson mapping allows for the identification of spurious states. The proposed method is applied to the mapping of a bi-fermionic Hamiltonian.

  3. Schwinger boson approach to the fully screened Kondo model.

    PubMed

    Rech, J; Coleman, P; Zarand, G; Parcollet, O

    2006-01-13

    We apply the Schwinger boson scheme to the fully screened Kondo model and generalize the method to include antiferromagnetic interactions between ions. Our approach captures the Kondo crossover from local moment behavior to a Fermi liquid with a nontrivial Wilson ratio. When applied to the two-impurity model, the mean-field theory describes the "Varma-Jones" quantum phase transition between a valence bond state and a heavy Fermi liquid.

  4. Surface Kondo Impurities in the Slave-Boson Approach

    NASA Astrophysics Data System (ADS)

    Anda, Enrique; Vernek, Edson

    2005-03-01

    Transport properties of magnetic impurities on surfaces have captured a great deal of attention lately. Atom manipulation and topographic imaging techniques using the scanning tunneling microscope have confirmed some theoretical predictions on Kondo physics and at the same time revealed other interesting behavior in these systems. For example, experiments have reported unexpectedly high Kondo temperatures for multi-impurity and molecular structures on metallic surfaces. Motivated by these experimental results, we apply slave-boson techniques for finite Coulomb interaction (finite U) to study the transport properties of magnetic impurities on a metallic surface in the Kondo regime. We report here on our studies of the role of fluctuations of the slave-boson number for the case of one impurity on metallic surfaces. We compare our results to other theoretical approaches and to experimental results. Supported by CAPES-Brazil and NSF-IMC and NSF-NIRT.

  5. Map Projections: Approaches and Themes

    ERIC Educational Resources Information Center

    Steward, H. J.

    1970-01-01

    Map projections take on new meaning with location systems needed for satellites, other planets and space. A classroom approach deals first with the relationship between the earth and the globe, then with transformations to flat maps. Problems of preserving geometric qualities (distance, angles, directions) are dealt with in some detail, as are…

  6. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields yields, for large coupling constants, an effective Hamiltonian that separates into one part describing a scalar field and another describing a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero, with slight admixtures of color-zero, spin-two pairs. SU(2) was used as the color group.

  7. Mean-field plus various types of pairing models and an exact boson mapping of the standard pairing model

    SciTech Connect

    Pan Feng; Wang Yin; Guan Xin; Jia Lu; Chen Xiangrong; Draayer, J. P.

    2011-06-28

    Exact solutions of Nilsson mean-field with various pairing interactions are reviewed. Some even-odd mass differences and moments of inertia of low-lying states for rare earth and actinide nuclei are calculated for the nearest-orbit pairing approximation as well as for the extended pairing model and compared to available experimental data. An exact boson mapping of the standard pairing Hamiltonian is also reported. Under the mapping, fermion pair operators are mapped exactly onto corresponding bosons. The image of the mapping is a Bose-Hubbard model with orbit-dependent hopping.

  8. Real-time dynamics in a strongly interacting bosonic hopping model: global quenches and mapping to the XX chain

    NASA Astrophysics Data System (ADS)

    Pozsgay, Balázs; Eisler, Viktor

    2016-05-01

    We study the time evolution of an integrable many-particle system, described by the $q$-boson Hamiltonian in the limit of strong interactions $q\to\infty$. It is shown that, for a particular class of pure initial states, the analytical calculation of certain observables simplifies considerably. Namely, we provide exact formulas for the calculation of the Loschmidt echo and the emptiness formation probability, where the computational time scales polynomially with the particle number. Moreover, we construct a non-local mapping of the $q$-boson model to the XX spin chain, and show how this can be utilized to obtain the time evolution of various local bosonic observables for translationally invariant initial states. The results obtained via the bosonic and fermionic pictures show perfect agreement. In the infinite-volume and large-time limits, we rigorously verify the prediction of the Generalized Gibbs Ensemble for homogeneous initial Fock states.

  9. X-slave boson approach to the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Franco, R.; Figueira, M. S.; Foglio, M. E.

    2001-05-01

    The periodic Anderson model (PAM) in the limit U=∞ can be studied by employing the Hubbard X operators to project out the unwanted states. In a previous work, we studied the cumulant expansion of this Hamiltonian employing the hybridization as a perturbation, but probability conservation of the local states (completeness) is not usually satisfied when partial expansions like the "chain approximation" (CHA) are employed. To address this problem, we use a technique similar to the one employed by Coleman to treat the same problem with slave bosons in the mean-field approximation. Assuming a particular renormalization for the hybridization, we obtain a description that avoids an unwanted phase transition that appears in the mean-field slave-boson method at intermediate temperatures.

  10. X-boson cumulant approach to the periodic Anderson model

    NASA Astrophysics Data System (ADS)

    Franco, R.; Figueira, M. S.; Foglio, M. E.

    2002-07-01

    The periodic Anderson model can be studied in the limit U=∞ by employing the Hubbard X operators to project out the unwanted states. We had already studied this problem by employing the cumulant expansion with the hybridization as perturbation, but the probability conservation of the local states (completeness) is not usually satisfied when partial expansions like the ``chain approximation'' (CHA) are employed. To rectify this situation, we modify the CHA by employing a procedure that was used in the mean-field approximation of Coleman's slave-boson method. Our technique reproduces the features of that method in its region of validity, but avoids the unwanted phase transition that appears in the same method both when μ>>Ef at low T and for all values of the parameters at intermediate temperatures. Our method also has a dynamic character that is absent from the mean-field slave-boson method.

  11. Nonunitary and unitary approach to Eigenvalue problem of Boson operators and squeezed coherent states

    NASA Technical Reports Server (NTRS)

    Wunsche, A.

    1993-01-01

    The eigenvalue problem of the operator a + ζa†, where a† is the boson creation operator, is solved for arbitrary complex ζ by applying a nonunitary operator to the vacuum state. This nonunitary approach is compared with the unitary approach, which leads, for |ζ| < 1, to squeezed coherent states.
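
    The construction can be checked numerically in a truncated Fock space: applying the nonunitary operator exp(-ζa†²/2) to the vacuum produces a state annihilated by a + ζa†, i.e. the eigenstate with eigenvalue zero. A minimal sketch (the truncation size N and the value ζ = 0.3 are illustrative choices; the residual is limited only by the truncation tail):

```python
import numpy as np

N = 60        # Fock-space truncation (illustrative)
zeta = 0.3    # |zeta| < 1, as required for a normalizable state

# truncated annihilation and creation operators: a|n> = sqrt(n)|n-1>
a = np.diag(np.sqrt(np.arange(1, N)), k=1)
adag = a.T

# build exp(-zeta * adag^2 / 2)|0> term by term:
# sum_k (-zeta/2)^k / k! * (adag^2)^k |0>
vac = np.zeros(N); vac[0] = 1.0
psi = np.zeros(N)
term = vac.copy()
for k in range(N // 2):
    psi += term
    term = (-zeta / 2) / (k + 1) * (adag @ adag @ term)
psi /= np.linalg.norm(psi)

# (a + zeta * adag) annihilates this state, up to truncation error
residual = np.linalg.norm((a + zeta * adag) @ psi)
```

    The residual is at machine-precision level here, confirming that the nonunitary construction does solve the eigenvalue problem for this ζ.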

  12. Coherent state approach to the interacting boson model: Test of its validity in the transitional region

    SciTech Connect

    Inci, I.; Alonso, C. E.; Arias, J. M.; Fortunato, L.; Vitturi, A.

    2009-09-15

    The predictive power of the coherent state (CS) approach to the interacting boson model (IBM) is tested far from the IBM dynamical symmetry limits. The transitional region along the γ-unstable path from U(5) to O(6) is considered. Excitation energy of the excited β band and intraband and interband transitions obtained within the CS approach are compared with the exact results as a function of the boson number N. We find that the CS formalism provides approximations to the exact results that are correct up to order 1/N in the transitional region, except in a narrow region close to the critical point.

  13. Boson dominance in nuclei

    SciTech Connect

    Palumbo, Fabrizio

    2005-07-01

    We present a new method of bosonization of fermion systems applicable when the partition function is dominated by composite bosons. By restricting the partition function to such states, we obtain a Euclidean bosonic action from which we derive the Hamiltonian. Such a procedure respects all the fermion symmetries, particularly the fermion number conservation, and provides a boson mapping of all fermion operators.

  14. On the bosonic end perturbative approaches to the study of anyons

    NASA Astrophysics Data System (ADS)

    Amelino-Camelia, G.

    1992-07-01

    I examine, in the case of two anyons in a common harmonic well, the validity of the perturbative approaches to the study of ``quasi-bosonic'' anyons which are discussed in the literature. Supported in part by funds provided by the ``Fondazioni Angelo Della Riccia'', Florence, Italy.

  15. Usage-Oriented Topic Maps Building Approach

    NASA Astrophysics Data System (ADS)

    Ellouze, Nebrasse; Lammari, Nadira; Métais, Elisabeth; Ben Ahmed, Mohamed

    In this paper, we present a collaborative and incremental construction approach for multilingual Topic Maps based on enrichment and merging techniques. In recent years, several Topic Map building approaches have been proposed, each with different characteristics. Generally, they are dedicated to particular data types such as text, semi-structured data, relational data, etc. We note also that most of these approaches take monolingual documents as input to build the Topic Map. The problem is that the large majority of resources available today are written in various languages, and these resources could be relevant even to non-native speakers. Thus, our work is driven towards a collaborative and incremental method for Topic Map construction from textual documents available in different languages. To enrich the Topic Map, we take a domain thesaurus as input, and we also propose to explore Topic Map usage, i.e., the available potential questions related to the source documents.

  16. The New Approach for Earthquake Hazard Mapping

    NASA Astrophysics Data System (ADS)

    Handayani, B.; Karnawati, D.; Anderson, R.

    2008-05-01

    It is a fact that hazard maps, such as earthquake hazard maps, are not always effectively implemented in mitigation efforts. Hazard maps are technical maps that are not always easy for the community living in vulnerable areas to understand and follow. Therefore, some effort must be made to guarantee the effectiveness of a hazard map. This paper discusses an approach and method for developing a more appropriate earthquake hazard map for Bantul Regency, Yogyakarta, Indonesia. Psychological mapping to identify the levels and distribution of community trauma is proposed as an early reference for earthquake hazard mapping. By referring to this trauma zonation and combining it with seismicity and geological mapping, an earthquake hazard map can be established. Interestingly, this approach not only provides a more appropriate hazard map but also stimulates community empowerment in earthquake-vulnerable areas. Several trainings to improve community awareness were also conducted as part of the mapping process.

  17. Reprint of : Scattering theory approach to bosonization of non-equilibrium mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Sukhorukov, Eugene V.

    2016-08-01

    Among the many prominent contributions of Markus Büttiker to mesoscopic physics, the scattering theory approach to electron transport and noise stands out for its elegance, simplicity, universality, and popularity among theorists working in this field. It offers an efficient way to theoretically investigate open electron systems far from equilibrium. However, this method is limited to situations where interactions between electrons can be ignored, or considered perturbatively. Fortunately, this is the case in a broad class of metallic systems, which are commonly described by Fermi liquid theory. Yet there exists another broad class of electron systems of reduced dimensionality, the so-called Tomonaga-Luttinger liquids, where interactions are effectively strong and cannot be neglected even at low energies. Nevertheless, strong interactions can be accounted for exactly using the bosonization technique, which exploits the free-bosonic character of collective excitations in these systems. In the present work, we use this fact to develop a scattering theory approach to the bosonization of open quasi-one-dimensional electron systems far from equilibrium.

  18. Supersymmetric Ito equation: Bosonization and exact solutions

    SciTech Connect

    Ren Bo; Yu Jun; Lin Ji

    2013-04-15

    Based on the bosonization approach, the N=1 supersymmetric Ito (sIto) system is changed to a system of coupled bosonic equations. The approach can effectively avoid difficulties caused by intractable fermionic fields which are anticommuting. By solving the coupled bosonic equations, the traveling wave solutions of the sIto system are obtained with the mapping and deformation method. Some novel types of exact solutions for the supersymmetric system are constructed with the solutions and symmetries of the usual Ito equation. In the meanwhile, the similarity reduction solutions of the model are also studied with the Lie point symmetry theory.

  19. Supersymmetric Ito equation: Bosonization and exact solutions

    NASA Astrophysics Data System (ADS)

    Ren, Bo; Lin, Ji; Yu, Jun

    2013-04-01

    Based on the bosonization approach, the N = 1 supersymmetric Ito (sIto) system is changed to a system of coupled bosonic equations. The approach can effectively avoid difficulties caused by intractable fermionic fields which are anticommuting. By solving the coupled bosonic equations, the traveling wave solutions of the sIto system are obtained with the mapping and deformation method. Some novel types of exact solutions for the supersymmetric system are constructed with the solutions and symmetries of the usual Ito equation. In the meanwhile, the similarity reduction solutions of the model are also studied with the Lie point symmetry theory.

  20. A Tangible Approach to Concept Mapping

    NASA Astrophysics Data System (ADS)

    Tanenbaum, Karen; Antle, Alissa N.

    2009-05-01

    The Tangible Concept Mapping project investigates using a tangible user interface to engage learners in concept map creation. This paper describes a prototype implementation of the system, presents some preliminary analysis of its ease of use and effectiveness, and discusses how elements of tangible interaction support concept mapping by helping users organize and structure their knowledge about a domain. The role of physical engagement and embodiment in supporting the mental activity of creating the concept map is explored as one of the benefits of a tangible approach to learning.

  1. Non-equilibrium slave bosons approach to quantum pumping in interacting quantum dots

    NASA Astrophysics Data System (ADS)

    Citro, Roberta; Romeo, Francesco

    2016-03-01

    We review a time-dependent slave-boson approach within the non-equilibrium Green's function technique to analyze charge and spin pumping in a strongly interacting quantum dot. We study the pumped current as a function of the pumping phase and of the dot energy level and show that a parasitic current arises, beyond the pure pumping one, as an effect of the dynamical constraints. We finally illustrate an all-electrical means of spin pumping and discuss its relevance for spintronics applications.

  2. Analytical approach to a bosonic ladder subject to a magnetic field

    NASA Astrophysics Data System (ADS)

    Uchino, Shun

    2016-05-01

    We examine a bosonic two-leg ladder model subject to a magnetic flux and especially focus on a regime where the lower-energy band has two minima. By using a low-energy field theory approach, we study several issues discussed in the system: the existence of local patterns in density and current, chiral-current reversal, and the effect of a nearest-neighbor interaction along the rung direction. In our formalism, the local patterns are interpreted as a result of breaking of discrete symmetry. The chiral-current reversal occurs through a competition between a current component determined at a commensurate vortex density causing an enlargement of the unit cell and another component, which is proportional to the magnetic-field doping from the corresponding commensurate flux. The nearest-neighbor interaction along the rung direction available with the technique on a synthetic dimension is shown to favor a population-imbalance solution in an experimentally relevant regime.

  3. An automated approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Sun, Weihua; Mckeown, Donald M.; Messinger, David W.

    2012-10-01

    Heavy rain from Tropical Storm Lee resulted in a major flood event for the Southern Tier of New York State in early September 2011, causing the evacuation of approximately 20,000 people in and around the city of Binghamton. In support of the New York State Office of Emergency Management, a high-resolution multispectral airborne sensor (WASP) developed by RIT was deployed over the flooded area to collect aerial images. One key benefit of these images is that they enable flood inundation mapping. However, these images require a significant amount of storage space, and inundation mapping is conventionally carried out by manual digitization. In this paper, we design an automated approach to flood inundation mapping from the WASP airborne images. This method employs the Spectral Angle Mapper (SAM) on color RGB or multispectral aerial images to extract a binary flood map; it then uses a set of morphological processing steps and a boundary vectorization technique to convert the binary map into a shapefile. This technique is relatively fast and only requires the operator to select one pixel on the image. The generated shapefile is much smaller than the original image and can be imported into most GIS software packages. This enables critical flood information to be shared with and by disaster response managers very rapidly, even over cellular phone networks.
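
    The SAM step at the heart of this pipeline is a per-pixel computation: the angle between a pixel's band vector and a reference spectrum, thresholded to a binary water mask. A minimal sketch — the band values, seed pixel, and 0.1-radian threshold are illustrative, and the paper's full pipeline adds morphological cleanup and boundary vectorization on top:

```python
import numpy as np

def spectral_angle(image, reference):
    """Per-pixel spectral angle (radians) between an (H, W, B) image
    and a (B,) reference spectrum."""
    dots = image @ reference
    norms = np.linalg.norm(image, axis=-1) * np.linalg.norm(reference)
    cos = np.clip(dots / np.maximum(norms, 1e-12), -1.0, 1.0)
    return np.arccos(cos)

def flood_mask(image, water_pixel, threshold=0.1):
    """Binary flood map: pixels spectrally similar to the seed water pixel."""
    return spectral_angle(image, water_pixel) < threshold

# toy 2x2 RGB scene: top row water-like, bottom row vegetation/soil-like
img = np.array([[[0.10, 0.20, 0.40], [0.11, 0.21, 0.42]],
                [[0.50, 0.50, 0.10], [0.40, 0.10, 0.10]]])
mask = flood_mask(img, water_pixel=np.array([0.10, 0.20, 0.40]))
```

    Because the spectral angle ignores overall brightness, the slightly brighter water pixel is still classified as water, which is the property that makes SAM robust to illumination variation.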

  4. Microscopic calculation of interacting boson model parameters by potential-energy surface mapping

    SciTech Connect

    Bentley, I.; Frauendorf, S.

    2011-06-15

    A coherent state technique is used to generate an interacting boson model (IBM) Hamiltonian energy surface which is adjusted to match a mean-field energy surface. This technique allows the calculation of IBM Hamiltonian parameters, prediction of properties of low-lying collective states, as well as the generation of probability distributions of various shapes in the ground state of transitional nuclei, the last two of which are of astrophysical interest. The results for krypton, molybdenum, palladium, cadmium, gadolinium, dysprosium, and erbium nuclei are compared with experiment.

  5. "True enough" formulations: the MAPS approach.

    PubMed

    Goldman, Stuart

    2012-01-01

    Clinical case formulation is at the core of competent care. When appropriately constructed it is grounded in best practices and serves as an explanatory model, a prescriptive road map, and a yardstick for all interventions. Despite the key role of formulations, many clinicians struggle with their construction and usage. The author offers a new model described as the MAPS approach. This framework, which is pragmatic, driven by clinical data, and process oriented, helps clinicians develop a "true enough" core formulation focusing on the most salient clinical elements that must be addressed. Its graphic nature helps reinforce the interrelated systems nature of psychiatric work and directs the clinician to a restricted number of specific areas that both inform the "core formulation" and serve as the targets for care. This comprehensive model, which includes evaluation, formulation, treatment planning, and treatment monitoring, readily complements and dovetails with the full range of treatment approaches.

  6. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments point to a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to its deterministic photon sources and scalability. Therefore, a certification protocol for large-scale boson sampling experiments should be developed to complete this exciting story. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier components can show the fingerprint of large-scale boson sampling. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and a Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
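
    The classical hardness behind boson sampling is that each output probability is proportional to |Perm(A)|² for a submatrix A of the interferometer unitary, and the best known exact algorithms for the permanent, such as Ryser's inclusion-exclusion formula, scale exponentially. A short sketch of that formula (the test matrices are illustrative):

```python
from itertools import combinations
import numpy as np

def permanent(M):
    """Permanent of a square matrix via Ryser's inclusion-exclusion
    formula: O(2^n) subsets instead of the n! terms of the definition."""
    n = M.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        sign = (-1) ** (n - r)
        for cols in combinations(range(n), r):
            rowsums = M[:, cols].sum(axis=1)   # row sums over the chosen columns
            total += sign * np.prod(rowsums)
    return total

p_id = permanent(np.eye(3))          # only one permutation survives
p_ones = permanent(np.ones((3, 3)))  # all 3! permutations contribute
```

    For an n-photon experiment, the probability of a given output configuration involves the permanent of the n×n submatrix of the unitary selected by the occupied input and output modes, which is why even verifying a 20-30 photon sampler is computationally demanding.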

  7. Simple condensation of composite bosons in a number conserving approach to many-fermion systems

    SciTech Connect

    Palumbo, Fabrizio

    2009-10-15

    We recently proposed an exact bosonization procedure which generates a Hamiltonian of composite bosons interacting among themselves and with fermionic quasiparticles. The interaction among composites whose mixing is allowed by symmetries is strong, but a simple condensate cannot have a significant mixing with other composites. We determine the conditions of decoupling, study their effects and compare the results with the Random Phase Approximation and the BCS theory.

  8. Hydrochromic Approaches to Mapping Human Sweat Pores.

    PubMed

    Park, Dong-Hoon; Park, Bum Jun; Kim, Jong-Man

    2016-06-21

    colorimetric change near body temperature. This feature enables the use of this technique to generate high-quality images of sweat pores. This Account also focuses on the results of the most recent phase of this investigation, which led to the development of a simple yet efficient and reliable technique for sweat pore mapping. The method utilizes a hydrophilic polymer composite film containing fluorescein, a commercially available dye that undergoes a fluorometric response as a result of water-dependent interconversion between its ring-closed spirolactone (nonfluorescent) and ring-opened fluorone (fluorescent) forms. Surface-modified carbon nanodots (CDs) have also been found to be efficient for hydrochromic mapping of human sweat pores. The results discovered by Lou et al. [ Adv. Mater. 2015 , 27 , 1389 ] are also included in this Account. Sweat pore maps obtained from fingertips using these materials were found to be useful for fingerprint analysis. In addition, this hydrochromism-based approach is sufficiently sensitive to enable differentiation between sweat-secreting active pores and inactive pores. As a result, the techniques can be applied to clinical diagnosis of malfunctioning sweat pores. The directions that future research in this area will follow are also discussed. PMID:27159417

  9. Approaching the two-dimensional dirty boson problem with n-leg ladders

    NASA Astrophysics Data System (ADS)

    Carrasquilla, Juan; Becca, Federico; Fabrizio, Michele

    2010-03-01

    We provide insight on the two-dimensional dirty boson problem by studying the disordered Bose Hubbard Model on n-leg ladders. We use Green's Function Monte Carlo and Variational Monte Carlo to establish the nature of the superfluid-insulator transition when the number of bosons equals the number of sites. Our numerical data are consistent with an intervening Bose Glass phase between the superfluid and Mott insulator phases, as recently suggested by Pollet and coworkers. Our data are useful to understand the difficulties observed in direct numerical and experimental determinations of the phase diagram of such systems.

  10. Learning topological maps: An alternative approach

    SciTech Connect

    Buecken, A.; Thrun, S.

    1996-12-31

    Our goal is autonomous real-time control of a mobile robot. In this paper we show how topological maps of a large-scale indoor environment can be learned autonomously. In the literature there are two paradigms for storing information about a robot's environment: as a grid-based (geometric) map or as a topological map. While grid-based maps are comparatively easy to learn and maintain, topological maps are quite compact and facilitate fast motion planning.
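
    The contrast between the two paradigms can be made concrete with a toy conversion: a grid map stores every cell, while a topological view keeps only free cells and their adjacency. The paper's actual method derives topological regions from learned grid maps; the sketch below (grid values and neighborhood choice are illustrative) only shows the grid-to-graph idea in its simplest form:

```python
def grid_to_graph(grid):
    """Adjacency list over free cells (0 = free, 1 = occupied) of an
    occupancy grid: the simplest topological view of a geometric map."""
    rows, cols = len(grid), len(grid[0])
    graph = {}
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0:
                nbrs = []
                # 4-connected neighborhood
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == 0:
                        nbrs.append((rr, cc))
                graph[(r, c)] = nbrs
    return graph

g = grid_to_graph([[0, 0, 1],
                   [1, 0, 0]])
```

    Motion planning on the graph reduces to shortest-path search over far fewer nodes than raw grid cells, which is the compactness advantage the abstract refers to.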

  11. Quantitative Genetic Interaction Mapping Using the E-MAP Approach

    PubMed Central

    Collins, Sean R.; Roguev, Assen; Krogan, Nevan J.

    2010-01-01

    Genetic interactions represent the degree to which the presence of one mutation modulates the phenotype of a second mutation. In recent years, approaches for measuring genetic interactions systematically and quantitatively have proven to be effective tools for unbiased characterization of gene function and have provided valuable data for analyses of evolution. Here, we present protocols for systematic measurement of genetic interactions with respect to organismal growth rate for two yeast species. PMID:20946812
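
    The quantity behind such maps is, in its simplest form, the deviation of measured double-mutant fitness from a neutral expectation. E-MAP scores are in practice statistically normalized against many control measurements, so the multiplicative-model sketch below is only the conceptual core (the function names and the 0.05 cutoff are illustrative):

```python
def interaction_score(w_double, w_a, w_b):
    """Deviation of double-mutant fitness from the multiplicative
    (no-interaction) expectation w_a * w_b."""
    return w_double - w_a * w_b

def classify(score, eps=0.05):
    """Sign convention commonly used in genetic-interaction maps."""
    if score < -eps:
        return "aggravating"   # worse than expected, e.g. synthetic sickness
    if score > eps:
        return "alleviating"   # better than expected, e.g. same pathway/complex
    return "neutral"

# double mutant much sicker than the product of single-mutant fitnesses
s = interaction_score(w_double=0.30, w_a=0.70, w_b=0.80)
label = classify(s)
```

    A strongly negative score (here -0.26) suggests the two genes buffer each other, e.g. by acting in parallel pathways, which is exactly the signal mined in systematic screens.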

  12. Evolution of biomedical ontologies and mappings: Overview of recent approaches.

    PubMed

    Groß, Anika; Pruski, Cédric; Rahm, Erhard

    2016-01-01

    Biomedical ontologies are heavily used to annotate data, and different ontologies are often interlinked by ontology mappings. These ontology-based mappings and annotations are used in many applications and analysis tasks. Since biomedical ontologies are continuously updated, dependent artifacts can become outdated and need to undergo evolution as well. Hence there is a need for largely automated approaches to keep ontology-based mappings up to date in the presence of evolving ontologies. In this article, we survey current approaches and novel directions in the context of ontology and mapping evolution. We discuss requirements for mapping adaptation and provide a comprehensive overview of existing approaches. We further identify open challenges and outline ideas for future developments. PMID:27642503

  13. Combinatorial approach to generalized Bell and Stirling numbers and boson normal ordering problem

    SciTech Connect

    Mendez, M.A.; Blasiak, P.; Penson, K.A.

    2005-08-01

    We consider the numbers arising in the problem of normal ordering of expressions in the boson creation a† and annihilation a operators ([a, a†] = 1). We treat a general form of a boson string (a†)^{r_n} a^{s_n} ... (a†)^{r_2} a^{s_2} (a†)^{r_1} a^{s_1}, which is shown to be associated with generalizations of Stirling and Bell numbers. The recurrence relations and closed-form expressions (Dobinski-type formulas) are obtained for these quantities by both algebraic and combinatorial methods. By extensive use of methods of combinatorial analysis we prove the equivalence of the aforementioned problem to the enumeration of special families of graphs. This link provides a combinatorial interpretation of the numbers arising in this normal ordering problem.
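
    In the simplest case the connection is the classical identity (a†a)^n = Σ_k S(n, k) (a†)^k a^k, with S(n, k) the Stirling numbers of the second kind, S(n, k) = k·S(n-1, k) + S(n-1, k-1), and B(n) = Σ_k S(n, k) the Bell numbers. A short sketch computing the numbers and checking the n = 3 identity on truncated matrices (the truncation size is illustrative; the check is exact here because every operator involved is diagonal in the Fock basis):

```python
from functools import lru_cache
import numpy as np

@lru_cache(maxsize=None)
def stirling2(n, k):
    """Stirling numbers of the second kind via the standard recurrence."""
    if n == 0 and k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def bell(n):
    """Bell number: total number of set partitions of n elements."""
    return sum(stirling2(n, k) for k in range(n + 1))

# verify (a†a)^3 = sum_k S(3, k) (a†)^k a^k on truncated matrices
N = 12
a = np.diag(np.sqrt(np.arange(1, N)), k=1)
ad = a.T
lhs = np.linalg.matrix_power(ad @ a, 3)
rhs = sum(stirling2(3, k)
          * np.linalg.matrix_power(ad, k) @ np.linalg.matrix_power(a, k)
          for k in range(1, 4))
```

    The paper's contribution generalizes this n = (a†a)-powers case to arbitrary boson strings, where generalized Stirling and Bell numbers play the role of S(n, k) and B(n).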

  14. Recent developments in MAP - MODULAR APPROACH to PHYSICS

    NASA Astrophysics Data System (ADS)

    Rae, Jennifer; Austen, Dave; Brouwer, Wytze

    2002-05-01

    We present recent developments in MAP - MODULAR APPROACH to PHYSICS - Java-enhanced modules to be used as aids in teaching the first three terms of university physics. The MAP project is very comprehensive and consists of a modular approach to physics that utilizes Java applets, Flash animations and HTML-based tutorials. The overall instructional philosophy of MAP is constructivist and the project emphasizes active learner participation. In this talk we provide a quick overview of the project and the results of recent pilot testing at several Canadian universities. It also includes a discussion of the VIDEO LAB aspect of MAP. This is a component that is integrated into MAP and permits students to capture and evaluate otherwise difficult-to-study phenomena on video.

  15. A contact map matching approach to protein structure similarity analysis.

    PubMed

    de Melo, Raquel C; Lopes, Carlos Eduardo R; Fernandes, Fernando A; da Silveira, Carlos Henrique; Santoro, Marcelo M; Carceroni, Rodrigo L; Meira, Wagner; Araújo, Arnaldo de A

    2006-01-01

    We modeled the problem of identifying how structurally close two proteins are by measuring the dissimilarity of their contact maps. These contact maps are colored images, in which the chromatic information encodes the chemical nature of the contacts. We studied two conceptually distinct image-processing algorithms to measure the dissimilarity between these contact maps; one was a content-based image retrieval method, and the other was based on image registration. In experiments with contact maps constructed from the Protein Data Bank, our approach was able to identify, with greater than 80% precision, instances of monomers of apolipoproteins, globins, plastocyanins, retinol binding proteins and thioredoxins among the monomers of Protein Data Bank Select. The image registration approach was only slightly more accurate than the content-based image retrieval approach. PMID:16819709
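
    The contact map itself is simple to construct; the paper's image-processing machinery concerns how to compare the maps. Below is a minimal sketch of map construction plus the crudest possible dissimilarity, the fraction of disagreeing entries between equal-sized maps (the coordinates and the 8-unit cutoff are illustrative, and the paper's colored maps additionally encode contact chemistry):

```python
import numpy as np

def contact_map(coords, cutoff=8.0):
    """Binary contact map from an (n, 3) array of residue coordinates:
    True wherever the pairwise distance is below the cutoff."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.linalg.norm(diff, axis=-1) < cutoff

def dissimilarity(m1, m2):
    """Hamming-style dissimilarity: fraction of entries where two
    equal-sized contact maps disagree."""
    return float(np.mean(m1 != m2))

# four residues on a line, 5 units apart: only neighbors are in contact
coords = np.array([[0., 0., 0.], [5., 0., 0.], [10., 0., 0.], [15., 0., 0.]])
cm = contact_map(coords)
```

    Real comparisons must also handle maps of different sizes and near-matches, which is where the content-based retrieval and image-registration methods of the paper come in.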

  16. Exact results in a slave boson saddle point approach for a strongly correlated electron model

    SciTech Connect

    Fresard, Raymond; Kopp, Thilo

    2008-08-15

    We revisit the Kotliar-Ruckenstein (KR) slave boson saddle point evaluation for a two-site correlated electron model. As the model can be solved analytically, it is possible to compare the KR saddle point results with the exact many-particle levels. The considered two-site cluster mimics an infinite-U single-impurity Anderson model with a nearest-neighbor Coulomb interaction: one site is strongly correlated with an infinite local Coulomb repulsion, which hybridizes with the second site, on which the local Coulomb repulsion vanishes. Making use of the flexibility of the representation, we introduce appropriate weight factors in the KR saddle point scheme. Ground-state and all excitation levels agree with the exact diagonalization results. Thermodynamics and correlation functions may be recovered in a suitably renormalized saddle point evaluation.

  17. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy. PMID:26601164

  19. An Incremental Map Building Approach via Static Stixel Integration

    NASA Astrophysics Data System (ADS)

    Muffert, M.; Anzt, S.; Franke, U.

    2013-10-01

This paper presents a stereo-vision based incremental mapping approach for urban regions. As input, we use the 3D representation called the multi-layered Stixel World, which is computed from dense disparity images. More and more, researchers of driver assistance systems rely on efficient and compact 3D representations like the Stixel World. The developed mapping approach takes into account the motion state of obstacles, as well as free space information obtained from the Stixel World. The presented work is based on the well-known occupancy grid mapping technique and is formulated with evidential theory. A detailed sensor model is described, which is used to determine whether a grid cell is occupied, free, or in an unknown state. The map update is solved in a time-recursive manner using Dempster's rule of combination. 3D results of complex inner city regions are shown and are compared with Google Earth images.
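The evidential update described here combines, per grid cell, belief masses over occupied/free plus an uncommitted mass. A minimal sketch of Dempster's rule of combination for one such cell follows; the two-hypothesis frame and the mass values are illustrative assumptions, not the paper's calibrated sensor model.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for one occupancy cell with frame
    {O (occupied), F (free)}; 'U' holds the uncommitted mass m(Theta)."""
    # Conflict mass: one source says occupied while the other says free.
    k = m1["O"] * m2["F"] + m1["F"] * m2["O"]
    if k >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    norm = 1.0 - k
    combined = {
        "O": (m1["O"] * m2["O"] + m1["O"] * m2["U"] + m1["U"] * m2["O"]) / norm,
        "F": (m1["F"] * m2["F"] + m1["F"] * m2["U"] + m1["U"] * m2["F"]) / norm,
    }
    combined["U"] = 1.0 - combined["O"] - combined["F"]
    return combined

# Two consecutive Stixel-based measurements that both lean towards 'occupied':
prior = {"O": 0.6, "F": 0.1, "U": 0.3}
meas = {"O": 0.5, "F": 0.2, "U": 0.3}
post = dempster_combine(prior, meas)
print(round(post["O"], 3))  # belief in 'occupied' grows when sources agree
```

Repeating this per cell over time gives the time-recursive map update the abstract describes.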

  20. A Nonparametric Approach for Mapping Quantitative Trait Loci

    PubMed Central

    Kruglyak, L.; Lander, E. S.

    1995-01-01

    Genetic mapping of quantitative trait loci (QTLs) is performed typically by using a parametric approach, based on the assumption that the phenotype follows a normal distribution. Many traits of interest, however, are not normally distributed. In this paper, we present a nonparametric approach to QTL mapping applicable to any phenotypic distribution. The method is based on a statistic Z(w), which generalizes the nonparametric Wilcoxon rank-sum test to the situation of whole-genome search by interval mapping. We determine the appropriate significance level for the statistic Z(w), by showing that its asymptotic null distribution follows an Ornstein-Uhlenbeck process. These results provide a robust, distribution-free method for mapping QTLs. PMID:7768449
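The rank-based idea behind Z(w) can be illustrated with a toy single-marker test: a plain two-sample Wilcoxon rank-sum z-score on simulated skewed phenotypes. This is only the building block; the paper's statistic extends it to whole-genome interval mapping, and the Ornstein-Uhlenbeck significance threshold is not shown. The simulated genotype classes and shift are made up for illustration.

```python
import random

def rank_sum_statistic(group_a, group_b):
    """Normalized Wilcoxon rank-sum z-score comparing the phenotypes of two
    marker genotype classes; no normality assumption on the phenotype."""
    pooled = sorted(group_a + group_b)
    ranks = {}
    i = 0
    while i < len(pooled):          # assign average ranks, handling ties
        j = i
        while j + 1 < len(pooled) and pooled[j + 1] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + j) / 2 + 1   # ranks are 1-based
        i = j + 1
    w = sum(ranks[x] for x in group_a)       # rank sum of the first class
    n, m = len(group_a), len(group_b)
    mean = n * (n + m + 1) / 2
    var = n * m * (n + m + 1) / 12           # tie correction omitted
    return (w - mean) / var ** 0.5

random.seed(0)
# Skewed (exponential) phenotype with a genotype-dependent shift: the trait
# is clearly non-normal, yet the rank test detects the shift.
aa = [random.expovariate(1.0) for _ in range(50)]
ab = [random.expovariate(1.0) + 1.0 for _ in range(50)]
z = rank_sum_statistic(ab, aa)
print(round(z, 2))  # large positive z: evidence of linkage at this marker
```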

  1. Look before you leap: a new approach to mapping QTL.

    PubMed

    Huang, B Emma; George, Andrew W

    2009-09-01

    In this paper, we present an innovative and powerful approach for mapping quantitative trait loci (QTL) in experimental populations. This deviates from the traditional approach of (composite) interval mapping which uses a QTL profile to simultaneously determine the number and location of QTL. Instead, we look before we leap by employing separate detection and localization stages. In the detection stage, we use an iterative variable selection process coupled with permutation to identify the number and synteny of QTL. In the localization stage, we position the detected QTL through a series of one-dimensional interval mapping scans. Results from a detailed simulation study and real analysis of wheat data are presented. We achieve impressive increases in the power of QTL detection compared to composite interval mapping. We also accurately estimate the size and position of QTL. An R library, DLMap, implements the methods described here and is freely available from CRAN ( http://cran.r-project.org/ ). PMID:19585099

  2. Tank Update System: A novel asset mapping approach for verifying and updating lakes using Google Maps

    NASA Astrophysics Data System (ADS)

    Reddy Pulsani, Bhaskar

    2016-06-01

Mission Kakatiya is one of the prestigious programs of the Telangana state government, under which the restoration of tanks across ten districts is being implemented. As part of the program, the government plans to restore about 9,000 lakes. Therefore, to compile a comprehensive list of the lakes existing in Telangana state, the Samagra Tank Survey was carried out. The data collected in this survey covered about 45,000 tanks. Since the data were not collected in a standard format and were recorded in Excel, a web interface was created to fill the gaps and to standardise the data. A new approach for spatially identifying the lakes through Google Maps was successfully implemented by developing a web interface. This approach is less common, since it applies asset mapping to the lakes of Telangana state and shows the advantages of using online mapping applications such as Google Maps to identify and cross-check the lakes already present on it.

  3. Mapping between the classical and pseudoclassical models of a relativistic spinning particle in external bosonic and fermionic fields. II

    NASA Astrophysics Data System (ADS)

    Markov, Yu. A.; Markova, M. A.

    2016-06-01

The exact solution of a system of bilinear identities derived in the first part of our work [1] for the case of a real Grassmann-odd tensor aggregate of the type (S, V_μ, *T_μν, A_μ, P) is obtained. The consistency of the solution with a corresponding system of bilinear identities including both the tensor variables and their derivatives (Ṡ, V̇_μ, *Ṫ_μν, Ȧ_μ, Ṗ) is considered. An alternative approach to solving the algebraic system, based on introducing complex tensor quantities, is discussed. This solution is used in constructing the mapping of the interaction terms of a spinning particle with a background (Majorana) fermion field Ψ_Mαi(x). A way of extending the obtained results to the case of Dirac spinors (ψ_Dα, θ_Dα) and a background Dirac field Ψ_Dαi(x) is suggested. It is shown that, for the construction of a one-to-one correspondence between the most general spinors and the tensor variables, a four-fold increase in the number of tensor variables is needed. A connection with higher-order derivative Lagrangians for a point particle, and in particular with the Lagrangian suggested by A.M. Polyakov, is proposed.

  4. Technology Mapping: An Approach for Developing Technological Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    Technology mapping[TM] is proposed as an approach for developing technological pedagogical content knowledge (TPCK). The study discusses in detail instructional design guidelines in relation to the enactment of TM, and reports on empirical findings from a study with 72 pre-service primary teachers within the context of teaching them how to teach…

  5. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process resulting in a map identifying land cover as 'highly likely corn,' 'likely corn' or 'unlikely corn.' To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data. © 2003 Elsevier B.V. All rights reserved.

  6. Noise pollution mapping approach and accuracy on landscape scales.

    PubMed

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, there could be an alternative approach to soundscape characterization rather than using human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome. This would then optimize the mapping time and the cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of the noise mapping of road traffic noise at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations.
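The Kappa index used above to score agreement between maps of different grid sizes is Cohen's kappa computed over grid cells. A small sketch follows; the noise-band class labels and the toy reference/coarse maps are made up for illustration.

```python
def cohen_kappa(map_a, map_b):
    """Cohen's kappa between two categorical raster maps, given as flattened
    lists of per-cell class labels (e.g. noise-level bands). Values near 1
    mean the coarser map reproduces the reference map almost exactly."""
    assert len(map_a) == len(map_b)
    n = len(map_a)
    classes = sorted(set(map_a) | set(map_b))
    observed = sum(a == b for a, b in zip(map_a, map_b)) / n
    # Chance agreement from the marginal class frequencies of each map.
    expected = sum((map_a.count(c) / n) * (map_b.count(c) / n) for c in classes)
    return (observed - expected) / (1 - expected)

# Reference noise map vs. a coarser-grid map that misclassifies 2 of 12 cells.
ref =    ["quiet", "quiet", "mid", "mid", "loud", "loud"] * 2
coarse = ["quiet", "quiet", "mid", "loud", "loud", "loud"] * 2
print(round(cohen_kappa(ref, coarse), 3))  # 0.75: good but imperfect agreement
```

Recomputing this for each candidate grid size is one way to trade mapping accuracy against computation time, as the study does.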

  7. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

Spatial information on physical soil properties is in high demand, in order to support environment-related and land use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentages of sand, silt, and clay in the soil constitute textural classes, which are also specified miscellaneously in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs that maps do not exist for a certain characteristic with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a high versatility of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different

  8. Spiral magnetism in the single-band Hubbard model: the Hartree-Fock and slave-boson approaches.

    PubMed

    Igoshev, P A; Timirgazin, M A; Gilmutdinov, V F; Arzhnikov, A K; Irkhin, V Yu

    2015-11-11

    The ground-state magnetic phase diagram is investigated within the single-band Hubbard model for square and different cubic lattices. The results of employing the generalized non-correlated mean-field (Hartree-Fock) approximation and generalized slave-boson approach by Kotliar and Ruckenstein with correlation effects included are compared. We take into account commensurate ferromagnetic, antiferromagnetic, and incommensurate (spiral) magnetic phases, as well as phase separation into magnetic phases of different types, which was often lacking in previous investigations. It is found that the spiral states and especially ferromagnetism are generally strongly suppressed up to non-realistically large Hubbard U by the correlation effects if nesting is absent and van Hove singularities are well away from the paramagnetic phase Fermi level. The magnetic phase separation plays an important role in the formation of magnetic states, the corresponding phase regions being especially wide in the vicinity of half-filling. The details of non-collinear and collinear magnetic ordering for different cubic lattices are discussed.

  10. Landau's quasiparticle mapping: Fermi liquid approach and Luttinger liquid behavior.

    PubMed

    Heidbrink, Caspar P; Uhrig, Götz S

    2002-04-01

    A continuous unitary transformation is introduced which realizes Landau's mapping of the elementary excitations (quasiparticles) of an interacting Fermi liquid system to those of the system without interaction. The conservation of the number of quasiparticles is important. The transformation is performed numerically for a one-dimensional system, i.e., the worst case for a Fermi liquid approach. Yet evidence for Luttinger liquid behavior is found. Such an approach may open a route to a unified description of Fermi and Luttinger liquids on all energy scales.

  11. Wildfire susceptibility mapping: comparing deterministic and stochastic approaches

    NASA Astrophysics Data System (ADS)

    Pereira, Mário; Leuenberger, Michael; Parente, Joana; Tonini, Marj

    2016-04-01

Estimating the probability of wildfire occurrence in a certain area under particular environmental conditions represents a modern tool to support forest protection plans and to reduce fire consequences. This can be performed through wildfire susceptibility mapping, normally achieved by employing more or less sophisticated models which combine the predisposing variables (as raster datasets) in a geographic information system (GIS). The selection of the appropriate variables includes the evaluation of success and the implementation of prediction curves, as well as independent probabilistic validations for different scenarios. These methods allow one to define the spatial pattern of wildfire occurrences, characterize the susceptibility of the territory, namely for specific fire causes/types, and can also account for other factors such as human behavior and social aspects. We selected Portugal as the study region which, due to its favorable climatic, topographic and vegetation conditions, is by far the European country most affected by wildfires. In addition, Verde and Zêzere (2010) performed a first assessment and validation of wildfire susceptibility and hazard in Portugal, which can be used as a benchmark. The objectives of the present study comprise: (1) assessing the structural forest fire risk in Portugal using updated datasets, namely with higher spatial resolution (80 m to 25 m), the most recent vegetation cover (Corine Land Cover), and a longer fire history (1975-2013); and (2) comparing linear vs non-linear approaches for wildfire susceptibility mapping. The data we used include: (i) a DEM derived from the Shuttle Radar Topographic Mission at a resolution of 1 arc-second (DEM-SRTM 25 m) to assess elevation and slope; (ii) the Corine Land Cover inventory provided by the European Environment Agency (http://www.eea.europa.eu/pt) to produce the land use land cover map; (iii) the National Mapping Burnt Areas (NMBA) provided by the Institute for the

  12. Phase diagram of ultracold atoms in optical lattices: Comparative study of slave fermion and slave boson approaches to Bose-Hubbard model

    SciTech Connect

    Yu Yue; Chui, S. T.

    2005-03-01

We perform a comparative study of the finite temperature behavior of ultracold Bose atoms in optical lattices by the slave fermion and the slave boson approaches to the Bose-Hubbard model. The phase diagram of the system is presented. Although both approaches are equivalent without approximations, the mean field theory based on the slave fermion technique is quantitatively more appropriate. Conceptually, the slave fermion approach automatically excludes the double occupancy of two identical fermions on the same lattice site. By comparing with known results in limiting cases, we find the slave fermion approach better than the slave boson approach. For example, in the non-interacting limit, the critical temperature of the superfluid-normal liquid transition calculated by the slave fermion approach is closer to the well-known ideal Bose gas result. In the zero-temperature limit, the critical interaction strength from the slave fermion approach is also closer to that from a direct calculation using a zero-temperature mean field theory.
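The non-interacting benchmark invoked here is the textbook condensation temperature of a uniform ideal Bose gas, k_B T_c = (2πħ²/m)(n/ζ(3/2))^(2/3). A quick numerical sketch, where the Rb-87 mass and the density value are illustrative assumptions rather than parameters from the paper:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
KB = 1.380649e-23        # Boltzmann constant, J / K
ZETA_3_2 = 2.6123753486854883   # Riemann zeta(3/2)

def ideal_bose_tc(n, m):
    """Condensation temperature (K) of a uniform ideal Bose gas with
    number density n (m^-3) and particle mass m (kg):
    k_B T_c = (2*pi*hbar^2 / m) * (n / zeta(3/2))**(2/3)."""
    return (2 * math.pi * HBAR ** 2 / (m * KB)) * (n / ZETA_3_2) ** (2 / 3)

m_rb87 = 86.909 * 1.66053906660e-27   # kg; Rb-87 chosen as an example
tc = ideal_bose_tc(1e20, m_rb87)      # a typical cold-atom density
print(f"{tc * 1e9:.0f} nK")           # a few hundred nanokelvin
```

The slave-fermion mean-field T_c is the quantity compared against this value in the non-interacting limit.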

  13. Condensation of N bosons: Microscopic approach to fluctuations in an interacting Bose gas

    SciTech Connect

    Svidzinsky, Anatoly A.; Scully, Marlan O.

    2010-12-15

    We present a microscopic derivation of the master equation for the condensate density matrix for an interacting Bogoliubov-Bose gas of N atoms. We choose the interaction Hamiltonian in a special way that substantially simplifies the master equation, yielding no coupling between diagonal and off-diagonal terms. The present formulation allows us to solve the problem analytically in a steady state and obtain the expression for the distribution function and equilibrium condensate fluctuations. For the first two central moments, our results are equivalent to those obtained in the canonical-ensemble quasiparticle formalism [V. V. Kocharovsky, Vl. V. Kocharovsky, and M. O. Scully, Phys. Rev. Lett. 84, 2306 (2000); Phys. Rev. A 61, 053606 (2000)], in the low-temperature range where these papers are valid, but also give an accurate description at high temperatures. The present analysis for an interacting Bose gas is as accurate as the master equation approach of Kocharovsky et al.[Phys. Rev. A 61, 023609 (2000)] is for an ideal gas.

  14. A covariance fitting approach for correlated acoustic source mapping.

    PubMed

    Yardibi, Tarik; Li, Jian; Stoica, Petre; Zawodny, Nikolas S; Cattafesta, Louis N

    2010-05-01

    Microphone arrays are commonly used for noise source localization and power estimation in aeroacoustic measurements. The delay-and-sum (DAS) beamformer, which is the most widely used beamforming algorithm in practice, suffers from low resolution and high sidelobe level problems. Therefore, deconvolution approaches, such as the deconvolution approach for the mapping of acoustic sources (DAMAS), are often used for extracting the actual source powers from the contaminated DAS results. However, most deconvolution approaches assume that the sources are uncorrelated. Although deconvolution algorithms that can deal with correlated sources, such as DAMAS for correlated sources, do exist, these algorithms are computationally impractical even for small scanning grid sizes. This paper presents a covariance fitting approach for the mapping of acoustic correlated sources (MACS), which can work with uncorrelated, partially correlated or even coherent sources with a reasonably low computational complexity. MACS minimizes a quadratic cost function in a cyclic manner by making use of convex optimization and sparsity, and is guaranteed to converge at least locally. Simulations and experimental data acquired at the University of Florida Aeroacoustic Flow Facility with a 63-element logarithmic spiral microphone array in the absence of flow are used to demonstrate the performance of MACS. PMID:21117743
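The delay-and-sum starting point that DAMAS-style methods deconvolve can be sketched in the frequency domain: scan a steering vector over a grid and read off beamformer output power from the cross-spectral matrix. The monopole propagation model, array layout, frequency, and noise-free rank-one CSM below are illustrative assumptions; MACS's covariance-fitting optimization itself is not shown.

```python
import numpy as np

def das_power(csm, mics, grid, freq, c=343.0):
    """Frequency-domain delay-and-sum map. csm: (M, M) cross-spectral
    matrix; mics: (M, 3) microphone positions; grid: (G, 3) scan points."""
    k = 2 * np.pi * freq / c
    out = np.empty(len(grid))
    for g, point in enumerate(grid):
        r = np.linalg.norm(mics - point, axis=1)
        v = np.exp(-1j * k * r) / r           # monopole steering vector
        v /= np.linalg.norm(v)
        out[g] = np.real(v.conj() @ csm @ v)  # beamformer output power
    return out

# Toy scene: one monopole source; the DAS peak should sit on top of it.
rng = np.random.default_rng(1)
mics = np.c_[rng.uniform(-0.5, 0.5, (16, 2)), np.zeros(16)]  # planar array
src = np.array([0.1, 0.0, 1.0])
r_src = np.linalg.norm(mics - src, axis=1)
a = np.exp(-1j * 2 * np.pi * 2000 / 343.0 * r_src) / r_src
csm = np.outer(a, a.conj())                   # rank-one, noise-free CSM
grid = np.array([[x, 0.0, 1.0] for x in np.linspace(-0.4, 0.4, 17)])
powers = das_power(csm, mics, grid, 2000.0)
print(grid[np.argmax(powers), 0])             # peak at the source x-position
```

The sidelobes of this map are exactly the contamination that DAMAS, DAMAS-C, and MACS remove; MACS does so by fitting the measured CSM directly rather than deconvolving the DAS output.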

  15. The Effects of Reciprocal Teaching and Direct Instruction Approaches on Knowledge Map (K-Map) Generation Skill

    ERIC Educational Resources Information Center

    Görgen, Izzet

    2014-01-01

    The primary purpose of the present study is to investigate whether reciprocal teaching approach or direct instruction approach is more effective in the teaching of k-map generation skill. Secondary purpose of the study is to determine which of the k-map generation principles are more challenging for students to apply. The results of the study…

  17. Topological hardcore bosons on the honeycomb lattice

    NASA Astrophysics Data System (ADS)

    Owerre, S. A.

    2016-09-01

This paper presents a connection between the topological properties of hardcore bosons and those of magnons in quantum spin magnets. We utilize Haldane-like hardcore bosons on the honeycomb lattice as an example. We show that this system maps to a spin-1/2 quantum XY model with a next-nearest-neighbour Dzyaloshinsky-Moriya interaction. We obtain the magnon excitations of the quantum spin model and compute the edge states, Berry curvature, and thermal and spin Nernst conductivities. Due to the mapping from spin variables to bosons, the hardcore bosons possess the same nontrivial topological properties as the magnons in the quantum spin system. These results are important in the study of magnetic excitations in quantum magnets, and they are also useful for understanding the control of ultracold bosonic quantum gases in honeycomb optical lattices, which is experimentally accessible.

  18. Canonical map approach to channeling stability in crystals. II

    NASA Astrophysics Data System (ADS)

    Sáenz, A. W.

    1987-11-01

A nonrelativistic and a relativistic classical Hamiltonian model of two degrees of freedom are considered, describing the plane motion of a particle in a potential V(x1, x2) [(x1, x2) = Cartesian coordinates]. Suppose V(x1, x2) is real analytic in its arguments in a neighborhood of the line x2 = 0, one-periodic in x1 there, and such that the average value of ∂V(x1, 0)/∂x2 vanishes. It is proved that, under these conditions and provided that the particle energy E is sufficiently large, there exist for all time two distinguished solutions, one satisfying the equations of motion of the nonrelativistic model and the other those of the relativistic model, whose corresponding configuration-space orbits are one-periodic in x1 and approach the line x2 = 0 as E → ∞. The main theorem is that these solutions are (future) orbitally stable at large enough E if V satisfies the above conditions, as well as natural requirements of linear and nonlinear stability. To prove their existence, one uses a well-known theorem, for which a new and simpler proof is provided, and properties of certain natural canonical maps appropriate to these respective models. It is shown that such solutions are orbitally stable by reducing the maps in question to Birkhoff canonical form and then applying a version of the Moser twist theorem. The approach used here greatly lightens the labor of deriving key estimates for the above maps, these estimates being needed to effect this reduction. The present stability theorem is physically interesting because it is the first rigorous statement on the orbital stability of certain channeling motions of fast charged particles in rigid two-dimensional lattices, within the context of models of the stated degree of generality.

  19. Higgs Boson Physics at Atlas

    NASA Astrophysics Data System (ADS)

    Denis, Richard St.

    2015-03-01

The discovery of a new boson with the ATLAS detector at the LHC proton-proton collider is confirmed using the full data set collected at centre-of-mass energies of 7 and 8 TeV. The spin and parity properties of the boson are consistent with those of a scalar particle with positive parity. Comparison of the J^P = 0^+ hypothesis to the alternatives J^P = 0^-, 1^+, 1^-, 2^+ results in exclusion of these other choices at 97.8%, 99.97%, 99.7%, and 99.3% CL. The Higgs boson mass is m_H = 125.5 ± 0.2 (stat.) ± 0.5 (syst.) GeV. Evidence for production of the Higgs boson by vector boson fusion is obtained in a model-independent approach by comparing the signal strengths μ of vector boson fusion and of production associated with a vector boson to that for gluon fusion (including associated production with top quark pairs): μ_{VBF+VH} / μ_{ggF+ttH} = 1.4 (+0.4/-0.3) (stat.) (+0.6/-0.4) (syst.), which is 3.3 Gaussian standard deviations from zero.

  20. Einstein's Gravitational Field Approach to Dark Matter and Dark Energy-Geometric Particle Decay into the Vacuum Energy Generating Higgs Boson and Heavy Quark Mass

    NASA Astrophysics Data System (ADS)

    Christensen, Walter James

    2015-08-01

During an interview at the Niels Bohr Institute David Bohm stated, "according to Einstein, particles should eventually emerge as singularities, or very strong regions of stable pulses of (the gravitational) field" [1]. Starting from this premise, we show spacetime, indeed, manifests stable pulses (n-valued gravitons) that decay into the vacuum energy to generate all three boson masses (including the Higgs), as well as heavy-quark mass, all in precise agreement with the 2010 CODATA report on fundamental constants. Furthermore, our relativized quantum physics approach (RQP) addresses the mysteries surrounding dark energy, dark matter, accelerated spacetime, and why ordinary matter dominates over antimatter.

  1. Symmetry-improved 2PI approach to the Goldstone-boson IR problem of the SM effective potential

    NASA Astrophysics Data System (ADS)

    Pilaftsis, Apostolos; Teresi, Daniele

    2016-05-01

    The effective potential of the Standard Model (SM), from three loop order and higher, suffers from infrared (IR) divergences arising from quantum effects due to massless would-be Goldstone bosons associated with the longitudinal polarizations of the W± and Z bosons. Such IR pathologies also hinder accurate evaluation of the two-loop threshold corrections to electroweak quantities, such as the vacuum expectation value of the Higgs field. However, these divergences are an artifact of perturbation theory, and therefore need to be consistently resummed in order to obtain an IR-safe effective potential. The so-called Two-Particle-Irreducible (2PI) effective action provides a rigorous framework to consistently perform such resummations, without the need to resort to ad hoc subtractions or running into the risk of over-counting contributions. By considering the recently proposed symmetry-improved 2PI formalism, we address the problem of the Goldstone-boson IR divergences of the SM effective potential in the gaugeless limit of the theory. In the same limit, we evaluate the IR-safe symmetry-improved 2PI effective potential, after taking into account quantum loops of chiral fermions, as well as the renormalization of spurious custodially breaking effects triggered by fermionic Yukawa interactions. Finally, we compare our results with those obtained with other methods presented in the literature.

  2. Teaching Population Health: A Competency Map Approach to Education

    PubMed Central

    Kaprielian, Victoria S.; Silberberg, Mina; McDonald, Mary Anne; Koo, Denise; Hull, Sharon K.; Murphy, Gwen; Tran, Anh N.; Sheline, Barbara L.; Halstater, Brian; Martinez-Bianchi, Viviana; Weigle, Nancy J.; de Oliveira, Justine Strand; Sangvai, Devdutta; Copeland, Joyce; Tilson, Hugh H.; Scutchfield, F. Douglas; Michener, J. Lloyd

    2013-01-01

    A 2012 Institute of Medicine report is the latest in the growing number of calls to incorporate a population health approach in health professionals’ training. Over the last decade, Duke University, particularly its Department of Community and Family Medicine, has been heavily involved with community partners in Durham, North Carolina to improve the local community’s health. Based on these initiatives, a group of interprofessional faculty began tackling the need to fill the curriculum gap to train future health professionals in public health practice, community engagement, critical thinking, and team skills to improve population health effectively in Durham and elsewhere. The Department of Community and Family Medicine has spent years in care delivery redesign and curriculum experimentation, design, and evaluation to distinguish the skills trainees and faculty need for population health improvement and to integrate them into educational programs. These clinical and educational experiences have led to a set of competencies that form an organizational framework for curricular planning and training. This framework delineates which learning objectives are appropriate and necessary for each learning level, from novice through expert, across multiple disciplines and domains. The resulting competency map has guided Duke’s efforts to develop, implement, and assess training in population health for learners and faculty. In this article, the authors describe the competency map development process as well as examples of its application and evaluation at Duke and limitations to its use with the hope that other institutions will apply it in different settings. PMID:23524919

  3. Current approaches to fine mapping of antigen–antibody interactions

    PubMed Central

    Abbott, W Mark; Damschroder, Melissa M; Lowe, David C

    2014-01-01

    A number of different methods are commonly used to map the fine details of the interaction between an antigen and an antibody. Undoubtedly the method that is now most commonly used to give details at the level of individual amino acids and atoms is X-ray crystallography. The feasibility of undertaking crystallographic studies has increased over recent years through the introduction of automation, miniaturization and high throughput processes. However, this still requires a high level of sophistication and expense and cannot be used when the antigen is not amenable to crystallization. Nuclear magnetic resonance spectroscopy offers a similar level of detail to crystallography but the technical hurdles are even higher such that it is rarely used in this context. Mutagenesis of either antigen or antibody offers the potential to give information at the amino acid level but suffers from the uncertainty of not knowing whether an effect is direct or indirect due to an effect on the folding of a protein. Other methods such as hydrogen deuterium exchange coupled to mass spectrometry and the use of short peptides coupled with ELISA-based approaches tend to give mapping information over a peptide region rather than at the level of individual amino acids. It is quite common to use more than one method because of the limitations and even with a crystal structure it can be useful to use mutagenesis to tease apart the contribution of individual amino acids to binding affinity. PMID:24635566

  4. Mapping the distribution of malaria: current approaches and future directions

    USGS Publications Warehouse

    Johnson, Leah R.; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.; Chen, Dongmei; Moulin, Bernard; Wu, Jianhong

    2015-01-01

    Mapping the distribution of malaria has received substantial attention because the disease is a major source of illness and mortality in humans, especially in developing countries. It also has a defined temporal and spatial distribution. The distribution of malaria is most influenced by its mosquito vector, which is sensitive to extrinsic environmental factors such as rainfall and temperature. Temperature also affects the development rate of the malaria parasite in the mosquito. Here, we review the range of approaches used to model the distribution of malaria, from spatially explicit to implicit, mechanistic to correlative. Although current methods have significantly improved our understanding of the factors influencing malaria transmission, significant gaps remain, particularly in incorporating nonlinear responses to temperature and temperature variability. We highlight new methods to tackle these gaps and to integrate new data with models.
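    As an illustration of the mechanistic, temperature-dependent approaches the review discusses, a unimodal (Briere-type) thermal response is commonly used for mosquito and parasite traits; the parameter values below are illustrative assumptions, not fitted estimates from this study:

    ```python
    def briere(T, c, T0, Tm):
        """Briere thermal response: zero outside (T0, Tm), hump-shaped inside."""
        if T <= T0 or T >= Tm:
            return 0.0
        return c * T * (T - T0) * (Tm - T) ** 0.5

    # Illustrative parameter values (assumptions, not fitted estimates)
    suitability = {T: round(briere(T, c=2.5e-4, T0=14.0, Tm=40.0), 3)
                   for T in (10, 22, 30, 38, 42)}
    ```

    A curve of this shape captures the nonlinearity the authors highlight: transmission suitability rises with temperature, peaks, and collapses near the thermal limits.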

  5. A statistical approach for validating eSOTER and digital soil maps in front of traditional soil maps

    NASA Astrophysics Data System (ADS)

    Bock, Michael; Baritz, Rainer; Köthe, Rüdiger; Melms, Stephan; Günther, Susann

    2015-04-01

    During the European research project eSOTER, three different Digital Soil Maps (DSM) were developed for the pilot area Chemnitz 1:250,000 (FP7 eSOTER project, grant agreement nr. 211578). The core task of the project was to revise the SOTER method for the interpretation of soil and terrain data. One of the working hypotheses was that eSOTER does not only provide terrain data with typical soil profiles, but that the new products actually perform like a conceptual soil map. The three eSOTER maps for the pilot area differed considerably in spatial representation and content of soil classes. In this study we compare the three eSOTER maps against existing reconnaissance soil maps, keeping in mind that traditional soil maps have many subjective issues and intended bias toward the overestimation and emphasis of certain features. Hence, a true validation of the proper representation of modeled soil maps is hardly possible; rather, a statistical comparison between modeled and empirical approaches is possible. If eSOTER data represent conceptual soil maps, then different eSOTER, DSM and conventional maps from various sources and different regions could be harmonized towards consistent new data sets for large areas, including the whole European continent. One of the eSOTER maps was developed closely following the traditional SOTER method: terrain classification data (derived from the SRTM DEM) were combined with lithology data (re-interpreted geological map); the corresponding terrain units were then extended with soil information: a very dense regional soil profile data set was used to define soil mapping units based on a statistical grouping of terrain units. The second map is a pure DSM map using continuous terrain parameters instead of terrain classification; radiospectrometric data were used to supplement parent material information from geology maps. The classification method Random Forest was used. The third approach predicts soil diagnostic properties based on

  6. Bosonization for beginners - refermionization for experts

    NASA Astrophysics Data System (ADS)

    von Delft, Jan; Schoeller, Herbert

    1998-11-01

    This tutorial review gives an elementary and self-contained derivation of the standard identities (ψ(x) ~ F e^{-iφ(x)}, etc.) for abelian bosonization in 1 dimension in a system of finite size L, following and simplifying Haldane's constructive approach. As a non-trivial application, we rigorously resolve (following Furusaki) a recent controversy regarding the tunneling density of states, ρdos(ω), at the site of an impurity in a Tomonaga-Luttinger liquid: we use finite-size refermionization to show exactly that for g = 1/2 its asymptotic low-energy behavior is ρdos(ω) ∝ ω. This agrees with the results of Fabrizio & Gogolin and of Furusaki, but not with those of Oreg and Finkel'stein (probably because we capture effects not included in their mean-field treatment of the Coulomb gas that they obtained by an exact mapping; their treatment of anti-commutation relations in this mapping is correct, however, contrary to recent suggestions in the literature). - The tutorial is addressed to readers with little or no prior knowledge of bosonization, who are interested in seeing all the details explicitly; it is written at the level of beginning graduate students, requiring only knowledge of second quantization, but not of field theory (which is not needed here). At the same time, we hope that experts too might find useful our explicit treatment of certain subtleties that can often be swept under the rug, but are crucial for some applications, such as the calculation of ρdos(ω) - these include the proper treatment of the so-called Klein factors that act as fermion-number ladder operators (and also ensure the anti-commutation of different species of fermion fields), the retention of terms of order 1/L, and a novel, rigorous formulation of finite-size refermionization of both F e^{-iφ(x)} and the boson field φ(x) itself.
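    For reference, the central identity of abelian bosonization that the abstract invokes can be written schematically (suppressing the number-operator phase and normal-ordering details of the finite-size construction) as:

    ```latex
    \psi_\eta(x) \;\sim\; F_\eta \, a^{-1/2} \, e^{-i\phi_\eta(x)},
    \qquad
    \rho_{\mathrm{dos}}(\omega) \;\propto\; \omega \quad \text{for } g = \tfrac{1}{2},
    ```

    where F_η is the Klein factor (fermion-number ladder operator), φ_η(x) the boson field, and a a short-distance cutoff; the second relation is the impurity-site tunneling density of states result quoted above.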

  7. Uncertainty propagation in a cascade modelling approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Rodríguez-Rincón, J. P.; Pedrozo-Acuña, A.; Breña Naranjo, J. A.

    2014-07-01

    The purpose of this investigation is to study the propagation of meteorological uncertainty within a cascade modelling approach to flood mapping. The methodology comprises a Numerical Weather Prediction (NWP) model, a distributed rainfall-runoff model and a standard 2-D hydrodynamic model. The cascade of models is used to reproduce an extreme flood event that took place in the Southeast of Mexico during November 2009. The event was selected because high-quality field data (e.g. rain gauge and discharge measurements) and satellite imagery are available. Uncertainty in the meteorological model (Weather Research and Forecasting model) is evaluated through the use of a multi-physics ensemble technique, which considers twelve parameterization schemes for precipitation. The resulting precipitation fields are used as input to a distributed hydrological model, enabling the determination of different hydrographs associated with this event. Lastly, by means of a standard 2-D hydrodynamic model, the hydrographs are used as forcing conditions to study the propagation of the meteorological uncertainty to an estimated flooded area. Results show the utility of the selected modelling approach for investigating error propagation within a cascade of models. Moreover, the error associated with the determination of runoff is shown to be lower than that obtained in the precipitation estimation, suggesting that uncertainty does not necessarily increase within a model cascade.
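    The finding that relative uncertainty need not grow along a cascade can be illustrated with a deliberately simplified chain; the functions and ensemble values below are toy assumptions, not the WRF, rainfall-runoff, or hydrodynamic models used in the study:

    ```python
    import statistics

    def rainfall_to_runoff(p, coeff=0.4):
        """Toy rainfall-runoff step: a fixed fraction of precipitation becomes runoff."""
        return coeff * p

    def runoff_to_flood_area(q, scale=0.02):
        """Toy hydrodynamic step: flooded area grows sublinearly with discharge."""
        return scale * q ** 0.5

    # Twelve illustrative ensemble precipitation totals (mm), standing in for the
    # twelve multi-physics WRF members; the numbers are assumptions
    precip = [180, 200, 210, 190, 230, 220, 205, 195, 215, 225, 185, 240]
    runoff = [rainfall_to_runoff(p) for p in precip]
    areas  = [runoff_to_flood_area(q) for q in runoff]

    def spread(xs):
        """Relative spread (coefficient of variation) of an ensemble."""
        return statistics.stdev(xs) / statistics.mean(xs)
    ```

    Because the final step responds sublinearly to its input, the relative spread of the flooded-area ensemble ends up smaller than that of the precipitation ensemble, mirroring the paper's conclusion.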

  8. Pure P2P mediation system: A mappings discovery approach

    NASA Astrophysics Data System (ADS)

    selma, El yahyaoui El idrissi; Zellou, Ahmed; Idri, Ali

    2015-02-01

    Information integration systems offer a uniform interface providing access to a set of autonomous and distributed information sources. The most important advantage of such a system is that it allows users to specify what they want, rather than how to obtain it. Work in this area has led to two major classes of integration systems: mediation systems based on the mediator/adapter paradigm, and peer-to-peer (P2P) systems. Combining the two has produced a third type: P2P mediation systems. P2P systems are large-scale, self-organized and distributed, and allow resource management in a completely decentralized way. However, the integration of structured, heterogeneous and distributed information sources proves to be a complex problem. The objective of this work is to propose an approach for resolving conflicts and establishing mappings between heterogeneous elements. The approach is based on clustering, which groups similar peers that share common information into the same subnet. To manage heterogeneity, we introduce three additional layers into our hierarchy of peers: internal schema, external schema and schema directory peer. We use linguistic techniques, specifically the name-correspondence technique, which proposes a mapping based on the similarity of element names.
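    The name-correspondence step can be sketched with a generic string-similarity matcher; the schema element names and the 0.7 threshold below are illustrative assumptions, not the authors' actual technique:

    ```python
    from difflib import SequenceMatcher

    def name_similarity(a: str, b: str) -> float:
        """Similarity in [0, 1] based on matching subsequences of the two names."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def match_elements(schema_a, schema_b, threshold=0.7):
        """Pair each element of schema_a with its best match in schema_b above threshold."""
        mapping = {}
        for a in schema_a:
            best = max(schema_b, key=lambda b: name_similarity(a, b))
            if name_similarity(a, best) >= threshold:
                mapping[a] = best
        return mapping

    # Hypothetical schema elements from two heterogeneous peers
    pairs = match_elements(["clientName", "order_id"],
                           ["client_name", "orderId", "price"])
    ```

    Real systems typically combine such lexical scores with structural or semantic evidence before committing to a mapping.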

  9. Interacting boson model from energy density functionals: γ-softness and the related topics

    SciTech Connect

    Nomura, K.

    2012-10-20

    A comprehensive way of deriving the Hamiltonian of the interacting boson model (IBM) is described. Based on the fact that the multi-nucleon induced surface deformation in a finite nucleus is simulated by effective boson degrees of freedom, the potential energy surface calculated with the self-consistent mean-field method employing a given energy density functional (EDF) is mapped onto the IBM analog, and thereby the excitation spectra and transition rates with good symmetry quantum numbers are calculated. Recent applications of the proposed approach are reported: (i) an alternative robust interpretation of the γ-soft nuclei and (ii) shape coexistence in lead isotopes.

  10. High School Biology: A Group Approach to Concept Mapping.

    ERIC Educational Resources Information Center

    Brown, David S.

    2003-01-01

    Explains concept mapping as an instructional method in cooperative learning environments, and describes a study investigating the effectiveness of concept mapping on student learning during a photosynthesis and cellular respiration unit. Reports on the positive effects of concept mapping in the experimental group. (Contains 16 references.) (YDS)

  11. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

    If quantitative estimates of the seismic properties do not exist at a location of interest then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
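    The square-root-of-impedance step can be sketched as follows; the reference bedrock density and shear-wave velocity values below are illustrative assumptions, not the parameters used in the study:

    ```python
    import math

    def sqrt_impedance_amplification(rho_site, vs_site,
                                     rho_ref=2800.0, vs_ref=3500.0):
        """Square-root-of-impedance amplification estimate: the ratio of reference
        (bedrock) impedance to near-surface impedance, under the square root.
        Densities in kg/m^3, velocities in m/s; reference values are illustrative."""
        return math.sqrt((rho_ref * vs_ref) / (rho_site * vs_site))

    # Soft near-surface site: low density, low Vs -> amplification > 1
    amp = sqrt_impedance_amplification(rho_site=1800.0, vs_site=200.0)
    ```

    In practice the site impedance is a depth average (e.g. over a quarter wavelength), which is what ties the frequency dependence of the amplification to the depth-dependent slowness profile.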

  12. Current Approaches Toward Quantitative Mapping of the Interactome

    PubMed Central

    Buntru, Alexander; Trepte, Philipp; Klockmeier, Konrad; Schnoegl, Sigrid; Wanker, Erich E.

    2016-01-01

    Protein–protein interactions (PPIs) play a key role in many, if not all, cellular processes. Disease is often caused by perturbation of PPIs, as recently indicated by studies of missense mutations. To understand the associations of proteins and to unravel the global picture of PPIs in the cell, different experimental detection techniques for PPIs have been established. Genetic and biochemical methods such as the yeast two-hybrid system or affinity purification-based approaches are well suited to high-throughput, proteome-wide screening and are mainly used to obtain qualitative results. However, they have been criticized for not reflecting the cellular situation or the dynamic nature of PPIs. In this review, we provide an overview of various genetic methods that go beyond qualitative detection and allow quantitative measuring of PPIs in mammalian cells, such as dual luminescence-based co-immunoprecipitation, Förster resonance energy transfer or luminescence-based mammalian interactome mapping with bait control. We discuss the strengths and weaknesses of different techniques and their potential applications in biomedical research. PMID:27200083

  13. A faster and economical approach to floodplain mapping using the SSURGO soil database

    NASA Astrophysics Data System (ADS)

    Sangwan, N.; Merwade, V.

    2014-12-01

    Floods are the most damaging of all natural disasters, adversely affecting millions of lives and causing financial losses worth billions of dollars every year across the globe. Flood inundation maps play a key role in the assessment and mitigation of potential flood hazards. However, there are several communities in the United States where flood risk maps are not available due to the lack of the resources needed to create such maps through the conventional modeling approach. The objective of this study is to develop and examine an economical alternative approach to floodplain mapping using widely available SSURGO soil data in the United States. By using the state of Indiana as a test case, floodplain maps are developed for the entire state by identifying the flood-prone soil map units based on their attributes recorded in the SSURGO database. For validation, the flood extents obtained from the soil data are compared with the extents predicted by other floodplain maps, including the Federal Emergency Management Agency (FEMA) issued Flood Insurance Rate Maps (FIRM), flood extents observed during past floods, and other flood maps derived using Digital Elevation Models (DEMs). In general, SSURGO based floodplain maps are found to be largely in agreement with flood inundation maps created by FEMA. Comparison between the FEMA maps and the SSURGO derived floodplain maps show an overlap ranging from 65 to 90 percent. Similar results are also found when the SSURGO derived floodplain maps are compared with FEMA maps for recent flood events in other states including Minnesota, Washington and Wisconsin. Although not in perfect conformance with reference flood maps, the SSURGO soil data approach offers an economical and faster alternative to floodplain mapping in areas where detailed flood modeling and mapping has not been conducted.
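    The reported 65 to 90 percent overlap can be computed, for two rasterized flood extents, as the fraction of cells flooded in one map that are also flooded in the other; a minimal sketch with toy grids (the arrays below are illustrative, not SSURGO or FEMA data):

    ```python
    import numpy as np

    def overlap_percent(map_a: np.ndarray, map_b: np.ndarray) -> float:
        """Percent of cells flooded in map_a that are also flooded in map_b."""
        a = map_a.astype(bool)
        b = map_b.astype(bool)
        if a.sum() == 0:
            return 0.0
        return 100.0 * np.logical_and(a, b).sum() / a.sum()

    # Toy 2x3 flood extents (1 = flooded)
    fema   = np.array([[1, 1, 0], [1, 0, 0]])
    ssurgo = np.array([[1, 1, 0], [0, 0, 1]])
    ```

    Note the measure is asymmetric: overlap_percent(fema, ssurgo) need not equal overlap_percent(ssurgo, fema), so comparisons should state which map is the reference.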

  14. Nonequilibrium Dynamical Mean-Field Theory for Bosonic Lattice Models

    NASA Astrophysics Data System (ADS)

    Strand, Hugo U. R.; Eckstein, Martin; Werner, Philipp

    2015-01-01

    We develop the nonequilibrium extension of bosonic dynamical mean-field theory and a Nambu real-time strong-coupling perturbative impurity solver. In contrast to Gutzwiller mean-field theory and strong-coupling perturbative approaches, nonequilibrium bosonic dynamical mean-field theory captures not only dynamical transitions but also damping and thermalization effects at finite temperature. We apply the formalism to quenches in the Bose-Hubbard model, starting from both the normal and the Bose-condensed phases. Depending on the parameter regime, one observes qualitatively different dynamical properties, such as rapid thermalization, trapping in metastable superfluid or normal states, as well as long-lived or strongly damped amplitude oscillations. We summarize our results in nonequilibrium "phase diagrams" that map out the different dynamical regimes.

  15. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.

  16. Study of hole pair condensation based on the SU(2) Slave-Boson approach to the t-J Hamiltonian: Temperature, momentum and doping dependences of spectral functions

    SciTech Connect

    Salk, S.H.S.; Lee, S.S.

    1999-11-01

    Based on the U(1) and SU(2) slave-boson approaches to the t-J Hamiltonian, the authors evaluate the one electron spectral functions for the hole doped high-Tc cuprates for comparison with the angle resolved photoemission spectroscopy (ARPES) data. They find that the observed quasiparticle peak in the superconducting state is correlated with the hump which exists in the normal state. They find that the spectral weight of the quasiparticle peak increases as doping rate increases, which is consistent with observation. As a consequence of the phase fluctuation effects of the spinon and holon pairing order parameters, the spectral weight of the predicted peak obtained from the SU(2) theory is found to be smaller than the one predicted from U(1) mean field theory.

  17. Decoherence of spin-deformed bosonic model

    SciTech Connect

    Dehdashti, Sh.; Mahdifar, A.; Bagheri Harouni, M.; Roknizadeh, R.

    2013-07-15

    The decoherence rate and some parameters affecting it are investigated for the generalized spin-boson model. We consider the spin-bosonic model when the bosonic environment is modeled by the deformed harmonic oscillators. We show that the state of the environment approaches a non-linear coherent state. Then, we obtain the decoherence rate of a two-level system which is in contact with a deformed bosonic environment which is either in thermal equilibrium or in the ground state. By using some recent realization of f-deformed oscillators, we show that some physical parameters strongly affect the decoherence rate of a two-level system. Highlights: • Decoherence of the generalized spin-boson model is considered. • In this model the environment consists of f-oscillators. • Via the interaction, the state of the environment approaches non-linear coherent states. • Effective parameters on decoherence are considered.

  18. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    ERIC Educational Resources Information Center

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  19. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-07-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff and flood predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP-mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexities of automatic DRP-mapping approaches affect hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison, and a deviation map were derived. The automatically derived DRP maps were used in synthetic runoff simulations with an adapted version of the PREVAH hydrological model, and simulation results compared with those from simulations using the reference maps. The DRP maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
Furthermore, we argue not to use
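    The "measures of agreement" used to compare the automatically derived DRP maps against the reference maps can be illustrated with Cohen's kappa, a standard chance-corrected agreement statistic for categorical maps (a generic sketch, not the authors' exact implementation; the class labels are toy data):

    ```python
    from collections import Counter

    def cohens_kappa(map_a, map_b):
        """Chance-corrected agreement between two categorical maps,
        given as equal-length flat sequences of class labels."""
        n = len(map_a)
        po = sum(a == b for a, b in zip(map_a, map_b)) / n      # observed agreement
        ca, cb = Counter(map_a), Counter(map_b)
        pe = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)    # chance agreement
        return (po - pe) / (1 - pe)

    # Toy DRP class labels for five cells of two maps
    kappa = cohens_kappa(list("AABBC"), list("AABCC"))
    ```

    Kappa is 1 for perfect agreement and 0 for agreement no better than chance, which makes it better suited than raw percent agreement when class frequencies are unbalanced.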

  20. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, M.; Buss, R.; Scherrer, S.; Margreth, M.; Zappa, M.

    2015-12-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexity of automatic DRP mapping approaches affects hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison and a deviation map were derived. The automatically derived DRP-maps were used in synthetic runoff simulations with an adapted version of the hydrological model PREVAH, and simulation results compared with those from simulations using the reference maps. The DRP-maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP-maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
We therefore recommend not only using expert

  1. Quantitative architectural analysis: a new approach to cortical mapping.

    PubMed

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-11-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, objective mapping procedures based on quantitative cytoarchitecture have been generated. As a result, new maps for various species including man were established. In our contribution, principles of quantitative cytoarchitecture and algorithm-based cortical mapping are described for a cytoarchitectural parcellation of the human auditory cortex. Defining cortical borders based on quantified changes in cortical lamination is the decisive step towards a novel, highly improved probabilistic brain atlas.

  2. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    PubMed

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets.
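    The topological step of BMS, keeping only regions not connected to the image border, can be sketched with a small flood fill; this is an illustrative stand-in for one Boolean map, not the authors' implementation, which averages many randomly thresholded maps:

    ```python
    import numpy as np

    def surrounded_regions(boolean_map: np.ndarray) -> np.ndarray:
        """Mask of True pixels not 4-connected to the image border, i.e. the
        'surrounded' regions of one Boolean map."""
        h, w = boolean_map.shape
        reachable = np.zeros_like(boolean_map, dtype=bool)
        # Seed the fill with every foreground pixel touching the border
        stack = [(i, j) for i in range(h) for j in range(w)
                 if boolean_map[i, j] and (i in (0, h - 1) or j in (0, w - 1))]
        while stack:
            i, j = stack.pop()
            if reachable[i, j]:
                continue
            reachable[i, j] = True
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w and boolean_map[ni, nj]:
                    stack.append((ni, nj))
        return boolean_map & ~reachable

    # Toy Boolean map: an interior 2x2 blob (surrounded) and a border pixel (not)
    bmap = np.array([[0, 0, 0, 0, 0],
                     [0, 1, 1, 0, 0],
                     [0, 1, 1, 0, 0],
                     [1, 0, 0, 0, 0],
                     [0, 0, 0, 0, 0]], dtype=bool)
    ```

    Summing such masks over many thresholded feature maps yields the attention map that BMS then post-processes into a saliency map.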

  3. Wormholes and Goldstone bosons

    SciTech Connect

    Lee, K.

    1988-07-18

    The quantum theory of a complex scalar field coupled to gravity is considered. A formalism for the semiclassical approach in Euclidean time is developed and used to study wormhole physics. The conserved global charge plays an essential role. Wormhole physics turns on only after the symmetry is spontaneously broken. An effective self-interaction for Goldstone bosons due to wormholes and child universes is shown to be a cosine potential, whose vacuum energy will be reduced by the cosmic expansion. Some implications and questions are discussed.

  4. Mapping

    ERIC Educational Resources Information Center

    Kinney, Douglas M.; McIntosh, Willard L.

    1978-01-01

    Geologic mapping in the United States increased by about one-quarter in the past year. Examinations of mapping trends were in the following categories: (1) Mapping at scales of 1:100,000; (2) Metric-scale base maps; (3) International mapping, and (4) Planetary mapping. (MA)

  5. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

    Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is presently proposed for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perception and understanding complex relationships. PMID:22325583
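    The dynamics of a fuzzy cognitive map, where concept activations are repeatedly squashed through weighted influences until they settle, can be sketched as follows; the three-concept map and its weights are illustrative assumptions, not the stakeholder maps from this study:

    ```python
    import math

    def fcm_step(state, weights):
        """One FCM update: each concept's new activation is the sigmoid of the
        weighted activations of the concepts influencing it (weights[j][i] is
        the influence of concept j on concept i)."""
        n = len(state)
        return [1.0 / (1.0 + math.exp(-sum(weights[j][i] * state[j]
                                           for j in range(n))))
                for i in range(n)]

    def run_fcm(state, weights, steps=50):
        """Iterate until (in practice) the activations converge."""
        for _ in range(steps):
            state = fcm_step(state, weights)
        return state

    # Illustrative 3-concept chain: governance -> infrastructure -> environment
    W = [[0.0, 0.8, 0.0],
         [0.0, 0.0, 0.6],
         [0.0, 0.0, 0.0]]
    final = run_fcm([1.0, 0.5, 0.5], W)
    ```

    Inspecting the converged activations shows how a driver concept (here governance) propagates through the causal chain, which is how FCM analyses turn stakeholder-drawn maps into comparable priorities.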

  6. Transboundary aquifer mapping and management in Africa: a harmonised approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Yvan; Villholth, Karen G.

    2013-11-01

    Recent attention to transboundary aquifers (TBAs) in Africa reflects the growing importance of these resources for development in the continent. However, relatively little research on these aquifers and their best management strategies has been published. This report recapitulates progress on mapping and management frameworks for TBAs in Africa. The world map on transboundary aquifers presented at the 6th World Water Forum in 2012 identified 71 TBA systems in Africa. This report presents an updated African TBA map including 80 shared aquifers and aquifer systems superimposed on 63 international river basins. Furthermore, it proposes a new nomenclature for the mapping based on three sub-regions, reflecting the leading regional development communities. The map shows that TBAs represent approximately 42% of the continental area and 30% of the population. Finally, a brief review of current international law, specific bi- or multilateral treaties, and TBA management practice in Africa reveals few documented international conflicts over TBAs. The existing or upcoming international river and lake basin organisations offer a harmonised institutional base for TBA management, while alternative or supportive models involving the regional development communities are also required. The proposed map and geographical classification scheme for TBAs facilitate identification of options for joint institutional setups.

  7. Higgs boson at LHC: a diffractive opportunity

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2009-03-23

    An alternative process is presented for diffractive Higgs boson production in peripheral pp collisions, where the particles interact through the Double Pomeron Exchange. The event rate is computed as a central-rapidity distribution for Tevatron and LHC energies leading to a result around 0.6 pb, higher than the predictions from previous approaches. Therefore, this result arises as an enhanced signal for the detection of the Higgs boson in hadron colliders. The predictions for the Higgs boson photoproduction are compared to the ones obtained from a similar approach proposed by the Durham group, enabling an analysis of the future developments of its application to pp and AA collisions.

  8. How Albot0 finds its way home: a novel approach to cognitive mapping using robots.

    PubMed

    Yeap, Wai K

    2011-10-01

    Much of what we know about cognitive mapping comes from observing how biological agents behave in their physical environments, and several of these ideas were implemented on robots, imitating such a process. In this paper a novel approach to cognitive mapping is presented whereby robots are treated as a species of their own and their cognitive mapping is investigated. Such robots are referred to as Albots. The design of the first Albot, Albot0, is presented. Albot0 computes an imprecise map and employs a novel method to find its way home. Both the map and the return-home algorithm exhibited characteristics commonly found in biological agents. What we have learned from Albot0's cognitive mapping is discussed. One major lesson is that the spatiality in a cognitive map affords us rich and useful information, and this argues against recent suggestions that the notion of a cognitive map is not a useful one. PMID:25164506

  10. The Higgs Boson.

    ERIC Educational Resources Information Center

    Veltman, Martinus J. G.

    1986-01-01

    Reports recent findings related to the particle Higgs boson and examines its possible contribution to the standard model of elementary processes. Critically explores the strengths and uncertainties of the Higgs boson and proposed Higgs field. (ML)

  11. Composite Fermion Theory for Bosonic Quantum Hall States on Lattices

    NASA Astrophysics Data System (ADS)

    Möller, G.; Cooper, N. R.

    2009-09-01

    We study the ground states of the Bose-Hubbard model in a uniform magnetic field, motivated by the physics of cold atomic gases on lattices at high vortex density. Mapping the bosons to composite fermions (CF) leads to the prediction of quantum Hall fluids that have no counterpart in the continuum. We construct trial states for these phases and test numerically the predictions of the CF model. We establish the existence of strongly correlated phases beyond those in the continuum limit and provide evidence for a wider scope of the composite fermion approach beyond its application to the lowest Landau level.

  12. Eulerian Mapping Closure Approach for Probability Density Function of Concentration in Shear Flows

    NASA Technical Reports Server (NTRS)

    He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Eulerian mapping closure approach is developed for uncertainty propagation in computational fluid mechanics. The approach is used to study the Probability Density Function (PDF) for the concentration of species advected by a random shear flow. An analytical argument shows that fluctuation of the concentration field at one point in space is non-Gaussian and exhibits stretched exponential form. An Eulerian mapping approach provides an appropriate approximation to both convection and diffusion terms and leads to a closed mapping equation. The results obtained describe the evolution of the initial Gaussian field, which is in agreement with direct numerical simulations.
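
The mapping-closure construction summarized above can be sketched in a few lines; the notation (Gaussian reference field, mapping X) is generic mapping-closure notation and is assumed rather than taken from the paper:

```latex
% Sketch of the mapping-closure idea. The scalar (concentration) field
% \theta is represented as a mapping X of a Gaussian reference field
% \eta_0,
\theta(\mathbf{x},t) = X\big(\eta_0(\mathbf{x}), t\big),
% so that the one-point PDF follows by a change of variables,
P_\theta(\psi, t) = P_G(\eta)\,
  \left|\frac{\partial X(\eta,t)}{\partial \eta}\right|^{-1}
  \Bigg|_{\psi = X(\eta,t)},
% where P_G is the standard Gaussian density. Evolving X (rather than
% P_\theta directly) closes the convection and diffusion terms; strongly
% nonlinear mappings X generate the non-Gaussian, stretched-exponential
% tails described in the abstract.
```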

  13. Raman mapping of oral buccal mucosa: a spectral histopathology approach

    NASA Astrophysics Data System (ADS)

    Behl, Isha; Kukreja, Lekha; Deshmukh, Atul; Singh, S. P.; Mamgain, Hitesh; Hole, Arti R.; Krishna, C. Murali

    2014-12-01

    Oral cancer is one of the most common cancers worldwide. One-fifth of the world's oral cancer subjects are from India and other South Asian countries. The present Raman mapping study was carried out to understand biochemical variations in normal and malignant oral buccal mucosa. Data were acquired using a WITec alpha 300R instrument from 10 normal and 10 tumor unstained tissue sections. Raman maps of normal sections could resolve the layers of epithelium, i.e. basal, intermediate, and superficial. Inflammatory, tumor, and stromal regions are distinctly depicted on Raman maps of tumor sections. Mean and difference spectra of basal and inflammatory cells suggest an abundance of DNA and carotenoid features. Strong cytochrome bands are observed in intermediate layers of normal sections and stromal regions of tumor sections. Epithelium and stromal regions of normal sections are classified by principal component analysis. Classification among cellular components of normal and tumor sections is also observed. Thus, the findings of the study further support the applicability of Raman mapping for providing molecular level insights in normal and malignant conditions.

  14. Approaches to Mapping Nitrogen Removal: Examples at a Landscape Scale

    EPA Science Inventory

    Wetlands can provide the ecosystem service of improved water quality via nitrogen removal, providing clean drinking water and reducing the eutrophication of aquatic resources. Within the ESRP, mapping nitrogen removal by wetlands is a service that incorporates the goals of the ni...

  15. Mapping Sustainability Initiatives across a Region: An Innovative Survey Approach

    ERIC Educational Resources Information Center

    Somerville, Margaret; Green, Monica

    2012-01-01

    The project of mapping sustainability initiatives across a region is part of a larger program of research about place and sustainability education for the Anthropocene, the new geological age of human-induced planetary changes (Zalasiewicz, Williams, Steffen, & Crutzen, 2010). The study investigated the location, nature and type of sustainability…

  16. A New Approach for Constructing the Concept Map

    ERIC Educational Resources Information Center

    Tseng, Shian-Shyong; Sue, Pei-Chi; Su, Jun-Ming; Weng, Jui-Feng; Tsai, Wen-Nung

    2007-01-01

    In recent years, e-learning system has become more and more popular and many adaptive learning environments have been proposed to offer learners customized courses in accordance with their aptitudes and learning results. For achieving the adaptive learning, a predefined concept map of a course is often used to provide adaptive learning guidance…

  17. The Facebook influence model: a concept mapping approach.

    PubMed

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts.

  18. The Facebook Influence Model: A Concept Mapping Approach

    PubMed Central

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  19. Cognitions of Expert Supervisors in Academe: A Concept Mapping Approach

    ERIC Educational Resources Information Center

    Kemer, Gülsah; Borders, L. DiAnne; Willse, John

    2014-01-01

    Eighteen expert supervisors reported their thoughts while preparing for, conducting, and evaluating their supervision sessions. Concept mapping (Kane & Trochim, [Kane, M., 2007]) yielded 195 cognitions classified into 25 cognitive categories organized into 5 supervision areas: conceptualization of supervision, supervisee assessment,…

  20. Complementarity between nonstandard Higgs boson searches and precision Higgs boson measurements in the MSSM

    DOE PAGESBeta

    Carena, Marcela; Haber, Howard E.; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E. M.

    2015-02-03

    Precision measurements of the Higgs boson properties at the LHC provide relevant constraints on possible weak-scale extensions of the Standard Model (SM). In the context of the minimal supersymmetric Standard Model (MSSM) these constraints seem to suggest that all the additional, non-SM-like Higgs bosons should be heavy, with masses larger than about 400 GeV. This article shows that such results do not hold when the theory approaches the conditions for “alignment independent of decoupling,” where the lightest CP-even Higgs boson has SM-like tree-level couplings to fermions and gauge bosons, independently of the nonstandard Higgs boson masses. In addition, the combination of current bounds from direct Higgs boson searches at the LHC, along with the alignment conditions, have a significant impact on the allowed MSSM parameter space yielding light additional Higgs bosons. In particular, after ensuring the correct mass for the lightest CP-even Higgs boson, we find that precision measurements and direct searches are complementary and may soon be able to probe the region of non-SM-like Higgs boson with masses below the top quark pair mass threshold of 350 GeV and low to moderate values of tanβ.

  2. Generalizing geological maps with the GeoScaler software: The case study approach

    NASA Astrophysics Data System (ADS)

    Smirnoff, Alex; Huot-Vézina, Gabriel; Paradis, Serge J.; Boivin, Ruth

    2012-03-01

    Map generalization is rapidly becoming an important task in surficial and bedrock geology as broader regional and cross-boundary compilations are made from maps originally describing more specific areas. However, the entire process is still not defined in sufficient detail and relatively few automated tools are available. Moreover, the existing tools are primarily designed for generalization of topographic maps and do not address the needs specific to geology. Here we present two case studies describing our approach to the generalization of surficial and bedrock geology maps, respectively. To accomplish the task, we employed the GeoScaler software (Version 2009) developed at the Laboratoire de cartographie numérique et de photogrammétrie (LCNP) of the Quebec division of the Geological Survey of Canada. The software is freely available on the Internet but requires an ArcGIS (ArcInfo) license. Four surficial geology maps at 1:250,000 scale were produced from 14 maps scaled at 1:100,000, while a single compilation of six bedrock maps was generalized from 1:125,000 to 1:500,000 scale. We describe the general considerations required to approach any generalization exercise, the software applied, objectives, input data, major generalization steps, and the final results. All generalized maps were favorably evaluated by experts in geological mapping and the surficial maps have been published.

  3. Concept Map Engineering: Methods and Tools Based on the Semantic Relation Approach

    ERIC Educational Resources Information Center

    Kim, Minkyu

    2013-01-01

    The purpose of this study is to develop a better understanding of technologies that use natural language as the basis for concept map construction. In particular, this study focuses on the semantic relation (SR) approach to drawing rich and authentic concept maps that reflect students' internal representations of a problem situation. The…

  4. A Time Sequence-Oriented Concept Map Approach to Developing Educational Computer Games for History Courses

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Yang, Kai-Hsiang; Chen, Jing-Hong

    2015-01-01

    Concept maps have been recognized as an effective tool for students to organize their knowledge; however, in history courses, it is important for students to learn and organize historical events according to the time of their occurrence. Therefore, in this study, a time sequence-oriented concept map approach is proposed for developing a game-based…

  5. Mapping baroreceptor function to genome: a mathematical modeling approach.

    PubMed Central

    Kendziorski, C M; Cowley, A W; Greene, A S; Salgado, H C; Jacob, H J; Tonellato, P J

    2002-01-01

    To gain information about the genetic basis of a complex disease such as hypertension, blood pressure averages are often obtained and used as phenotypes in genetic mapping studies. In contrast, direct measurements of physiological regulatory mechanisms are not often obtained, due in large part to the time and expense required. As a result, little information about the genetic basis of physiological controlling mechanisms is available. Such information is important for disease diagnosis and treatment. In this article, we use a mathematical model of blood pressure to derive phenotypes related to the baroreceptor reflex, a short-term controller of blood pressure. The phenotypes are then used in a quantitative trait loci (QTL) mapping study to identify a potential genetic basis of this controller. PMID:11973321

  6. Comparison of Sub-pixel Classification Approaches for Crop-specific Mapping

    EPA Science Inventory

    The Moderate Resolution Imaging Spectroradiometer (MODIS) data has been increasingly used for crop mapping and other agricultural applications. Phenology-based classification approaches using the NDVI (Normalized Difference Vegetation Index) 16-day composite (250 m) data product...

  7. Slave boson theories of correlated electron systems

    SciTech Connect

    Woelfle, P.

    1995-05-01

    Slave boson theories of various models of correlated fermions are critically reviewed and several new results are presented. In the example of the Anderson impurity model the limitations of slave boson mean field theory are discussed. Self-consistent conserving approximations are compared with results obtained from the numerical renormalization group. The gauge field theory of the t-J-model is considered in the quasistatic approximation. It is shown that weak localization effects can give valuable information on the existence of gauge fields. Applications of the slave-boson approach due to Kotliar and Ruckenstein to the Hubbard model are also discussed.

  8. Stationkeeping Approach for the Microwave Anisotropy Probe (MAP)

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, Dave; Schiff, Conrad

    2002-01-01

    The Microwave Anisotropy Probe was successfully launched on June 30, 2001 and placed into a Lissajous orbit about the L2 Sun-Earth-Moon libration point. However, the L2 libration point is unstable, which necessitates occasional stationkeeping maneuvers in order to maintain the spacecraft's Lissajous orbit. Analyses were performed in order to develop a feasible L2 stationkeeping strategy for the MAP mission. The resulting strategy meets the allotted fuel budget, allowing for enough fuel to handle additional ΔV taxes, while meeting the attitude requirements for the maneuvers. Results from the first two stationkeeping maneuvers are included.

  9. Ray mapping approach for the efficient design of continuous freeform surfaces.

    PubMed

    Bösel, Christoph; Gross, Herbert

    2016-06-27

    The efficient design of continuous freeform surfaces, which map a given light source to an arbitrary target illumination pattern, remains a challenging problem and is considered here for collimated input beams. A common approach is ray-mapping methods, where first a ray mapping between the source and the irradiance distribution on the target plane is calculated and in a subsequent step the surface is constructed. The challenging aspect of this approach is to find an integrable mapping ensuring a continuous surface. Based on the law of reflection/refraction and an integrability condition, we derive a general condition for the surface and ray mapping for a collimated input beam. It is shown that in a small-angle approximation a proper mapping can be calculated via optimal mass transport, a mathematical framework for the calculation of a mapping between two positive density functions. We show that the surface can be constructed by solving a linear advection equation with appropriate boundary conditions. The results imply that the optimal mass transport mapping is approximately integrable over a wide range of distances between the freeform and the target plane and offer an efficient way to construct the surface by solving standard integrals. The efficiency is demonstrated by applying it to two challenging design examples, which shows the ability of the presented approach to handle target illumination patterns with steep irradiance gradients and numerous gray levels. PMID:27410583
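
In one dimension the optimal-mass-transport step described above reduces to monotone rearrangement of the two densities, which makes for a compact illustration; the source and target irradiances below are invented stand-ins, not data from the paper:

```python
# 1D illustration of the optimal-mass-transport (monotone rearrangement)
# mapping between two positive densities: T = G^{-1} o F, where F and G
# are the cumulative distributions of the source and target irradiance.
# Both densities are hypothetical examples.
import numpy as np

x = np.linspace(-1.0, 1.0, 2001)
f = np.exp(-4.0 * x**2)    # Gaussian-like collimated source beam
g = np.ones_like(x)        # uniform target illumination

def cdf(density):
    c = np.cumsum(density)
    return c / c[-1]       # normalized cumulative distribution on x

F, G = cdf(f), cdf(g)
T = np.interp(F, G, x)     # T(x) = G^{-1}(F(x)), monotone by construction
```

The monotonicity of T is what makes the 1D map trivially integrable; the difficulty addressed in the paper is that in 2D the analogous curl-free (integrability) condition is no longer automatic.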

  10. High-resolution habitat mapping on mud fields: new approach to quantitative mapping of Ocean quahog.

    PubMed

    Isachenko, Artem; Gubanova, Yana; Tzetlin, Alexander; Mokievsky, Vadim

    2014-12-01

    During 2009-2012, stocks of the bivalve Arctica islandica (Linnaeus, 1767) (Ocean quahog) in Kandalaksha Bay (the White Sea) were assessed using a side-scan sonar, grab sampling and underwater photo imaging. Structurally uniform localities were highlighted on the basis of the side-scan signal. Each type of signal reflects a combination of sediment type, microtopography and structural characteristics of the benthic community. The distribution of A. islandica was the predominant factor in determining community structure. Seabed attributes considered most significant were defined for each substrate type. Relations between sonar signal and sediment type were used for landscape mapping based on sonar data. Community characteristics at known localities were reliably interpolated to the area of survey using statistical processing of geophysical data. A method of integrated sonar and sampling data interpretation for high-resolution mapping of A. islandica by biomass groups, benthic faunal groups and associated habitats was developed.

  11. An integrated approach for automated cover-type mapping of large inaccessible areas in Alaska

    USGS Publications Warehouse

    Fleming, Michael D.

    1988-01-01

    The lack of any detailed cover-type maps in the state necessitated that a rapid and accurate approach be employed to develop maps for 329 million acres of Alaska within a seven-year period. This goal has been addressed by using an integrated approach to computer-aided analysis which combines efficient use of field data with the only consistent statewide spatial data sets available: Landsat multispectral scanner data, digital elevation data derived from 1:250,000-scale maps, and 1:60,000-scale color-infrared aerial photographs.

  12. A FISH approach for mapping the human genome using Bacterial Artificial Chromosomes (BACs)

    SciTech Connect

    Hubert, R.S.; Chen, X.N.; Mitchell, S.

    1994-09-01

    As the Human Genome Project progresses, large insert cloning vectors such as BACs, P1, and P1 Artificial Chromosomes (PACs) will be required to complement the YAC mapping efforts. The value of the BAC vector for physical mapping lies in the stability of the inserts, the lack of chimerism, the length of inserts (up to 300 kb), the ability to obtain large amounts of pure clone DNA and the ease of BAC manipulation. These features helped us design two approaches for generating physical mapping reagents for human genetic studies. The first approach is a whole genome strategy in which randomly selected BACs are mapped, using FISH, to specific chromosomal bands. To date, 700 BACs have been mapped to single chromosome bands at a resolution of 2-5 Mb in addition to BACs mapped to 14 different centromeres. These BACs represent more than 90 Mb of the genome and include >70% of all human chromosome bands at the 350-band level. These data revealed that >97% of the BACs were non-chimeric and have a genomic distribution covering most gaps in the existing YAC map with excellent coverage of gene-rich regions. In the second approach, we used YACs to identify BACs on chromosome 21. A 1.5 Mb contig between D21S339 and D21S220 nears completion within the Down syndrome congenital heart disease (DS-CHD) region. Seventeen BACs ranging in size from 80 kb to 240 kb were ordered using 14 STSs with FISH confirmation. We have also used 40 YACs spanning 21q to identify, on average, >1 BAC/Mb to provide molecular cytogenetic reagents and anchor points for further mapping. The contig generated on chromosome 21 will be helpful in isolating the genes for DS-CHD. The physical mapping reagents generated using the whole genome approach will provide cytogenetic markers and mapped genomic fragments that will facilitate positional cloning efforts and the identification of genes within most chromosomal bands.

  13. Boson representations of fermion systems: Proton-neutron systems

    SciTech Connect

    Sambataro, M.

    1988-05-01

    Applications of a procedure recently proposed to construct boson images of fermion Hamiltonians are shown for proton-neutron systems. First the mapping from SD fermion onto sd boson spaces is discussed and a Q_π × Q_ν interaction investigated. A Hermitian one-body Q boson operator is derived and analytical expressions for its coefficients are obtained. A (Q_π + Q_ν) × (Q_π + Q_ν) interaction is then studied for particle-hole systems and the connections with the SU*(3) dynamical symmetry of the neutron-proton interacting boson model are discussed. Finally, an example of mapping from SDG onto sdg spaces is analyzed. Fermion spectra and E2 matrix elements are well reproduced in the boson spaces.

  14. Singlet Mott State Simulating the Bosonic Laughlin Wave Function

    NASA Astrophysics Data System (ADS)

    Lian, Biao; Zhang, Shou-Cheng

    2014-03-01

    We study properties of a class of spin singlet Mott states for arbitrary spin S bosons on a lattice, with particle number per site n = S/l + 1, where l is a positive integer. We show that such a singlet Mott state can be mapped to a bosonic Laughlin wave function on the sphere with a finite number of particles at filling ν = 1/2l. Bosonic spinons, particle and hole excitations in the Mott state are discussed, among which the hole excitation can be mapped to the quasi-hole of the bosonic Laughlin wave function. We show that this singlet Mott state can be realized in a cold atom system on an optical lattice, and can be identified using Bragg spectroscopy and Stern-Gerlach techniques. This class of singlet Mott states may be generalized to simulate bosonic Laughlin states with filling ν = q/2l.

  15. Anomalous gauge boson couplings

    SciTech Connect

    Barklow, T.; Rizzo, T.; Baur, U.

    1997-01-13

    The measurement of anomalous gauge boson self couplings is reviewed for a variety of present and planned accelerators. Sensitivities are compared for these accelerators using models based on the effective Lagrangian approach. The sensitivities described here are for measurement of "generic" parameters κ_V, λ_V, etc., defined in the text. Pre-LHC measurements will not probe these coupling parameters to precision better than O(10⁻¹). The LHC should be sensitive to better than O(10⁻²), while a future NLC should achieve sensitivity of O(10⁻³) to O(10⁻⁴) for center-of-mass energies ranging from 0.5 to 1.5 TeV.

  16. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193
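
The two computational steps above, PCA on per-region time series followed by a bivariate classification of the first two component scores, can be sketched compactly; the data here are synthetic stand-ins, not census tracts:

```python
# Sketch of bicomponent trend mapping: PCA on regional time series, then
# a 3x3 bivariate classification of the two leading component scores
# (one cell per combination of low/mid/high on each component). All data
# are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_years = 200, 6                 # e.g. decennial censuses 1950-2000
t = np.linspace(-1.0, 1.0, n_years)

# synthetic trends: random mixtures of a linear and a quadratic pattern
mix = rng.normal(size=(n_regions, 2))
X = mix[:, :1] * t + mix[:, 1:] * t**2 + 0.1 * rng.normal(size=(n_regions, n_years))

# PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :2] * s[:2]                   # each region's two component scores

def tercile(v):
    """Classify values as 0/1/2 by the terciles of their distribution."""
    lo, hi = np.quantile(v, [1 / 3, 2 / 3])
    return np.digitize(v, [lo, hi])

# cell index 0..8 in the bicomponent trend matrix; in the mapping step each
# cell would be assigned one color of a bivariate choropleth scheme
trend_class = 3 * tercile(scores[:, 0]) + tercile(scores[:, 1])
```

The rows of `Vt` play the role of the "typical trend types": each region's trajectory is approximately its two scores times the first two component patterns.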

  17. Evaluating different mapping approaches of dominant runoff processes with similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2015-04-01

    The identification of landscape units with similar hydrologic response behaviour is crucial for runoff prediction in ungauged basins. An established method for catchment classification is based on the dominant runoff process (DRP) concept (Grayson & Blöschl, 2000). Different DRP-mapping approaches exist and differ in several aspects, such as the time and data required for mapping. On the one hand, manual approaches based on intensive field investigations and expert knowledge are reliable but time-consuming. On the other hand, GIS-based approaches are easier to realize but rely on simplifications that restrict their application range, so it is important to investigate to what extent these simplifications are transferable to other catchments. In this study, different GIS-based mapping approaches (Schmocker-Fackel et al., 2007; Müller et al., 2009; Gharari et al., 2011) were used to classify the DRPs of two catchments on the Swiss Plateau and were compared to manually derived DRP maps elaborated using the rule-based approach of Scherrer & Naef (2003). Similarity measures such as mapcurves (Hargrove et al., 2006) and fuzzy kappa statistics (Hagen-Zanker, 2009), as well as a categorical comparison, were computed. Furthermore, the different DRP-mapping approaches were evaluated through synthetic runoff simulations with an adapted version of the well-established hydrological model PREVAH (Viviroli et al., 2009). No single mapping approach is unconditionally suitable for arbitrary catchment characteristics. Generally, all approaches represent the areas where subsurface flow dominates well, whereas they exhibit difficulties with the mapping of very fast-responding and non-contributing areas.
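
For reference, the Mapcurves goodness-of-fit score used above for comparing two categorical maps can be sketched as follows; the formula is paraphrased from Hargrove et al. (2006) and the DRP category labels are hypothetical:

```python
# Sketch of the Mapcurves goodness-of-fit (GOF) score for two categorical
# maps on the same grid, here flattened to lists of cell labels. For a
# category a in map A: GOF(a) = sum over categories b in map B of
# (|a ∩ b| / |a|) * (|a ∩ b| / |b|). Simplified from the paper; the
# category labels below are invented.
from collections import Counter

def mapcurves_gof(map_a, map_b):
    """Per-category goodness of fit of map_a against map_b."""
    area_a = Counter(map_a)                 # cells per category in A
    area_b = Counter(map_b)                 # cells per category in B
    overlap = Counter(zip(map_a, map_b))    # cells per (A, B) category pair
    gof = {}
    for a, n_a in area_a.items():
        gof[a] = sum((c / n_a) * (c / area_b[b])
                     for (aa, b), c in overlap.items() if aa == a)
    return gof

# identical maps score exactly 1.0 for every category
m = ["SOF", "SSF", "SSF", "DP", "SOF", "DP"]
print(mapcurves_gof(m, m))   # {'SOF': 1.0, 'SSF': 1.0, 'DP': 1.0}
```

Each per-category score lies in (0, 1], so the scores can be compared directly across DRP classes to see which process areas the GIS-based maps reproduce well.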

  18. Engineering a robotic approach to mapping exposed volcanic fissures

    NASA Astrophysics Data System (ADS)

    Parcheta, C. E.; Parness, A.; Mitchell, K. L.

    2014-12-01

    Field geology provides a framework for advanced computer models and theoretical calculations of volcanic systems. Some field terrains, though, are poorly preserved or poorly accessible, making documentation, quantification, and investigation impossible. Over 200 volcanologists at the 2012 Kona Chapman Conference on volcanology agreed that an important step forward in the field over the next 100 years is to address the realistic size and shape of volcanic conduits. The 1969 Mauna Ulu eruption of Kīlauea provides a unique opportunity to document volcanic fissure conduits, giving us an ideal location to begin addressing this topic and to provide data on these geometries. Exposed fissures can be mapped with robotics using machine vision. In order to test the hypothesis that fissures have irregularities with depth that will influence their fluid-dynamical behavior, we must first map the fissure vents and shallow conduit to deci- or centimeter scale. We have designed, constructed, and field-tested the first version of a robotic device that will image an exposed volcanic fissure in three dimensions. The design phase included three steps: 1) create the payload harness and protective shell to prevent damage to the electronics and robot, 2) construct a circuit board allowing the electronics to communicate with a surface-based computer, and 3) prototype wheel shapes that can handle a variety of volcanic rock textures. The robot's mechanical parts were built using 3D printing, milling, casting, and laser-cutting techniques, and the electronics were assembled from off-the-shelf components. The testing phase took place at Mauna Ulu, Kīlauea, Hawai'i, from May 5-9, 2014. Many valuable design lessons were learned during the week, and the first-ever 3D maps from inside a volcanic fissure were successfully collected. Three vents had between 25% and 95% of their internal surfaces imaged. A fourth location, a non-eruptive crack (possibly a fault line), had two transects imaging the textures

  19. Endoscopic fluorescence mapping of the left atrium: A novel experimental approach for high resolution endocardial mapping in the intact heart

    PubMed Central

    Kalifa, Jérôme; Klos, Matthew; Zlochiver, Sharon; Mironov, Sergey; Tanaka, Kazuhiko; Ulahannan, Netha; Yamazaki, Masatoshi; Jalife, José; Berenfeld, Omer

    2007-01-01

    Background Despite availability of several mapping technologies to investigate the electrophysiological mechanisms of atrial fibrillation (AF), an experimental tool enabling high resolution mapping of electrical impulse on the endocardial surface of the left atrium is still lacking. Objective To present a new optical mapping approach implementing a steerable cardio-endoscope in isolated hearts. Methods The system consists of a direct or side-view endoscope coupled to a 532 nm excitation Laser for illumination, and to a CCD camera for imaging of potentiometric dye fluorescence (DI-4-ANEPPS, 80×80 pixels, 200–800 frames/sec). The cardio-endoscope was aimed successively at diverse posterior left atrial (PLA) locations to obtain high resolution movies of electrical wave propagation, as well as detailed endocardial anatomical features, in the presence and the absence of atrial stretch. Results We present several examples of high resolution endoscopic PLA recordings of wave propagation patterns during both sinus rhythm and AF with signal-to-noise ratio similar to conventional optical mapping systems. We demonstrate the endoscope’s ability to visualize highly organized AF sources (rotors) at specific locations on the PLA and PLA-pulmonary vein junctions, and present video images of waves emanating from such sources as they propagate into pectinate muscles in the LA appendage. In particular, we demonstrate this approach to be ideally suited for studying the effects of atrial stretch on AF dynamics. Conclusions In isolated hearts, cardio-endoscopic optical mapping of electrical activity should enable comprehensive evaluation of atrial fibrillatory activity in the PLA, of the role of the local anatomy on AF dynamics and of the efficacy of pharmacological and ablative interventions. PMID:17599678

  20. Mind Map Marketing: A Creative Approach in Developing Marketing Skills

    ERIC Educational Resources Information Center

    Eriksson, Lars Torsten; Hauer, Amie M.

    2004-01-01

    In this conceptual article, the authors describe an alternative course structure that joins learning key marketing concepts to creative problem solving. The authors describe an approach using a convergent-divergent-convergent (CDC) process: key concepts are first derived from case material to be organized in a marketing matrix, which is then used…

  1. Conjecture Mapping: An Approach to Systematic Educational Design Research

    ERIC Educational Resources Information Center

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  2. Mapping.

    ERIC Educational Resources Information Center

    Kinney, Douglas M.; McIntosh, Willard L.

    1979-01-01

    The area of geological mapping in the United States in 1978 increased greatly over that reported in 1977; state geological maps were added for California, Idaho, Nevada, and Alaska last year. (Author/BB)

  3. Mapping New Approaches in Program Evaluation: A Cross-Cultural Perspective.

    ERIC Educational Resources Information Center

    Gorostiaga, Jorge M.; Paulston, Rolland G.

    This paper examines new approaches to program evaluation and explores their possible utility in Latin American educational settings. Part 1 briefly discusses why new ideas for evaluating educational studies are needed. Part 2 examines seven new evaluative approaches as follows: (1) "Concept Mapping," a type of structural conceptualization; (2)…

  4. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    ERIC Educational Resources Information Center

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  5. Comparison of four Vulnerability Approaches to Mapping of Shallow Aquifers of Eastern Dahomey Basin of Nigeria

    NASA Astrophysics Data System (ADS)

    Oke, Saheed; Vermeulen, Danie

    2016-04-01

    This study presents the outcome of vulnerability mapping of the shallow aquifers of the eastern Dahomey Basin of southwestern Nigeria. The basin is a coastal transboundary aquifer extending from eastern Ghana to southwestern Nigeria. The study aimed to identify the most suitable method for mapping the basin's shallow aquifers by comparing the results of four different vulnerability approaches. This is important because vulnerability methods differ in their assessment parameters and approaches and can yield different results for the same aquifer. The methodology uses vulnerability techniques that assess the intrinsic properties of the aquifer: two methods from the travel-time approach (AVI and RTt) and two from the index approach (DRASTIC and PI) were employed in mapping the basin. The results show that the AVI, which has the fewest mapping parameters, classifies 75% of the basin as very high vulnerability and 25% as high vulnerability. The DRASTIC mapping shows 18% as low vulnerability, 61% as moderate vulnerability, and 21% as high vulnerability. Mapping with the PI method, which has the most parameters, shows 66% of the aquifer as low vulnerability and 34% as moderate vulnerability. The RTt method shows 18% as very high vulnerability, 8% as high vulnerability, 64% as moderate vulnerability, and 10% as very low vulnerability. Further analysis involving correlation plots shows the highest correlation, 62%, between the RTt and DRASTIC methods. The analysis shows that the PI method is the mildest and the AVI method the strictest of the vulnerability methods considered in this mapping. Using four different approaches to map the shallow aquifers of the eastern Dahomey Basin will guide the recommendation of the best vulnerability method for future assessments of this and other shallow aquifers.
Keywords: Aquifer vulnerability, Dahomey Basin
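    The index methods compared above combine rated hydrogeological parameters into a single vulnerability score; for DRASTIC, the standard form is a weighted sum of seven ratings. A minimal sketch — the weights are the standard DRASTIC weights, but the per-cell ratings below are invented for illustration:

```python
# Standard DRASTIC weights; per-cell ratings are site-specific integers 1-10:
# Depth to water, net Recharge, Aquifer media, Soil media, Topography,
# Impact of the vadose zone, hydraulic Conductivity
WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    """Weighted sum of the seven DRASTIC parameter ratings for one cell."""
    return sum(WEIGHTS[p] * r for p, r in ratings.items())

# Invented ratings for a single grid cell
cell = {"D": 7, "R": 6, "A": 5, "S": 6, "T": 9, "I": 4, "C": 2}
idx = drastic_index(cell)   # possible range: 23 (all 1s) to 230 (all 10s)
```

    The resulting index is then sliced into classes (e.g. low / moderate / high) to produce the percentage breakdowns reported in the abstract; the class boundaries are a separate, method-specific choice.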

  6. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years the scientific community has been concerned with increasing the accuracy of different classification methods, and major achievements have been made. Beyond this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes and different cluster configurations demonstrate the potential of the tool, as well as the aspects that affect its performance.
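    The MapReduce pattern underlying such a tool can be sketched in plain Python: a trained model is broadcast to the mappers, each mapper classifies its own data split independently, and a reducer merges the partial results. The trivial one-feature threshold "classifier" below stands in for the WEKA algorithms; all names and data are illustrative only.

```python
from collections import Counter
from functools import reduce

def train_threshold(labelled):
    """Toy 'classifier': the midpoint between the two class means."""
    lo = [x for x, y in labelled if y == 0]
    hi = [x for x, y in labelled if y == 1]
    return (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2

def map_phase(split, threshold):
    """Each mapper classifies its own data split and counts the labels."""
    return Counter(int(x > threshold) for x in split)

def reduce_phase(a, b):
    """The reducer merges partial class counts from the mappers."""
    return a + b

threshold = train_threshold([(1.0, 0), (2.0, 0), (8.0, 1), (9.0, 1)])
splits = [[0.5, 1.5, 9.5], [7.5, 2.5], [8.8, 0.2, 6.1]]   # data partitions
totals = reduce(reduce_phase, (map_phase(s, threshold) for s in splits))
```

    In Hadoop the splits would live on different nodes and the framework would handle shuffling and fault tolerance; the map/reduce decomposition itself is what makes the classification embarrassingly parallel.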

  7. Geomatics Approach for Assessment of respiratory disease Mapping

    NASA Astrophysics Data System (ADS)

    Pandey, M.; Singh, V.; Vaishya, R. C.

    2014-11-01

    Air quality is an important subject of present-day relevance because air is the prime resource for the sustenance of life, especially for human health. Vast amounts of ambient air quality data are generated, and technological advances make it possible to characterize the air environment and assess how good or bad the air is. This paper supplies a reliable method for assessing the Air Quality Index (AQI) using fuzzy logic. The fuzzy logic model is designed to predict a monthly AQI. With the aid of the air quality index, the environmental suitability of an area with regard to human health can be evaluated. To appraise human health status in an industrial area, information from a health survey questionnaire is used to derive a respiratory risk map by applying IDW interpolation and the Getis statistic. The Getis statistic identifies spatial clustering patterns such as hot spots, high-risk areas, and cold spots over the entire study area with statistical significance.
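    The IDW step can be sketched directly: the value at an unsampled point is the distance-weighted average of the station values, with weights 1/d^p. The station coordinates and AQI readings below are hypothetical.

```python
def idw(stations, target, power=2):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) stations."""
    num = den = 0.0
    for x, y, value in stations:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return value          # exactly on a station: use its value
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

# Hypothetical monthly AQI readings at three monitoring stations
stations = [(0.0, 0.0, 60.0), (4.0, 0.0, 120.0), (0.0, 3.0, 90.0)]
aqi_here = idw(stations, target=(1.0, 1.0))
```

    Larger `power` values localize the estimate more strongly around the nearest station; `power=2` is the common default for this kind of risk surface.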

  8. A desktop GIS approach to topographic mapping of surface saturation

    NASA Astrophysics Data System (ADS)

    Garroway, K.; Hopkinson, C.; Jamieson, R.; Boxall, J.

    2009-05-01

    Agricultural watersheds are generally highly modified environments. Accurately modelling topographic features in these environments can be difficult owing to surface modifications inherent to agricultural practice; this was addressed by collecting high-resolution topographic data. Airborne Laser Scanning (ALS) is a remote sensing technique whereby high-resolution, high-accuracy elevation data are collected across a landscape. In March 2006 an ALS dataset was collected in the Thomas Brook Watershed, located in the Annapolis Valley, Nova Scotia, for high-resolution modelling. Multiple topographic indices, including topographic position index, topographic wetness index, slope gradient, curvature, and catchment area, were modelled at 1 m, 5 m, and 10 m DEM resolutions. The models were then compared with ground-sampled soil surface moisture data collected during the 2006 and 2007 field seasons. A Student's t-test revealed that the topographic models agreed with the theories of surface wetness prediction, although the direct correlation between the models and the ground data was weak. A landform classification algorithm was augmented to incorporate the topographic models based on the theories of surface wetness prediction, and a surface saturation map was generated. Tests revealed that the 5 m DEM resolution yielded the most accurate results when compared directly with the ground-sampled surface moisture data. It was shown that the Surface Saturation Landform Classification algorithm can be used to predict zones of surface moisture throughout an agricultural watershed.
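    Among the indices listed, the topographic wetness index has a simple closed form, TWI = ln(a / tan β), where a is the specific catchment area and β the local slope; wet cells combine large contributing areas with gentle slopes. A minimal sketch with invented cell values:

```python
import math

def topographic_wetness_index(specific_catchment_area, slope_deg):
    """TWI = ln(a / tan(beta)); high values flag likely-saturated cells."""
    tan_b = math.tan(math.radians(slope_deg))
    if tan_b <= 0:
        tan_b = 1e-6          # flat cells: avoid division by zero
    return math.log(specific_catchment_area / tan_b)

# A convergent valley-bottom cell vs. a steep hillslope cell
valley    = topographic_wetness_index(500.0, 2.0)   # large area, gentle slope
hillslope = topographic_wetness_index(20.0, 25.0)   # small area, steep slope
```

    How `a` is computed per cell (single-direction vs. multiple-flow-direction routing) depends on the DEM resolution, which is one reason the index behaves differently at 1 m, 5 m, and 10 m.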

  9. A Digital Soil Mapping approach using neural networks for peat depth mapping in Scotland

    NASA Astrophysics Data System (ADS)

    Aitkenhead, Matt; Saunders, Matt; Yeluripati, Jagadeesh

    2014-05-01

    Spatially explicit and accurate peat depth estimates are required for carbon stock assessment, carbon management strategies, hydrological modelling, ecosystem service assessment and land management (e.g. wind farms). In Scotland, a number of surveys have taken place over the years that have produced data on peat depth, and while many of these surveys have focussed on specific locations or peat bogs, a substantial proportion of the data produced is relatively old and has not been digitised, limiting its visibility and utility in new research activities, policy development and land management decision making. Here we describe ongoing work whose key objective is to integrate multiple peat survey datasets with existing spatial datasets of climate, vegetation, topography and geology. The dataset produced is generated from a small number of isolated surveys, and while it is not representative of all of Scotland's soils, it is sufficient to demonstrate the conceptual basis for model development. It has been used to develop a neural network model of peat depth that has been applied across Scotland's peat bogs at 100 m resolution. The resulting map gives an early indication of the variation of peat depth across the country and allows us to estimate mean peat bog depth nationally. This estimate will improve with additional data and will contribute to improving our ability to undertake activities that depend on this kind of information. We have identified data gaps that need to be addressed in order to improve the model, in particular peat depth survey data from a wider range of peat types across the country, especially blanket bog and upland peat areas. Ongoing work to identify and integrate additional peat bog depth data is described. We also identify potential uses for the existing maps of peat depth, and areas of future model development.
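    As a stand-in for the neural network step, the covariates-to-depth idea can be sketched with a single linear "neuron" fitted by gradient descent. The actual model is an MLP over climate, vegetation, topography and geology layers; the two covariates and training cells below are invented for illustration.

```python
def fit_linear(features, depths, lr=0.1, epochs=5000):
    """Least-squares fit by batch gradient descent: depth ~ w . x + b."""
    n_feat, n = len(features[0]), len(features)
    w, b = [0.0] * n_feat, 0.0
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * n_feat, 0.0
        for x, y in zip(features, depths):
            err = sum(wi * xi for wi, xi in zip(w, x)) + b - y
            for j in range(n_feat):
                grad_w[j] += err * x[j]
            grad_b += err
        w = [wi - lr * g / n for wi, g in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

def predict(w, b, x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

# Invented training cells: (normalised slope, normalised elevation) -> depth (m)
X = [(0.1, 0.2), (0.8, 0.9), (0.3, 0.4), (0.6, 0.7), (0.2, 0.1), (0.9, 0.8)]
y = [3.0, 0.5, 2.4, 1.2, 3.2, 0.4]
w, b = fit_linear(X, y)
gentle = predict(w, b, (0.15, 0.25))   # flat, low-lying cell: deeper peat
steep  = predict(w, b, (0.85, 0.85))   # steep upland cell: shallower peat
```

    A real application would replace the linear unit with a hidden layer and nonlinearity, but the workflow is the same: fit on surveyed cells, then predict depth for every 100 m cell in the covariate grids.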

  10. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    NASA Astrophysics Data System (ADS)

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  11. Flood Hazard Mapping over Large Regions using Geomorphic Approaches

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Troy, Tara J.; Manfreda, Salvatore

    2016-04-01

    Historically, man has always preferred to settle and live near the water. This tendency has not changed throughout time, and today nineteen of the twenty most populated agglomerations of the world (Demographia World Urban Areas, 2015) are located along watercourses or at the mouth of a river. On one hand, these locations are advantageous from many points of view. On the other hand, they expose significant populations and economic assets to a certain degree of flood hazard. Knowing the location and the extent of the areas exposed to flood hazards is essential to any strategy for minimizing the risk. Unfortunately, in data-scarce regions the use of traditional floodplain mapping techniques is prevented by the lack of the extensive data required, and this scarcity is generally most pronounced in developing countries. The present work aims to overcome this limitation by defining an alternative simplified procedure for a preliminary, but efficient, floodplain delineation. To validate the method in a data-rich environment, eleven flood-related morphological descriptors derived from DEMs have been used as linear binary classifiers over the Ohio River basin and its sub-catchments, measuring their performances in identifying the floodplains at the change of the topography and the size of the calibration area. The best performing classifiers among those analysed have been applied and validated across the continental U.S. The results suggest that the classifier based on the index ln(hr/H), named the Geomorphic Flood Index (GFI), is the most suitable to detect the flood-prone areas in data-scarce environments and for large-scale applications, providing good accuracy with low requirements in terms of data and computational costs. Keywords: flood hazard, data-scarce regions, large-scale studies, binary classifiers, DEM, USA.
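    The GFI classifier can be sketched directly: each DEM cell gets a score ln(hr/H), where H is the elevation of the cell above the nearest stream element and hr is the water level in that element (in the full method hr is estimated from the upstream contributing area via a hydraulic scaling relation); cells whose score exceeds a calibrated threshold are labelled flood-prone. The (hr, H) pairs and the threshold below are invented.

```python
import math

def geomorphic_flood_index(hr, H):
    """GFI = ln(hr / H): stream water level relative to the cell's
    elevation above the nearest stream element."""
    return math.log(hr / H)

def classify(cells, tau):
    """Linear binary classifier: flood-prone where the GFI exceeds tau."""
    return [geomorphic_flood_index(hr, H) > tau for hr, H in cells]

# Invented (hr, H) pairs: low-lying cells near the river vs. high terraces
cells = [(2.0, 1.0), (2.0, 0.5), (1.5, 8.0), (1.0, 20.0)]
flood_prone = classify(cells, tau=0.0)
```

    In calibration, `tau` is chosen on a small area with a reference flood map (e.g. by maximizing a skill score) and then applied across the whole data-scarce region, which is what keeps the data and computational requirements low.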

  12. Multidata remote sensing approach to regional geologic mapping in Venezuela

    SciTech Connect

    Baker, R.N.

    1996-08-01

    Remote sensing played an important role in evaluating the exploration potential of selected lease blocks in Venezuela. Data sets used ranged from regional Landsat and airborne radar (SLAR) surveys to high-quality, cloud-free air photos for local but largely inaccessible terrains. The resulting database provided a framework for the conventional analyses of surface and subsurface information available to the project team. (1) Regional surface geology and major structural elements were interpreted from Landsat MSS imagery supplemented by TM and a regional 1:250,000 airborne radar (SLAR) survey. Evidence of dextral offset, en echelon folds and major through-going faults suggests a regional transpressional system modified by local extension and readjustment between small-scale crustal blocks. Surface expression of the major structural elements diminishes to the east, but can often be extended beneath the coastal plain by drainage anomalies and subtle geomorphic trends. (2) Environmental conditions were mapped using the high-resolution airborne radar images, which were used to relate vegetation types to surface texture and elevation, and wetlands, outcrop and cultural features to image brightness. Additional work using multispectral TM or SPOT imagery is planned to define environmental conditions more accurately and provide a baseline for monitoring future trends. (3) Offshore oil seeps were detected using ERS-1 satellite radar (SAR), with known seeps in the Gulf of Paria as analogs. While partially successful, natural surfactants, wind shadow and a surprising variety of other phenomena created "false alarms" that required other supporting data and field sampling to verify the results. Key elements of the remote sensing analyses will be incorporated into a comprehensive geographic information system (GIS) which will eventually include all of Venezuela.

  13. Evaluation of current statistical approaches for predictive geomorphological mapping

    NASA Astrophysics Data System (ADS)

    Luoto, Miska; Hjort, Jan

    2005-04-01

    Predictive models are increasingly used in geomorphology, but systematic evaluations of novel statistical techniques are still limited. The aim of this study was to compare the accuracy of generalized linear models (GLM), generalized additive models (GAM), classification tree analysis (CTA), artificial neural networks (ANN) and multivariate adaptive regression splines (MARS) in predictive geomorphological modelling. Five different distribution models each for non-sorted and sorted patterned ground were constructed on the basis of four terrain parameters and four soil variables. To evaluate the models, the original data set of 9997 squares of 1 ha in size was randomly divided into a model training set (70%, n = 6998) and a model evaluation set (30%, n = 2999). In general, active sorted patterned ground is clearly defined in upper fell areas with high slope angles and till soils. Active non-sorted patterned ground is more common in valleys with higher soil moisture and fine-scale concave topography. The predictive performance of each model was evaluated using the area under the receiver operating characteristic curve (AUC) and the Kappa statistic. The relatively high discrimination capacity of all models, AUC = 0.85-0.88 and Kappa = 0.49-0.56, implies that the models' predictions provide an acceptable index of sorted and non-sorted patterned ground occurrence. The best performance on the model calibration data for both data sets was achieved by CTA. However, when predictive mapping ability was explored through the evaluation data set, the accuracy of CTA decreased clearly compared with the other modelling techniques; on the evaluation data, MARS performed marginally best. Our results show that digital elevation model and soil data can be used to predict relatively robustly the activity of patterned ground at a fine scale in a subarctic landscape. This indicates that predictive geomorphological modelling has the advantage of providing relevant and useful information on earth surface
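    The AUC used to score these models equals the probability that a randomly chosen presence cell receives a higher predicted score than a randomly chosen absence cell (the Mann-Whitney rank statistic), so it can be computed without tracing the ROC curve. The scores and labels below are invented:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank statistic."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > q else (0.5 if p == q else 0.0)
               for p in pos for q in neg)        # ties count half
    return wins / (len(pos) * len(neg))

# Hypothetical predicted probabilities of patterned-ground presence
scores = [0.9, 0.8, 0.7, 0.4, 0.35, 0.1]
labels = [1,   1,   0,   1,   0,    0]
model_auc = auc(scores, labels)
```

    An AUC of 0.5 is chance-level discrimination and 1.0 is perfect ranking, which is why the reported 0.85-0.88 range counts as relatively high discrimination capacity.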

  14. Mapping Transcription Factors on Extended DNA: A Single Molecule Approach

    NASA Astrophysics Data System (ADS)

    Ebenstein, Yuval; Gassman, Natalie; Weiss, Shimon

    The ability to determine the precise loci and distribution of nucleic acid binding proteins is instrumental to our detailed understanding of cellular processes such as transcription, replication, and chromatin reorganization. Traditional molecular biology approaches and above all Chromatin immunoprecipitation (ChIP) based methods have provided a wealth of information regarding protein-DNA interactions. Nevertheless, existing techniques can only provide average properties of these interactions, since they are based on the accumulation of data from numerous protein-DNA complexes analyzed at the ensemble level. We propose a single molecule approach for direct visualization of DNA binding proteins bound specifically to their recognition sites along a long stretch of DNA such as genomic DNA. Fluorescent Quantum dots are used to tag proteins bound to DNA, and the complex is deposited on a glass substrate by extending the DNA to a linear form. The sample is then imaged optically to determine the precise location of the protein binding site. The method is demonstrated by detecting individual, Quantum dot tagged T7-RNA polymerase enzymes on the bacteriophage T7 genomic DNA and assessing the relative occupancy of the different promoters.

  15. Singlet Mott state simulating the bosonic Laughlin wave function

    NASA Astrophysics Data System (ADS)

    Lian, Biao; Zhang, Shoucheng

    2014-01-01

    We study the properties of a class of spin-singlet Mott states for arbitrary spin-S bosons on a lattice, with particle number per site n = S/l + 1, where l is a positive integer. We show that such a singlet Mott state can be mapped to a bosonic Laughlin wave function on a sphere with a finite number of particles at filling ν = 1/(2l). Spin, particle, and hole excitations in the Mott state are discussed, among which the hole excitation can be mapped to the quasihole of the bosonic Laughlin wave function. We show that this singlet Mott state can be realized in a cold-atom system on an optical lattice and can be identified using Bragg spectroscopy and Stern-Gerlach techniques. This class of singlet Mott states may be generalized to map to bosonic Laughlin states with filling ν = q/(2l).

  16. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    PubMed

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  17. A taxonomy of behaviour change methods: an Intervention Mapping approach

    PubMed Central

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2016-01-01

    ABSTRACT In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it

  18. A graph-theoretic approach to comparing and integrating genetic, physical and sequence-based maps.

    PubMed Central

    Yap, Immanuel V; Schneider, David; Kleinberg, Jon; Matthews, David; Cartinhour, Samuel; McCouch, Susan R

    2003-01-01

    For many species, multiple maps are available, often constructed independently by different research groups using different sets of markers and different source material. Integration of these maps provides a higher density of markers and greater genome coverage than is possible using a single study. In this article, we describe a novel approach to comparing and integrating maps by using abstract graphs. A map is modeled as a directed graph in which nodes represent mapped markers and edges define the order of adjacent markers. Independently constructed graphs representing corresponding maps from different studies are merged on the basis of their common loci. Absence of a path between two nodes indicates that their order is undetermined. A cycle indicates inconsistency among the mapping studies with regard to the order of the loci involved. The integrated graph thus produced represents a complete picture of all of the mapping studies that comprise it, including all of the ambiguities and inconsistencies among them. The objective of this representation is to guide additional research aimed at interpreting these ambiguities and inconsistencies in locus order rather than presenting a "consensus order" that ignores these problems. PMID:14704199
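    The graph model in this record can be sketched in a few lines: each study's marker order contributes directed edges, the per-study graphs are merged over shared loci, mutual reachability between two markers flags an order conflict (a cycle), and mutual unreachability means their relative order is undetermined. The marker names and study orders below are invented.

```python
from collections import defaultdict

def build_graph(orders):
    """Merge several marker orders into one directed graph:
    an edge u -> v means some study places u immediately before v."""
    graph = defaultdict(set)
    for order in orders:
        for u, v in zip(order, order[1:]):
            graph[u].add(v)
    return graph

def reachable(graph, start, goal):
    """Depth-first search: is there a directed path start -> goal?"""
    stack, seen = [start], set()
    while stack:
        node = stack.pop()
        if node == goal:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(graph[node])
    return False

study1 = ["m1", "m2", "m3", "m5"]   # two studies disagree on m2 vs. m3
study2 = ["m1", "m3", "m2", "m4"]
g = build_graph([study1, study2])

# Mutual reachability = cycle = inconsistent locus order between studies
conflict = reachable(g, "m2", "m3") and reachable(g, "m3", "m2")
# No path either way = relative order undetermined by the combined data
undetermined = not reachable(g, "m4", "m5") and not reachable(g, "m5", "m4")
```

    As in the article, the merged graph deliberately preserves the conflict rather than forcing a consensus order: the cycle through m2 and m3 is exactly the signal that directs further experimental work.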

  19. [Recent progress in gene mapping through high-throughput sequencing technology and forward genetic approaches].

    PubMed

    Lu, Cairui; Zou, Changsong; Song, Guoli

    2015-08-01

    Traditional gene mapping using forward genetic approaches is conducted primarily through construction of a genetic linkage map, a process that is tedious and time-consuming and often results in low mapping accuracy and large mapping intervals. With the rapid development of high-throughput sequencing technology and the decreasing cost of sequencing, a variety of simple and quick methods of gene mapping through sequencing have been developed, including direct sequencing of the mutant genome, sequencing of selective mutant DNA pools, genetic map construction through sequencing of individuals in a population, and sequencing of the transcriptome or partial genome. These methods can identify mutations at the nucleotide level and have been applied in complex genetic backgrounds. Recent reports have shown that mapping by sequencing can even be done without a reference genome sequence, hybridization, or genetic linkage information, which makes it possible to perform forward genetic studies in many non-model species. In this review, we summarize these new technologies and their application in gene mapping.

  20. Two-dimensional thermofield bosonization

    SciTech Connect

    Amaral, R.L.P.G.

    2005-12-15

    The main objective of this paper was to obtain an operator realization for the bosonization of fermions in 1 + 1 dimensions, at finite, non-zero temperature T. This is achieved in the framework of the real-time formalism of Thermofield Dynamics. Formally, the results parallel those of the T = 0 case. The well-known two-dimensional fermion-boson correspondences at zero temperature are shown to hold also at finite temperature. To emphasize the usefulness of the operator realization for handling a large class of two-dimensional quantum field-theoretic problems, we contrast this global approach with the cumbersome calculation of the fermion-current two-point function in the imaginary-time and real-time formalisms. The calculations also illustrate the very different ways in which the transmutation from Fermi-Dirac to Bose-Einstein statistics is realized.

  1. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform element boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from the Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In Procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved, and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has typically been problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.

  2. Mapping paths: new approaches to dissect eukaryotic signaling circuitry.

    PubMed

    Mutlu, Nebibe; Kumar, Anuj

    2016-01-01

    Eukaryotic cells are precisely "wired" to coordinate changes in external and intracellular signals with corresponding adjustments in the output of complex and often interconnected signaling pathways. These pathways are critical in understanding cellular growth and function, and several experimental trends are emerging with applicability toward more fully describing the composition and topology of eukaryotic signaling networks. In particular, recent studies have implemented CRISPR/Cas-based screens in mouse and human cell lines for genes involved in various cell growth and disease phenotypes. Proteomic methods using mass spectrometry have enabled quantitative and dynamic profiling of protein interactions, revealing previously undiscovered complexes and allele-specific protein interactions. Methods for the single-cell study of protein localization and gene expression have been integrated with computational analyses to provide insight into cell signaling in yeast and metazoans. In this review, we present an overview of exemplary studies using the above approaches, relevant for the analysis of cell signaling and indeed, more broadly, for many modern biological applications.

  3. Mapping paths: new approaches to dissect eukaryotic signaling circuitry

    PubMed Central

    Mutlu, Nebibe; Kumar, Anuj

    2016-01-01

    Eukaryotic cells are precisely “wired” to coordinate changes in external and intracellular signals with corresponding adjustments in the output of complex and often interconnected signaling pathways. These pathways are critical in understanding cellular growth and function, and several experimental trends are emerging with applicability toward more fully describing the composition and topology of eukaryotic signaling networks. In particular, recent studies have implemented CRISPR/Cas-based screens in mouse and human cell lines for genes involved in various cell growth and disease phenotypes. Proteomic methods using mass spectrometry have enabled quantitative and dynamic profiling of protein interactions, revealing previously undiscovered complexes and allele-specific protein interactions. Methods for the single-cell study of protein localization and gene expression have been integrated with computational analyses to provide insight into cell signaling in yeast and metazoans. In this review, we present an overview of exemplary studies using the above approaches, relevant for the analysis of cell signaling and indeed, more broadly, for many modern biological applications. PMID:27540473

  4. Supersymmetric Higgs Bosons in Weak Boson Fusion

    SciTech Connect

    Hollik, Wolfgang; Plehn, Tilman; Rauch, Michael; Rzehak, Heidi

    2009-03-06

    We compute the complete supersymmetric next-to-leading-order corrections to the production of a light Higgs boson in weak-boson fusion. The electroweak corrections are of the same order as the next-to-leading-order corrections in the standard model. The supersymmetric QCD corrections turn out to be significantly smaller than expected, and smaller than their electroweak counterparts. These corrections are an important ingredient in a precision analysis of the (supersymmetric) Higgs sector at the LHC, either as a known correction factor or as a contribution to the theory error.

  5. Mapping of yellow mosaic virus (YMV) resistance in soybean (Glycine max L. Merr.) through association mapping approach.

    PubMed

    Kumar, Bhupender; Talukdar, Akshay; Verma, Khushbu; Bala, Indu; Harish, G D; Gowda, Sarmrat; Lal, S K; Sapra, R L; Singh, K P

    2015-02-01

    Yellow Mosaic Virus (YMV) is a serious disease of soybean. Resistance to YMV was mapped in 180 soybean genotypes through an association mapping approach using 121 simple sequence repeat (SSR) and four resistance gene analogue (RGA)-based markers. The association mapping population (AMP) (96 genotypes) and the confirmation population (CP) (84 genotypes) were tested for resistance to YMV at a hot-spot location for 3 consecutive years (2007-2009). The genotypes exhibited significant variability for YMV resistance (P < 0.01). Molecular genotyping and population structure analysis with the 'admixture' co-ancestry model detected seven optimal sub-populations in the AMP. Linkage disequilibrium (LD) between the markers extended up to 35 and 10 cM with r2 > 0.15 and >0.25, respectively. The four RGA-based markers showed no association with YMV resistance. Two SSR markers, Satt301 and GMHSP179 on chromosome 17, were found to be in significant LD with YMV resistance. A contingency Chi-square test confirmed the association (P < 0.01), and the utility of the markers was validated in the CP. This paves the way for marker-assisted selection for YMV resistance in soybean. This is the first report of its kind in soybean.

  6. Development and Comparison of Techniques for Generating Permeability Maps using Independent Experimental Approaches

    NASA Astrophysics Data System (ADS)

    Hingerl, Ferdinand; Romanenko, Konstantin; Pini, Ronny; Balcom, Bruce; Benson, Sally

    2014-05-01

    We have developed and evaluated methods for creating voxel-based 3D permeability maps of a heterogeneous sandstone sample using independent experimental data from single-phase flow (Magnetic Resonance Imaging, MRI) and two-phase flow (X-ray Computed Tomography, CT) measurements. Fluid velocities computed from the generated permeability maps using computational fluid dynamics simulations fit measured velocities very well and significantly outperform empirical porosity-permeability relations, such as the Kozeny-Carman equation. Acquiring meso-scale images of porous rocks using MRI had until recently been a great challenge, due to short spin relaxation times and large field gradients within the sample. The combination of the 13-interval Alternating-Pulsed-Gradient Stimulated-Echo (APGSTE) scheme with three-dimensional Single Point Ramped Imaging with T1 Enhancement (SPRITE) - a technique recently developed at the UNB MRI Center - can overcome these challenges and enables obtaining quantitative three-dimensional maps of porosities and fluid velocities. Using porosity and (single-phase) velocity maps from MRI and (multi-phase) saturation maps from CT measurements, we employed three different techniques to obtain permeability maps. In the first approach, we applied the Kozeny-Carman relationship to porosities measured using MRI. In the second approach, we computed permeabilities using a J-Leverett scaling method, which is based on saturation maps obtained from N2-H2O multi-phase experiments. The third set of permeabilities was generated using a new inverse iterative-updating technique, which is based on porosities and measured velocities obtained in single-phase flow experiments. The resulting three permeability maps then provided input for computational fluid dynamics simulations - employing the Stanford CFD code AD-GPRS - to generate velocity maps, which were compared to velocity maps measured by MRI. 
The J-Leverett scaling method and the iterative-updating method
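
    The first approach, converting a measured porosity field into a permeability map, can be illustrated with one common form of the Kozeny-Carman relation; the grain diameter, constant, and porosity values below are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def kozeny_carman(phi, d_grain=1e-4, c=180.0):
        """One common form of the Kozeny-Carman relation,
        k = d^2 * phi^3 / (c * (1 - phi)^2)  [k in m^2],
        mapping a voxel porosity field to a permeability field.
        d_grain (grain diameter, m) and c are illustrative values."""
        phi = np.asarray(phi, dtype=float)
        return d_grain**2 * phi**3 / (c * (1.0 - phi) ** 2)

    # Toy 2 x 2 voxel porosity map -> voxel permeability map
    porosity_map = np.array([[0.10, 0.20],
                             [0.25, 0.30]])
    k_map = kozeny_carman(porosity_map)
    ```

    Because the relation is a pure per-voxel function of porosity, it cannot capture heterogeneity that porosity alone does not explain, which is why the study's velocity-constrained iterative-updating approach outperforms it.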

  7. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    PubMed Central

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data as an interactome weight matrix, from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins, which carries out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly. PMID:25572661
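
    The neighbourhood-scoring idea can be sketched with a plain cosine similarity over interaction profiles; the paper's "rescaled cosine coefficient" is a refinement of this (the exact rescaling is not reproduced here), and the toy weight matrix below is invented for illustration.

    ```python
    import numpy as np

    def cosine_sim(u, v, eps=1e-12):
        """Plain cosine similarity between two interaction-profile vectors.
        Stand-in for the paper's rescaled cosine coefficient."""
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + eps))

    # Toy interactome weight matrix: rows = proteins, columns = putative
    # partners; entries are interaction weights (invented values).
    W = np.array([
        [1.0, 0.8, 0.0],
        [0.9, 1.0, 0.1],
        [0.0, 0.2, 1.0],
    ])

    # Score a candidate pair by the similarity of the two proteins'
    # interaction profiles: proteins 0 and 1 share neighbours, 0 and 2 do not.
    score_01 = cosine_sim(W[0], W[1])
    score_02 = cosine_sim(W[0], W[2])
    ```

    A high profile similarity between two proteins supports promoting their putative interaction in the binary map, which is the CF analogue of recommending an item to similar users.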

  8. Force scanning: A rapid, high-resolution approach for spatial mechanical property mapping

    PubMed Central

    Darling, E M

    2011-01-01

    Atomic force microscopy (AFM) can be used to co-localize mechanical properties and topographical features through property mapping techniques. The most common approach for testing biological materials at the micro- and nano-scales is force mapping, which involves taking individual force curves at discrete sites across a region of interest. Limitations of force mapping include long testing times and low resolution. While newer AFM methodologies, like modulated scanning and torsional oscillation, circumvent this problem, their adoption for biological materials has been limited, possibly because of their need for specialized software algorithms and/or hardware. The objective of this study is to develop a novel force scanning technique using AFM to rapidly capture high-resolution topographical images of soft biological materials while simultaneously quantifying their mechanical properties. Force scanning is a straightforward methodology applicable to a wide range of materials and testing environments, requiring no special modification to standard AFMs. Essentially, if a contact mode image can be acquired, then force scanning can be used to produce a spatial modulus map. The current study first validates this technique using agarose gels, comparing results to the standard force mapping approach. Biologically relevant demonstrations are then presented for high-resolution modulus mapping of individual cells, cell-cell interfaces, and articular cartilage tissue. PMID:21411911

  9. Benthic habitat mapping in a Portuguese Marine Protected Area using EUNIS: An integrated approach

    NASA Astrophysics Data System (ADS)

    Henriques, Victor; Guerra, Miriam Tuaty; Mendes, Beatriz; Gaudêncio, Maria José; Fonseca, Paulo

    2015-06-01

    A growing demand for seabed and habitat mapping has arisen over the past years to support maritime integrated policies at the EU and national levels aiming at the sustainable use of sea resources. This study presents the results of applying the hierarchical European Nature Information System (EUNIS) to classify and map the benthic habitats of the Luiz Saldanha Marine Park, a marine protected area (MPA) located on the mainland Portuguese southwest coast, in the Iberian Peninsula. The habitat map was modelled by applying a EUNIS-based methodology that merges biotic and abiotic key habitat drivers. The modelling in this approach focused on predicting the association of different data types: substrate, bathymetry, light intensity, wave and current energy, sediment grain size and benthic macrofauna into a common framework. The resulting seamless medium-scale habitat map discriminates twenty-six distinct sublittoral habitats, including eight with no match in the current classification, which may be regarded as new potential habitat classes and will therefore be submitted to EUNIS. A discussion is provided examining the suitability of the current EUNIS scheme as a standardized approach to classify marine benthic habitats and map their spatial distribution at medium scales on the Portuguese coast. In addition, the factors that most affected the results of the predictive habitat map, and the role of environmental factors in macrofaunal assemblage composition and distribution, are outlined.

  10. Mapping raised bogs with an iterative one-class classification approach

    NASA Astrophysics Data System (ADS)

    Mack, Benjamin; Roscher, Ribana; Stenzel, Stefanie; Feilhauer, Hannes; Schmidtlein, Sebastian; Waske, Björn

    2016-10-01

    Land use and land cover maps are among the most commonly used remote sensing products. In many applications the user only requires a map of one particular class of interest, e.g. a specific vegetation type or an invasive species. One-class classifiers are appealing alternatives to common supervised classifiers because they can be trained with labeled training data of the class of interest only. However, training an accurate one-class classification (OCC) model is challenging, particularly when facing a large image, a small class and few training samples. To tackle these problems we propose an iterative OCC approach. The presented approach uses a biased Support Vector Machine as core classifier. In an iterative pre-classification step a large part of the pixels not belonging to the class of interest is classified. The remaining data is classified by a final classifier with a novel model and threshold selection approach. The specific objective of our study is the classification of raised bogs in a study site in southeast Germany, using multi-seasonal RapidEye data and a small number of training samples. Results demonstrate that the iterative OCC outperforms other state-of-the-art one-class classifiers and approaches for model selection. The study highlights the potential of the proposed approach for efficient and improved mapping of small classes such as raised bogs. Overall, the proposed approach constitutes a feasible and useful modification of a regular one-class classifier.
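
    The iterative pre-classification idea can be caricatured as follows. The study uses a biased Support Vector Machine as the core classifier; a simple distance-to-prototype rule stands in for it here, and all pixel data are synthetic.

    ```python
    import numpy as np

    def iterative_preclassify(pixels, train, n_iter=3, keep_frac=0.5):
        """Toy version of the iterative pre-classification step: repeatedly
        discard the candidate pixels least similar to the class of interest,
        so the final classifier only sees a much-reduced candidate set.
        (The study uses a biased SVM; a mean-distance rule stands in here.)"""
        mu = train.mean(axis=0)  # prototype of the class of interest
        candidates = pixels
        for _ in range(n_iter):
            d = np.linalg.norm(candidates - mu, axis=1)
            candidates = candidates[d <= np.quantile(d, keep_frac)]
        return candidates

    # Synthetic scene: 10 bog-like pixels near the class prototype and
    # 10 background pixels far away in feature space.
    train = np.zeros((5, 2))
    pixels = np.vstack([np.zeros((10, 2)), np.full((10, 2), 10.0)])
    survivors = iterative_preclassify(pixels, train)
    ```

    The background pixels are eliminated in the early iterations, mirroring how the pre-classification step shrinks a large image down to the ambiguous pixels that warrant the full model- and threshold-selection machinery.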

  11. A whole spectroscopic mapping approach for studying the spatial distribution of pigments in paintings

    NASA Astrophysics Data System (ADS)

    Mosca, S.; Alberti, R.; Frizzi, T.; Nevin, A.; Valentini, G.; Comelli, D.

    2016-09-01

    We propose a non-invasive approach for the identification and mapping of pigments in paintings. The method is based on three highly complementary imaging spectroscopy techniques, visible multispectral imaging, X-ray fluorescence mapping and Raman mapping, combined with multivariate analysis of the multidimensional spectroscopic datasets to extract key distribution information in a semi-automatic way. The proposed approach exploits a macro-Raman mapping device capable of detecting Raman signals from non-perfectly planar surfaces without the need for refocusing. Here, we show that the presence of spatially correlated Raman signals, detected in adjacent points of a painted surface, reinforces the level of confidence for material identification with respect to single-point analysis, even in the presence of very weak and complex Raman signals. The new whole-mapping approach not only provides the identification of inorganic and organic pigments but also gives striking information on the spatial distribution of pigments employed in complex mixtures to achieve different hues. Moreover, we demonstrate how the synergistic combination of three spectroscopic methods with very different time requirements yields maximum information.

  12. Effect of topographic data, geometric configuration and modeling approach on flood inundation mapping

    NASA Astrophysics Data System (ADS)

    Cook, Aaron; Merwade, Venkatesh

    2009-10-01

    Technological aspects of producing, delivering and updating flood hazard maps in the US have gone through a revolutionary change through the Federal Emergency Management Agency's Map Modernization program. In addition, the use of topographic information derived from Light Detection and Ranging (LIDAR) is enabling creation of relatively more accurate flood inundation maps. However, LIDAR is not available for the entire United States. Even for areas where LIDAR data are available, the effect of other factors such as cross-section configuration in one-dimensional (1D) models, mesh resolution in two-dimensional (2D) models, representation of river bathymetry, and modeling approach is not well studied or documented. The objective of this paper is to address some of these issues by comparing newly developed flood inundation maps from LIDAR data to maps that are developed using different topography, geometric descriptions and modeling approaches. The methodology involves use of six topographic datasets with different horizontal resolutions, vertical accuracies and bathymetry details. Each topographic dataset is used to create a flood inundation map for twelve different cross-section configurations using the 1D HEC-RAS model, and two mesh resolutions using the 2D FESWMS model. Comparison of the resulting maps for two study areas (Strouds Creek in North Carolina and Brazos River in Texas) shows that the flood inundation area is reduced with improved horizontal resolution and vertical accuracy in the topographic data. This reduction is further enhanced by incorporating river bathymetry in the topography data. Overall, the inundation extent predicted by FESWMS is smaller than that predicted by HEC-RAS for the study areas, and the variations in the flood inundation maps arising from different factors are smaller in FESWMS than in HEC-RAS.

  13. Integrated environmental mapping and monitoring, a methodological approach to optimise knowledge gathering and sampling strategy.

    PubMed

    Nilssen, Ingunn; Ødegård, Øyvind; Sørensen, Asgeir J; Johnsen, Geir; Moline, Mark A; Berge, Jørgen

    2015-07-15

    New technology has led to new opportunities for a holistic environmental monitoring approach adjusted to the purpose and object of interest. The proposed integrated environmental mapping and monitoring (IEMM) concept, presented in this paper, describes the different steps in such a system, from mission of survey to selection of parameters, sensors, sensor platforms, data collection, data storage and analysis, through to data interpretation for reliable decision making. The system is generic; it can be used by authorities, industry and academia, and is useful for both planning and operational phases. In the planning process the systematic approach is also ideal for identifying knowledge gaps. The critical stages of the concept are discussed and exemplified by two case studies, one environmental mapping case and one monitoring case. As an operational system, the IEMM concept can contribute to optimised integrated environmental mapping and monitoring for knowledge generation as a basis for decision making.

  14. Improving Students' Creative Thinking and Achievement through the Implementation of Multiple Intelligence Approach with Mind Mapping

    ERIC Educational Resources Information Center

    Widiana, I. Wayan; Jampel, I. Nyoman

    2016-01-01

    This classroom action research aimed to improve students' creative thinking and achievement in learning science. It was conducted through the implementation of a multiple intelligences approach with mind mapping, and the students' responses were described. The subjects of this research were the fifth grade students of SD 8 Tianyar Barat, Kubu, and…

  15. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  16. Does Constructivist Approach Applicable through Concept Maps to Achieve Meaningful Learning in Science?

    ERIC Educational Resources Information Center

    Jena, Ananta Kumar

    2012-01-01

    This study deals with the application of constructivist approach through individual and cooperative modes of spider and hierarchical concept maps to achieve meaningful learning on science concepts (e.g. acids, bases & salts, physical and chemical changes). The main research questions were: Q (1): is there any difference in individual and…

  17. Concept Maps in the Classroom: A New Approach to Reveal Students' Conceptual Change

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Liefländer, Anne K.; Bogner, Franz X.

    2015-01-01

    When entering the classroom, adolescents already hold various conceptions on science topics. Concept maps may function as useful tools to reveal such conceptions although labor-intensive analysis often prevents application in typical classroom situations. The authors aimed to provide teachers with an appropriate approach to analyze students'…

  18. The Criterion-Related Validity of a Computer-Based Approach for Scoring Concept Maps

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Koul, Ravinder; Salehi, Roya

    2006-01-01

    This investigation seeks to confirm a computer-based approach that can be used to score concept maps (Poindexter & Clariana, 2004) and then describes the concurrent criterion-related validity of these scores. Participants enrolled in two graduate courses (n=24) were asked to read about and research online the structure and function of the heart…

  19. Determination of contact maps in proteins: A combination of structural and chemical approaches

    NASA Astrophysics Data System (ADS)

    Wołek, Karol; Gómez-Sicilia, Àngel; Cieplak, Marek

    2015-12-01

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.

  20. Determination of contact maps in proteins: A combination of structural and chemical approaches

    SciTech Connect

    Wołek, Karol; Cieplak, Marek

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
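
    The OV criterion described above can be sketched as a simple sphere-overlap test. The coordinates, radii, and enlargement factor below are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def overlap_contacts(centers, radii, scale=1.24):
        """Sketch of an overlap (OV) contact criterion: residues i and j
        are in contact when their effective spheres, enlarged by `scale`,
        overlap. Chain neighbours (|i - j| < 2) are skipped as trivial.
        All numerical values here are illustrative."""
        contacts = []
        n = len(centers)
        for i in range(n):
            for j in range(i + 2, n):
                dist = np.linalg.norm(centers[i] - centers[j])
                if dist <= scale * (radii[i] + radii[j]):
                    contacts.append((i, j))
        return contacts

    # Four residues on a toy chain; residues 0 and 2 fold back close together
    centers = np.array([[0.0, 0.0, 0.0],
                        [10.0, 0.0, 0.0],
                        [0.5, 0.0, 0.0],
                        [20.0, 0.0, 0.0]])
    radii = np.ones(4)
    cmap = overlap_contacts(centers, radii)
    ```

    The rCSU refinement described in the abstract goes beyond such purely geometric overlap by counting attractive versus repulsive atomic contacts per residue pair, which this sketch does not attempt.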

  1. Perturbative bosonization from two-point correlation functions

    NASA Astrophysics Data System (ADS)

    Dalmazi, D.; de Souza Dutra, A.; Hott, Marcelo

    2003-06-01

    Here we address the problem of bosonizing massive fermions without making expansions in the fermion masses in both massive QED2 and QED3 with N fermion flavors, including also a Thirring coupling. We start from two-point correlators involving the U(1) fermionic current and the gauge field. From the tensor structure of those correlators we prove that the U(1) current must be identically conserved (topological) in the corresponding bosonized theory in both D=2 and D=3 dimensions. We find an effective generating functional in terms of bosonic fields which reproduces these two-point correlators, and from that we obtain a map of the Lagrangian density ψ̄_r(i∂̸ - m)ψ_r into a bosonic one in both dimensions. This map is nonlocal but independent of the electromagnetic and Thirring couplings, at least in the quadratic approximation for the fermionic determinant.

  2. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    NASA Astrophysics Data System (ADS)

    Raj, Raghu G.

    2016-07-01

    We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we specifically focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging, wherein the source object of interest is viewed from multiple directions by independent sensors. Given the measurements recorded by these sensors, the problem is to reconstruct the image (of the object) with a high degree of fidelity. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior in a hierarchical Bayesian inference procedure. Since the prior employed by our HB-MAP algorithm is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle for extending the capabilities of current CS algorithms to include truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases we find that our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions of future research emanating from this work.

  3. Toward real-time three-dimensional mapping of surficial aquifers using a hybrid modeling approach

    NASA Astrophysics Data System (ADS)

    Friedel, Michael J.; Esfahani, Akbar; Iwashita, Fabio

    2016-02-01

    A hybrid modeling approach is proposed for near real-time three-dimensional (3D) mapping of surficial aquifers. First, airborne frequency-domain electromagnetic (FDEM) measurements are numerically inverted to obtain subsurface resistivities. Second, a machine-learning (ML) algorithm is trained using the FDEM measurements, the inverted resistivity profiles, and borehole geophysical and hydrogeologic data. Third, the trained ML algorithm is used together with independent FDEM measurements to map the spatial distribution of the aquifer system. Efficacy of the hybrid approach is demonstrated for mapping a heterogeneous surficial aquifer and confining unit in northwestern Nebraska, USA. For this case, independent performance testing reveals that aquifer mapping is unbiased, with a strong correlation (0.94) between numerically inverted and ML-estimated binary (clay-silt or sand-gravel) layer resistivities (5-20 ohm-m or 21-5,000 ohm-m), and an intermediate correlation (0.74) for heterogeneous (clay, silt, sand, gravel) layer resistivities (5-5,000 ohm-m). The reduced correlation for the heterogeneous model is attributed to over-estimation of the under-sampled high-resistivity gravels (about 0.5% of the training data); when these are removed, the correlation increases (0.87). Independent analysis of the numerically inverted and ML-estimated resistivities finds that the hybrid procedure preserves both the univariate and spatial statistics of each layer. Following training, the algorithm can map 3D surficial aquifers as fast as leveled FDEM measurements are presented to the ML network.
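
    The train-then-predict structure of steps two and three can be sketched with a linear least-squares model standing in for the paper's ML algorithm; the synthetic FDEM responses, variable names, and model form are all assumptions for illustration.

```python
import numpy as np

# Sketch of the hybrid workflow: learn a mapping from multi-frequency FDEM
# responses to inverted log-resistivity (step 2), then apply it to independent
# soundings without re-running the numerical inversion (step 3).

rng = np.random.default_rng(1)
n_train, n_freq = 200, 6
fdem_train = rng.standard_normal((n_train, n_freq))        # co-located FDEM responses
w_true = rng.standard_normal(n_freq)
log_res_train = fdem_train @ w_true + 0.05 * rng.standard_normal(n_train)

# Step 2: "train" on paired FDEM measurements and inverted resistivities
X = np.c_[fdem_train, np.ones(n_train)]                    # add intercept column
coef, *_ = np.linalg.lstsq(X, log_res_train, rcond=None)

# Step 3: estimate resistivity directly from independent FDEM measurements
fdem_new = rng.standard_normal((5, n_freq))
log_res_pred = np.c_[fdem_new, np.ones(5)] @ coef
print(log_res_pred.shape)
```

    The appeal of the hybrid scheme is exactly this asymmetry: the expensive numerical inversion is needed only to build the training set, while mapping new flight lines reduces to a cheap forward evaluation.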

  4. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    SciTech Connect

    Zhou, Shiguo; Kile, A.; Bechner, M.; Kvikstad, E.; Deng, W.; Wei, J.; Severin, J.; Runnheim, R.; Churas, C.; Forrest, D.; Dimalanta, E.; Lamers, C.; Burland, V.; Blattner, F. R.; Schwartz, David C.

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that give a species its definition and range; however, their number vastly outstrips our ability to sequence them. Given the availability of a large number of microbial species, new whole-genome approaches must be developed to fully leverage this information at the level of strain diversity in ways that maximize discovery. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains representing several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal a novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  5. Improved spatial accuracy of functional maps in the rat olfactory bulb using supervised machine learning approach.

    PubMed

    Murphy, Matthew C; Poplawsky, Alexander J; Vazquez, Alberto L; Chan, Kevin C; Kim, Seong-Gi; Fukuda, Mitsuhiro

    2016-08-15

    Functional MRI (fMRI) is a popular and important tool for noninvasive mapping of neural activity. As fMRI measures the hemodynamic response, the resulting activation maps do not perfectly reflect the underlying neural activity. The purpose of this work was to design a data-driven model to improve the spatial accuracy of fMRI maps in the rat olfactory bulb. This system is an ideal choice for this investigation since the bulb circuit is well characterized, allowing for an accurate definition of activity patterns in order to train the model. We generated models for both cerebral blood volume weighted (CBVw) and blood oxygen level dependent (BOLD) fMRI data. The results indicate that the spatial accuracy of the activation maps is either significantly improved or at worst not significantly different when using the learned models compared to a conventional general linear model approach, particularly for BOLD images and activity patterns involving deep layers of the bulb. Furthermore, the activation maps computed by CBVw and BOLD data show increased agreement when using the learned models, lending more confidence to their accuracy. The models presented here could have an immediate impact on studies of the olfactory bulb, but perhaps more importantly, demonstrate the potential for similar flexible, data-driven models to improve the quality of activation maps calculated using fMRI data. PMID:27236085
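
    The conventional baseline that the learned models are compared against, a general linear model (GLM) fit per voxel, can be sketched as follows. The block design, threshold, and synthetic data are invented for illustration and are not the study's acquisition parameters.

```python
import numpy as np

# Sketch of a per-voxel GLM activation analysis: regress each voxel's time
# series on a stimulus regressor, form a t-like statistic for the stimulus
# coefficient, and threshold it to produce a crude activation map.

rng = np.random.default_rng(6)
n_t, n_vox = 200, 30
stim = (np.arange(n_t) % 40 < 20).astype(float)      # on/off block design
X = np.c_[np.ones(n_t), stim]                        # [intercept, stimulus]

beta_true = np.zeros(n_vox); beta_true[:5] = 1.0     # 5 truly "active" voxels
Y = stim[:, None] * beta_true + rng.standard_normal((n_t, n_vox))

beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)     # per-voxel least squares
resid = Y - X @ beta_hat
se = resid.std(axis=0, ddof=2) * np.sqrt(np.linalg.inv(X.T @ X)[1, 1])
t = beta_hat[1] / se                                 # t-statistic for the stimulus effect
active = t > 3.0                                     # thresholded activation map
print(active[:5].all(), active[5:].sum() <= 3)
```

    A data-driven model, as in the paper, replaces this fixed regressor-plus-threshold pipeline with a classifier trained on known activity patterns, which is what buys the improved spatial accuracy.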

  6. Using Concept Mapping in Community-Based Participatory Research: A Mixed Methods Approach

    PubMed Central

    Windsor, Liliane Cambraia

    2015-01-01

    Community-based participatory research (CBPR) has been identified as a useful approach to increasing community involvement in research. Developing rigorous methods in conducting CBPR is an important step in gaining more support for this approach. The current article argues that concept mapping, a structured mixed methods approach, is useful in the initial development of a rigorous CBPR program of research aiming to develop culturally tailored and community-based health interventions for vulnerable populations. A research project examining social dynamics and consequences of alcohol and substance use in Newark, New Jersey, is described to illustrate the use of concept mapping methodology in CBPR. A total of 75 individuals participated in the study. PMID:26561484

  7. Large-extent digital soil mapping approaches for total soil depth

    NASA Astrophysics Data System (ADS)

    Mulder, Titia; Lacoste, Marine; Saby, Nicolas P. A.; Arrouays, Dominique

    2015-04-01

    Total soil depth (SDt) plays a key role in supporting various ecosystem services and properties, including plant growth, water availability and carbon stocks. Therefore, predictive mapping of SDt has been included as one of the deliverables of the GlobalSoilMap project. In this work SDt was predicted for France following the GlobalSoilMap specifications, which require modelling at 90 m resolution. The first method, further referred to as DM, consisted of modelling the deterministic trend in SDt using data mining, followed by a bias correction and ordinary kriging of the residuals. Considering the total surface area of France (about 540,000 km2), the employed methods must be able to deal with large data sets. Therefore, a second method, multi-resolution kriging (MrK) for large datasets, was implemented. This method consisted of modelling the deterministic trend with a linear model, followed by interpolation of the residuals. For both methods, the general trend was assumed to be explained by the biotic and abiotic environmental conditions, as described by the soil-landscape paradigm. The mapping accuracy was evaluated by internal validation and by its concordance with previous soil maps. In addition, the prediction interval for DM and the confidence interval for MrK were determined. Finally, the opportunities and limitations of both approaches were evaluated. The results showed consistency in the mapped spatial patterns and a good prediction of the mean values. DM was better at predicting extreme values, owing to the bias correction, and was more powerful in capturing the deterministic trend than the linear model of the MrK approach. However, MrK was found to be more straightforward and flexible in delivering spatially explicit uncertainty measures. The validation indicated that DM was more accurate than MrK. Improvements for DM may be expected by predicting soil depth classes. MrK shows potential for modelling beyond the country level, at high
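
    Both methods share a trend-plus-residual skeleton: fit a deterministic trend on environmental covariates, then interpolate the residuals spatially. The sketch below uses a linear trend and inverse-distance weighting as a stand-in for ordinary kriging; all data and names are illustrative assumptions.

```python
import numpy as np

# Regression-residual sketch of the two-step mapping structure: deterministic
# trend from covariates (step 1), spatial interpolation of residuals (step 2).
# IDW stands in for kriging here; a variogram-based method would replace it.

rng = np.random.default_rng(2)
n = 50
xy = rng.uniform(0, 1, (n, 2))                       # site coordinates
cov = rng.standard_normal((n, 3))                    # environmental covariates
depth = 1.2 + cov @ np.array([0.5, -0.3, 0.2]) + 0.1 * rng.standard_normal(n)

# Step 1: deterministic trend (linear here; the DM method uses data mining)
X = np.c_[np.ones(n), cov]
beta, *_ = np.linalg.lstsq(X, depth, rcond=None)
resid = depth - X @ beta

# Step 2: interpolate residuals at a new site (inverse-distance weighting)
target = np.array([0.5, 0.5])
d = np.linalg.norm(xy - target, axis=1) + 1e-9
w = 1.0 / d**2
w /= w.sum()
resid_interp = w @ resid

# Prediction = trend at the target's covariates + interpolated residual
cov_target = rng.standard_normal(3)
pred = np.r_[1.0, cov_target] @ beta + resid_interp
print(np.isfinite(pred))
```

    Kriging additionally yields a prediction variance at each location, which is how the MrK approach delivers its spatially explicit uncertainty measures.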

  8. Flood inundation mapping uncertainty introduced by topographic data accuracy, geometric configuration and modeling approach

    NASA Astrophysics Data System (ADS)

    Papaioannou, G.; Loukas, Athanasios

    2010-05-01

    Floodplain modeling is a relatively new applied method in the river engineering discipline and is essential for the prediction of flood hazards. Flood inundation of upland environments with topographically complex floodplains is an understudied subject. In most areas of the U.S.A., the use of topographic information derived from Light Detection and Ranging (LIDAR) has improved the quality of river flood inundation predictions. However, such high-quality topographic data are not available in most countries, and the necessary information is obtained by topographic survey and/or topographic maps. Furthermore, the optimum dimensionality of hydraulic models, cross-section configuration in one-dimensional (1D) models, mesh resolution in two-dimensional (2D) models, and choice of modeling approach are not well studied or documented. All these factors introduce significant uncertainty into the evaluation of floodplain zoning. This study addresses some of these issues by comparing flood inundation maps developed using different topography, geometric descriptions and modeling approaches. The methodology involves the use of topographic datasets with different horizontal resolutions, vertical accuracies and bathymetry details. Each topographic dataset is used to create a flood inundation map for different cross-section configurations using a 1D model (HEC-RAS), and for different mesh resolutions using 2D models under steady-state and unsteady-state conditions. Comparison of the resulting maps indicates the uncertainty introduced into floodplain modeling by the horizontal resolution and vertical accuracy of topographic data and by the different modeling approaches.

  9. A new GIS approach for reconstructing and mapping dynamic late Holocene coastal plain palaeogeography

    NASA Astrophysics Data System (ADS)

    Pierik, H. J.; Cohen, K. M.; Stouthamer, E.

    2016-10-01

    The geomorphological development of Holocene coastal plains around the world has been studied since the beginning of the twentieth century from various disciplines, resulting in large amounts of data. However, the overwhelming quantities and heterogeneous nature of these data have caused the divided knowledge to remain inconsistent and fragmented. To keep improving the understanding of coastal plain geomorphology and geology, cataloguing of data and integration of knowledge are essential. In this paper we present a GIS that incorporates the accumulated data of the Netherlands' coastal plain and functions as a storage and integration tool for mapped coastal plain data. The GIS stores redigitised architectural elements (beach barriers, tidal channels, intertidal flats, supratidal flats, and coastal fresh water peat) from earlier mappings in separate map layers. A coupled catalogue-style database stores the dating information for these elements, along with references to source studies and annotations regarding changed insights. Using scripts, the system automatically generates palaeogeographical maps for any chosen moment, combining the above mapping and dating information. In our approach, we strip the information to architectural element level, and we separate mapping from dating information, serving the automatic generation of time-slice maps. It enables a workflow in which the maker can iteratively regenerate maps, which speeds up fine-tuning and thus improves the quality of palaeogeographical reconstruction. The GIS currently covers the late Holocene coastal plain development of the Netherlands. This period witnessed widespread renewed flooding along the southern North Sea coast, coinciding with large-scale reclamation and human occupation. Our GIS method is generic and can be expanded and adapted to allow faster integrated processing of growing amounts of data for many coastal areas and other large urbanising lowlands around the world. It allows maintaining actual data

  10. Letting youths choose for themselves: concept mapping as a participatory approach for program and service planning.

    PubMed

    Minh, Anita; Patel, Sejal; Bruce-Barrett, Cindy; OʼCampo, Patricia

    2015-01-01

    Ensuring that the voices of youths are heard is key in creating services that align with the needs and goals of youths. Concept mapping, a participatory mixed-methods approach, was used to engage youths, families, and service providers in an assessment of service gaps facing youth in an underserviced neighborhood in Toronto, Canada. We describe 6 phases of concept mapping: preparation, brainstorming, sorting and rating, analysis, interpretation, and utilization. Results demonstrate that youths and service providers vary in their conceptualizations of youth service needs and priorities. Implications for service planning and for youth engagement in research are discussed.

  11. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster worldwide. Many studies show that the severity and frequency of floods have increased in recent years, and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks, in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually and are affected by multiple uncertainties, as is the joint estimate of flood risk. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and the corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtaining flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
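
    The bivariate step can be sketched by sampling correlated peak/volume pairs from a Gaussian copula and pushing them through assumed marginal quantile functions. The copula family, dependence parameter, and marginals below are illustrative assumptions, not the study's fitted model.

```python
import numpy as np
from math import erf, sqrt

# Sketch: correlated flood peak/volume pairs via a Gaussian copula.
# 1) draw correlated standard normals, 2) map to uniforms with the normal CDF,
# 3) apply assumed marginal quantile functions (Gumbel peaks, exponential volumes).

rng = np.random.default_rng(5)
rho = 0.8                                        # assumed peak-volume dependence
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=1000)

norm_cdf = np.vectorize(lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0))))
u = norm_cdf(z)                                  # correlated uniforms on (0, 1)

peaks = 100.0 - 30.0 * np.log(-np.log(u[:, 0]))  # Gumbel quantile (m3/s, assumed)
volumes = -50.0 * np.log(1.0 - u[:, 1])          # exponential quantile (assumed)

# Dependence survives the marginal transforms, unlike independent sampling
print(np.corrcoef(peaks, volumes)[0, 1] > 0.3)
```

    Feeding such jointly sampled pairs into the 2D hydraulic model is what distinguishes the bivariate design events from the classical univariate ones, where peak and volume are tied to a single return period.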

  12. Effects of concept map teaching on students' critical thinking and approach to learning and studying.

    PubMed

    Chen, Shiah-Lian; Liang, Tienli; Lee, Mei-Li; Liao, I-Chen

    2011-08-01

    The purpose of this study was to explore the effects of concept mapping on the development of critical thinking ability and the approach to learning and studying. A quasi-experimental design was used with a purposive sample drawn from a group of nursing students enrolled in a medical-surgical nursing course in central Taiwan. Students in the experimental group were taught to use concept mapping in their learning; students in the control group were taught by means of traditional lectures. After the intervention, the experimental group had better overall critical thinking scores than the control group, although the difference was not statistically significant. After controlling for the effects of age and the pretest score on critical thinking using analysis of covariance, the experimental group had significantly higher adjusted mean scores on inference and overall critical thinking than the control group. Concept mapping is an effective tool for improving students' ability to think critically.

  13. Application of a regional approach for hazard mapping at an avalanche site in northern Italy

    NASA Astrophysics Data System (ADS)

    Bocchiola, D.; Rosso, R.

    2008-04-01

    The approach currently adopted for avalanche hazard mapping in northern Italy combines avalanche dynamic modelling with statistical analysis of snow depth at avalanche start. The 30-year and 300-year return period avalanches at a given site are modelled, and their runout zones and pressures are evaluated. The snow depth in the avalanche release zone is assumed to coincide with the three-day snowfall depth H72 with a return period of 30 years and 300 years, respectively. In the Italian Alps only short series of observed snow depth are available, covering a period of about 20 years, thus requiring a regional, or index value, approach for the estimation of high return period quantiles. Based on former studies, here we apply the index value approach developed for the Lombardia region, in northern Italy, to hazard mapping at a particular avalanche site. A dynamic avalanche model is tuned using the runout data for two major observed avalanche events. Then, the 30-year and 300-year runout zones and dynamic pressures are calculated. It is shown that the resulting hazard maps are more accurate than those obtained by evaluating H72 from a distribution fitted at the single site.

  14. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    PubMed

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of mis-classification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover
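
    Step (ii) of the procedure, selecting training pixels whose short-wave infrared (SWIR) difference lies beyond a standard-deviation threshold, can be sketched as follows. A single global window replaces the paper's local windows, and the threshold `k` and synthetic scene are assumptions.

```python
import numpy as np

# Sketch of automated training-data extraction: label pixels whose SWIR
# difference exceeds k standard deviations above the mean as "disturbed"
# candidates, and near-mean pixels as "stable" candidates.

rng = np.random.default_rng(3)
swir_diff = rng.normal(0.0, 0.05, (100, 100))     # SWIR(t2) - SWIR(t1) image
swir_diff[40:45, 40:45] += 0.4                    # simulated 5x5 disturbance patch

k = 3.0
mu, sigma = swir_diff.mean(), swir_diff.std()
disturbed = swir_diff > mu + k * sigma            # candidate disturbed training pixels
stable = np.abs(swir_diff - mu) < 0.5 * sigma     # candidate stable training pixels

print(disturbed.sum() >= 25, stable.any())
```

    In the full procedure these automatically labeled candidates are then filtered by cross-validated classifiers (step iii) to remove mislabeled samples before the final supervised classification (step iv).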

  15. High-resolution geologic mapping of the inner continental shelf: Boston Harbor and approaches, Massachusetts

    USGS Publications Warehouse

    Ackerman, Seth D.; Butman, Bradford; Barnhardt, Walter A.; Danforth, William W.; Crocker, James M.

    2006-01-01

    This report presents the surficial geologic framework data and information for the sea floor of Boston Harbor and Approaches, Massachusetts (fig. 1.1). This mapping was conducted as part of a cooperative program between the U.S. Geological Survey (USGS), the Massachusetts Office of Coastal Zone Management (CZM), and the National Oceanic and Atmospheric Administration (NOAA). The primary objective of this project was to provide sea floor geologic information and maps of Boston Harbor to aid resource management, scientific research, industry, and the public. A secondary objective was to test the feasibility of using NOAA hydrographic survey data, normally collected to update navigation charts, to create maps of the sea floor suitable for geologic and habitat interpretations. Defining sea-floor geology is the first step toward managing ocean resources and assessing environmental changes due to natural or human activity. The geophysical data for these maps were collected as part of hydrographic surveys carried out by NOAA in 2000 and 2001 (fig. 1.2). Bottom photographs, video, and samples of the sediments were collected in September 2004 to help in the interpretation of the geophysical data. Included in this report are high-resolution maps of the sea floor, at a scale of 1:25,000; the data used to create these maps in Geographic Information Systems (GIS) format; a GIS project; and a gallery of photographs of the sea floor. Companion maps of the sea floor to the north of Boston Harbor and Approaches are presented by Barnhardt and others (2006), and to the east by Butman and others (2003a,b,c). See Butman and others (2004) for a map of Massachusetts Bay at a scale of 1:125,000. The sections of this report are listed in the navigation bar along the left-hand margin of this page. Section 1 (this section) introduces the report. Section 2 presents the large-format map sheets. Section 3 describes data collection, processing, and analysis. 
Section 4 summarizes the geologic history of

  16. Simulating spin-boson models with matrix product states

    NASA Astrophysics Data System (ADS)

    Wall, Michael; Safavi-Naini, Arghavan; Rey, Ana Maria

    2016-05-01

    The global coupling of few-level quantum systems ("spins") to a discrete set of bosonic modes is a key ingredient for many applications in quantum science, including large-scale entanglement generation, quantum simulation of the dynamics of long-range interacting spin models, and hybrid platforms for force and spin sensing. In many situations, the bosons are integrated out, leading to effective long-range interactions between the spins; however, strong spin-boson coupling invalidates this approach, and spin-boson entanglement degrades the fidelity of quantum simulation of spin models. We present a general numerical method for treating the out-of-equilibrium dynamics of spin-boson systems based on matrix product states. While most efficient for weak coupling or small numbers of boson modes, our method applies for any spatial and operator dependence of the spin-boson coupling. In addition, our approach allows straightforward computation of many quantities of interest, such as the full counting statistics of collective spin measurements and quantum simulation infidelity due to spin-boson entanglement. We apply our method to ongoing trapped ion quantum simulator experiments in analytically intractable regimes. This work is supported by JILA-NSF-PFC-1125844, NSF-PIF-1211914, ARO, AFOSR, AFOSR-MURI, and the NRC.
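
    The class of models in question can be written, schematically, as a standard spin-boson Hamiltonian; the generic form below (σ^z splittings, free bosonic modes, σ^x-type coupling) is a textbook template, not necessarily the exact Hamiltonian treated in this work:

```latex
H = \sum_i \frac{\omega_i}{2}\,\sigma_i^z
  + \sum_k \nu_k\, a_k^\dagger a_k
  + \sum_{i,k} g_{i,k}\,\sigma_i^x \left(a_k + a_k^\dagger\right)
```

    Integrating out the bosons at weak coupling yields effective spin-spin interactions with strengths roughly proportional to g_{i,k} g_{j,k} / ν_k summed over modes k; the matrix-product-state method avoids this elimination by keeping the bosonic modes explicit in the variational state.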

  17. Global land cover mapping at 30 m resolution: A POK-based operational approach

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

    Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exist at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. Deriving a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed, i.e. each class is first identified in a prioritized sequence and the results are then merged together. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved. 
This indicates that the developed POK-based approach is effective and feasible

  18. A Random-Model Approach to QTL Mapping in Multiparent Advanced Generation Intercross (MAGIC) Populations.

    PubMed

    Wei, Julong; Xu, Shizhong

    2016-02-01

    Most standard QTL mapping procedures apply to populations derived from the cross of two parents. QTL detected from such biparental populations are rarely relevant to breeding programs because of the narrow genetic basis: only two alleles are involved per locus. To improve the generality and applicability of mapping results, QTL should be detected using populations initiated from multiple parents, such as the multiparent advanced generation intercross (MAGIC) populations. The greatest challenges of QTL mapping in MAGIC populations come from multiple founder alleles and control of the genetic background information. We developed a random-model methodology by treating the founder effects of each locus as random effects following a normal distribution with a locus-specific variance. We also fit a polygenic effect to the model to control the genetic background. To improve the statistical power for a scanned marker, we release the marker effect absorbed by the polygene back to the model. In contrast to the fixed-model approach, we estimate and test the variance of each locus and scan the entire genome one locus at a time using likelihood-ratio test statistics. Simulation studies showed that this method can increase statistical power and reduce type I error compared with composite interval mapping (CIM) and multiparent whole-genome average interval mapping (MPWGAIM). We demonstrated the method using a public Arabidopsis thaliana MAGIC population and a mouse MAGIC population.
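
    The core of the random-model idea, treating founder effects at a locus as draws from N(0, σ²_u) and testing σ²_u > 0 with a likelihood-ratio statistic, can be sketched with a grid search standing in for proper (RE)ML estimation. The sample sizes, founder count, and effect sizes below are invented for illustration.

```python
import numpy as np

# Sketch: random founder effects u ~ N(0, sigma_u^2) at one locus give the
# phenotype the marginal covariance V = sigma_e^2 I + sigma_u^2 Z Z^T.
# A likelihood-ratio test compares the fitted sigma_u^2 against sigma_u^2 = 0.

rng = np.random.default_rng(4)
n, n_founders = 120, 8
Z = np.eye(n_founders)[rng.integers(0, n_founders, n)]   # founder-allele incidence matrix
u = rng.normal(0.0, 0.7, n_founders)                     # true (random) founder effects
y = Z @ u + rng.normal(0.0, 1.0, n)
y = y - y.mean()                                         # crude centering in place of fixed effects

def loglik(sigma_u2, sigma_e2=1.0):
    """Gaussian log-likelihood (up to a constant) under the marginal model."""
    V = sigma_e2 * np.eye(n) + sigma_u2 * (Z @ Z.T)
    _, logdet = np.linalg.slogdet(V)
    return -0.5 * (logdet + y @ np.linalg.solve(V, y))

grid = np.linspace(0.0, 2.0, 41)                         # grid search over sigma_u^2
ll = np.array([loglik(s) for s in grid])
lrt = 2.0 * (ll.max() - ll[0])                           # H0: sigma_u^2 = 0
print(lrt >= 0.0)
```

    Scanning the genome then amounts to repeating this one-parameter variance test locus by locus, with a polygenic term (omitted here) controlling the genetic background.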

  19. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    PubMed Central

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity, as real-time and streaming data, in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses a distributed file system (DFS) for Big Data. Current implementations of MR support execution of only a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. A performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement. PMID:26305223

  20. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.

    PubMed

    Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity, as real-time and streaming data, in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses a distributed file system (DFS) for Big Data. Current implementations of MR support execution of only a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. It uses the available computing resources by dynamically managing task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. A performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis shows significant performance improvement.
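
    The multi-key idea can be illustrated in plain Python mimicking MapReduce semantics: a single pass over the input runs several related algorithms, and each intermediate record is tagged with an algorithm identifier so one shuffle/reduce phase separates the results. This is a conceptual sketch, not Hadoop code; the two toy "algorithms" are invented.

```python
from collections import defaultdict

# Conceptual sketch of running multiple related algorithms in one MR-style job
# by prefixing intermediate keys with an algorithm tag (the multi-key scheme).

def mapper(record):
    # Two related algorithms share a single scan of the input record.
    yield ("wordcount", record["word"]), 1                 # algorithm 1: count words
    yield ("lengthsum", record["word"]), len(record["word"])  # algorithm 2: sum lengths

def reducer(shuffled):
    # One reduce phase; the algorithm tag in the key keeps outputs separate.
    return {key: sum(vals) for key, vals in shuffled.items()}

records = [{"word": w} for w in ["map", "reduce", "map"]]

shuffled = defaultdict(list)
for rec in records:
    for key, val in mapper(rec):
        shuffled[key].append(val)          # simulated shuffle on (algorithm, key)

result = reducer(shuffled)
print(result[("wordcount", "map")], result[("lengthsum", "reduce")])
```

    Because both algorithms ride on one scan and one shuffle, the input is read once instead of once per algorithm, which is the intuition behind the reported I/O savings.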

  2. Higgs boson photoproduction at the LHC

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2011-07-15

    We present the current development of the photoproduction approach for the Higgs boson with its application to pp and pA collisions at the LHC. We perform a different analysis for the Gap Survival Probability, where we consider a probability of 3% and also a more optimistic value of 10% based on the HERA data for dijet production. As a result, the cross section for the exclusive Higgs boson production is about 2 fb and 6 fb in pp collisions and 617 and 2056 fb for pPb collisions, considering the gap survival factor of 3% and 10%, respectively.

  3. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    NASA Astrophysics Data System (ADS)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and Python software, we select a random sample of 1 km² cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to conduct an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker-mapped areas compared to "gold standard" maps from selected locations that are randomly
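
    The stratified sampling step described above can be sketched as follows. This is not the authors' R/Python code; the class names, cell IDs, and per-class sample size are hypothetical.

```python
import random

# Illustrative sketch: draw a stratified random sample of grid-cell IDs,
# with cells grouped by a field-density class taken from a coarse
# land-cover map, as in the workflow described in the abstract.
def stratified_sample(cells_by_class, n_per_class, seed=0):
    rng = random.Random(seed)  # fixed seed for a reproducible batch
    sample = {}
    for density_class, cells in cells_by_class.items():
        k = min(n_per_class, len(cells))   # never ask for more than exist
        sample[density_class] = rng.sample(cells, k)
    return sample

# Hypothetical grid-cell IDs per field-density class.
cells = {"low": list(range(100)),
         "medium": list(range(100, 150)),
         "high": list(range(150, 160))}
batch = stratified_sample(cells, n_per_class=5)
print({c: len(ids) for c, ids in batch.items()})  # {'low': 5, 'medium': 5, 'high': 5}
```

Each batch would then be submitted to Mechanical Turk as a set of HITs, one per sampled cell.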

  4. Approaches to modernization of the state soil map of Russia on the basis of the methods of digital soil mapping

    NASA Astrophysics Data System (ADS)

    Korolyuk, T. V.; Ovechkin, S. V.

    2010-05-01

    The ways of further development of the State Soil Map (SSM) of the Russian Federation are discussed. This map represents the most valuable product of soil mapping projects in Russia. It has been developed since the 1930s. A long duration of the map compilation has resulted in certain differences in the methods of mapping, topographic base, and classification decisions applied on separate pages of the map. Another specific feature of the SSM is related to different amounts of soil information available for different parts of the country. At present, two major challenges face researchers working on this map. First, it is necessary to preserve rich information collected by several generations of Russian pedologists and reflected on the map. Second, this information has to be renewed on the basis of new information sources (satellite images, modern topographic base, and new data on soils and their geography), common ideology of the map, and new digital technologies of the map compilation. The analysis of several pages of the map on the territory of the North Caucasus is discussed. An algorithm of introducing corrections to the map and its renewal is suggested. In the course of our work, a digitized version of two pages of the map, the digital elevation model, and Landsat imagery were used. The cartographic work was performed using GeoDraw software. Our experience in the correction and renewal of the SSM attests to the high potential of the methods of digital soil mapping for these purposes. The analysis of the map in the GIS environment makes it possible to reveal possible drawbacks in the initial map pages and to suggest feasible methods of their correction and further development of the SSM.

  5. Improved effective vector boson approximation revisited

    NASA Astrophysics Data System (ADS)

    Bernreuther, Werner; Chen, Long

    2016-03-01

    We reexamine the improved effective vector boson approximation, which is based on two-vector-boson luminosities L_pol for the computation of weak gauge-boson hard-scattering subprocesses V₁V₂ → W in high-energy hadron-hadron or e⁻e⁺ collisions. We calculate these luminosities for the nine combinations of the transverse and longitudinal polarizations of V₁ and V₂ in the unitary and axial gauge. For these two gauge choices the quality of this approach is investigated for the reactions e⁻e⁺ → W⁻W⁺νₑν̄ₑ and e⁻e⁺ → tt̄νₑν̄ₑ using appropriate phase-space cuts.

  6. An efficient approach to the travelling salesman problem using self-organizing maps.

    PubMed

    Vieira, Frederico Carvalho; Dória Neto, Adrião Duarte; Costa, José Alfredo Ferreira

    2003-04-01

    This paper presents an approach to the well-known Travelling Salesman Problem (TSP) using Self-Organizing Maps (SOM). The SOM algorithm has interesting topological information about the configuration of its neurons in Cartesian space, which can be used to solve optimization problems. Aspects of initialization, parameter adaptation, and complexity analysis of the proposed SOM-based algorithm are discussed. The results show an average deviation of 3.7% from the optimal tour length for a set of 12 TSP instances.
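
    A minimal sketch of the general SOM ("elastic ring") approach to the TSP follows. The initialization, decay schedule, and neuron count below are generic assumptions, not the paper's specific scheme.

```python
import math
import random

# Toy SOM for the TSP: a ring of neurons is pulled toward randomly chosen
# cities; the winning neuron and its ring neighbors move, and both the
# learning rate and the neighborhood width decay over time. Reading the
# ring off in order yields a tour.
def som_tsp(cities, n_neurons=None, iters=3000, seed=1):
    rng = random.Random(seed)
    n = n_neurons or 3 * len(cities)
    # Initialize neurons on a small circle around the city centroid.
    cx = sum(x for x, _ in cities) / len(cities)
    cy = sum(y for _, y in cities) / len(cities)
    ring = [[cx + 0.1 * math.cos(2 * math.pi * i / n),
             cy + 0.1 * math.sin(2 * math.pi * i / n)] for i in range(n)]
    for t in range(iters):
        decay = math.exp(-4.0 * t / iters)
        lr, width = 0.8 * decay, max(1.0, (n / 5.0) * decay)
        x, y = cities[rng.randrange(len(cities))]
        win = min(range(n), key=lambda i: (ring[i][0] - x) ** 2 + (ring[i][1] - y) ** 2)
        for i in range(n):
            d = min(abs(i - win), n - abs(i - win))       # distance on the ring
            g = math.exp(-(d * d) / (2 * width * width))  # neighborhood factor
            ring[i][0] += lr * g * (x - ring[i][0])
            ring[i][1] += lr * g * (y - ring[i][1])
    # Tour: order cities by the ring index of their nearest neuron.
    nearest = lambda c: min(range(n), key=lambda i: (ring[i][0] - c[0]) ** 2 + (ring[i][1] - c[1]) ** 2)
    return sorted(range(len(cities)), key=lambda ci: nearest(cities[ci]))

cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = som_tsp(cities)
print(sorted(tour))  # [0, 1, 2, 3] -- every city appears exactly once
```

The topology-preserving property of the SOM is what makes this work: neighboring neurons on the ring end up near neighboring cities in the plane, so the ring unfolds into a short closed tour.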

  7. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Rita; Videira, Nuno

    2015-12-01

    A participatory system dynamics modelling approach is advanced to support conceptualization of the feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  8. Putting people on the map through an approach that integrates social data in conservation planning.

    PubMed

    Stephanson, Sheri L; Mascia, Michael B

    2014-10-01

    Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, to provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making. PMID:25102957

  10. A pooling-based approach to mapping genetic variants associated with DNA methylation.

    PubMed

    Kaplow, Irene M; MacIsaac, Julia L; Mah, Sarah M; McEwen, Lisa M; Kobor, Michael S; Fraser, Hunter B

    2015-06-01

    DNA methylation is an epigenetic modification that plays a key role in gene regulation. Previous studies have investigated its genetic basis by mapping genetic variants that are associated with DNA methylation at specific sites, but these have been limited to microarrays that cover <2% of the genome and cannot account for allele-specific methylation (ASM). Other studies have performed whole-genome bisulfite sequencing on a few individuals, but these lack statistical power to identify variants associated with DNA methylation. We present a novel approach in which bisulfite-treated DNA from many individuals is sequenced together in a single pool, resulting in a truly genome-wide map of DNA methylation. Compared to methods that do not account for ASM, our approach increases statistical power to detect associations while sharply reducing cost, effort, and experimental variability. As a proof of concept, we generated deep sequencing data from a pool of 60 human cell lines; we evaluated almost twice as many CpGs as the largest microarray studies and identified more than 2000 genetic variants associated with DNA methylation. We found that these variants are highly enriched for associations with chromatin accessibility and CTCF binding but are less likely to be associated with traits indirectly linked to DNA, such as gene expression and disease phenotypes. In summary, our approach allows genome-wide mapping of genetic variants associated with DNA methylation in any tissue of any species, without the need for individual-level genotype or methylation data.

  11. The W Boson Mass Measurement

    NASA Astrophysics Data System (ADS)

    Kotwal, Ashutosh V.

    2016-10-01

    The measurement of the W boson mass has been growing in importance as its precision has improved, along with the precision of other electroweak observables and the top quark mass. Over the last decade, the measurement of the W boson mass has been led at hadron colliders. Combined with the precise measurement of the top quark mass at hadron colliders, the W boson mass helped to pin down the mass of the Standard Model Higgs boson through its induced radiative correction on the W boson mass. With the discovery of the Higgs boson and the measurement of its mass, the electroweak sector of the Standard Model is over-constrained. Increasing the precision of the W boson mass probes new physics at the TeV scale. We summarize an extensive Tevatron (1984-2011) program to measure the W boson mass at the CDF and DØ experiments. We highlight the recent Tevatron measurements and prospects for the final Tevatron measurements.

  12. Brachial approach to NOGA-guided procedures: electromechanical mapping and transendocardial stem-cell injections.

    PubMed

    Banovic, Marko; Ostojic, Miodrag C; Bartunek, Jozef; Nedeljkovic, Milan; Beleslin, Branko; Terzic, Andre

    2011-01-01

    Several methods are available for delivering stem cells to the heart. Recent studies have highlighted the advantages of injecting the cells directly into the myocardium in order to increase myocardial retention of cells. A particular focus has been on percutaneous transendocardial injection, facilitated by electromechanical mapping. The NOGA XP Cardiac Navigation System has a multicomponent catheter that is designed to guide and deliver transendocardial injections via a transfemoral approach, without a guidewire. However, this method may not be feasible in some patients who have peripheral vascular disease. Herein, we describe the case of a 68-year-old man whose tortuous, sharply angled iliac arteries precluded a femoral approach to transendocardial injection. To overcome the anatomic and mechanical challenges, we used a brachial approach. We believe that this is the 1st report of using the brachial route for transendocardial injection, and that it can be a viable alternative to the transfemoral approach in selected patients.

  13. An assessment of a collaborative mapping approach for exploring land use patterns for several European metropolises

    NASA Astrophysics Data System (ADS)

    Jokar Arsanjani, Jamal; Vaz, Eric

    2015-03-01

    Until recently, land surveys and digital interpretation of remotely sensed imagery have been used to generate land use inventories. These techniques, however, are often cumbersome and costly, incurring large technical and time costs. The technological advances of Web 2.0 have brought a wide array of technological achievements, stimulating the participatory role in collaborative and crowdsourced mapping products. This has been fostered by GPS-enabled devices and accessible tools that enable visual interpretation of high-resolution satellite images and air photos provided in collaborative mapping projects. Such technologies offer an integrative approach to geography by promoting public participation and allowing accurate assessment and classification of land use as well as geographical features. OpenStreetMap (OSM) has supported the evolution of such techniques, contributing to the existence of a large inventory of spatial land use information. This paper explores the introduction of this novel participatory phenomenon for land use classification in Europe's metropolitan regions. We adopt a positivistic approach to comparatively assess the accuracy of OSM contributions to land use classification in seven large European metropolitan regions. Thematic accuracy and degree of completeness of OSM data were compared to available Global Monitoring for Environment and Security Urban Atlas (GMESUA) datasets for the chosen metropolises. We further extend our findings of land use within a novel framework for geography, arguing that volunteered geographic information (VGI) sources are of great benefit for land use mapping depending on location and degree of VGI dynamism and offer a great alternative to traditional mapping techniques for metropolitan regions throughout Europe. Evaluation of several land use types at the local level suggests that a number of OSM classes (such as anthropogenic land use, agricultural and some natural environment

  14. MAP3D: a media processor approach for high-end 3D graphics

    NASA Astrophysics Data System (ADS)

    Darsa, Lucia; Stadnicki, Steven; Basoglu, Chris

    1999-12-01

    Equator Technologies, Inc. has used a software-first approach to produce several programmable and advanced VLIW processor architectures that have the flexibility to run both traditional systems tasks and an array of media-rich applications. For example, Equator's MAP1000A is the world's fastest single-chip programmable signal and image processor targeted for digital consumer and office automation markets. The Equator MAP3D is a proposal for the architecture of the next generation of the Equator MAP family. The MAP3D is designed to achieve high-end 3D performance and a variety of customizable special effects by combining special graphics features with high-performance floating-point and media processor architecture. As a programmable media processor, it offers the advantages of a completely configurable 3D pipeline--allowing developers to experiment with different algorithms and to tailor their pipeline to achieve the highest performance for a particular application. With the support of Equator's advanced C compiler and toolkit, MAP3D programs can be written in a high-level language. This allows the compiler to successfully find and exploit any parallelism in a programmer's code, thus decreasing the time to market of a given application. The ability to run an operating system makes it possible to run concurrent applications in the MAP3D chip, such as video decoding while executing the 3D pipelines, so that integration of applications is easily achieved--using real-time decoded imagery for texturing 3D objects, for instance. This novel architecture enables an affordable, integrated solution for high performance 3D graphics.

  15. Bosonization of Weyl Fermions

    NASA Astrophysics Data System (ADS)

    Marino, Eduardo

    The electron, discovered by Thomson by the end of the nineteenth century, was the first experimentally observed particle. The Weyl fermion, though theoretically predicted long ago, was observed in a condensed matter environment in an experiment reported only a few weeks ago. Is there any linking thread connecting the first and the last observed fermion (quasi)particles? The answer is positive. By generalizing the method known as bosonization, for the first time in its complete form, to a spacetime with 3+1 dimensions, we are able to show that both electrons and Weyl fermions can be expressed in terms of the same boson field, namely the Kalb-Ramond anti-symmetric tensor gauge field. The bosonized form of the Weyl chiral currents leads to the angle-dependent magneto-conductance behavior observed in these systems.

  16. Extreme rainfall distribution mapping: Comparison of two approaches in West Africa

    NASA Astrophysics Data System (ADS)

    Panthou, G.; Vischel, T.; Lebel, T.; Blanchet, J.; Quantin, G.; Ali, A.

    2012-12-01

    In a world where populations are increasingly exposed to natural hazards, extreme rainfall mapping remains an important subject of research. Extreme rainfall maps are required for both flood risk management and civil engineering structure design, the challenge being to take into account the local information provided by point rainfall series as well as the necessity of some regional coherency. Such coherency is not guaranteed when extreme value distributions are fitted separately to individual maximum rainfall series. Two approaches based on extreme value theory (Block Maxima Analysis) are compared here, with an application to extreme rainfall mapping in West Africa. Annual daily rainfall maxima are extracted from rain-gauge series and modeled over the study region by GEV (Generalized Extreme Value) distributions. The two approaches are: (i) the Local Fit and Interpolation (LFI) approach, which consists of spatially interpolating the GEV distribution parameters estimated independently for each rain-gauge series; (ii) the Spatial Maximum Likelihood Estimation (SMLE) approach, which directly estimates the GEV distribution over the entire region by a single maximum likelihood fit, jointly using all measurements combined with spatial covariates. Five LFI and three SMLE methods are considered, using the information provided by 126 daily rainfall series covering the period 1950-1990. The methods are first evaluated in calibration. Then the predictive skills and the robustness are assessed through a cross-validation and an independent network validation process. The SMLE approach, especially when using the mean annual rainfall as covariate, appears to perform better for most of the scores computed. Using a reference series of 104 years of daily data recorded at Niamey (Niger), it is also shown that the SMLE approach has the capacity to deal more efficiently with the effect of local outliers by using the spatial information provided by nearby stations.
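
    Whichever way the GEV parameters are estimated (cell by cell with LFI, or jointly over the region with SMLE), the mapped quantity is typically a T-year return level, which follows from the GEV quantile function in closed form. The parameter values below are purely illustrative, not fitted values from the study.

```python
import math

# T-year return level of a GEV(mu, sigma, xi) distribution: the rainfall
# depth exceeded on average once every T years.
def gev_return_level(mu, sigma, xi, T):
    y = -math.log(1.0 - 1.0 / T)   # -log of the non-exceedance probability
    if abs(xi) < 1e-9:             # Gumbel limit as the shape xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Hypothetical parameters (mm of daily rainfall), for illustration only.
print(round(gev_return_level(mu=55.0, sigma=15.0, xi=0.1, T=100), 1))  # 142.6
```

A positive shape parameter, as used here, gives a heavy upper tail, so the return level grows quickly with T; the LFI and SMLE approaches differ only in how mu, sigma, and xi vary across the map, not in this formula.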

  17. Anomalous gauge boson interactions

    SciTech Connect

    Aihara, H.; Barklow, T.; Baur, U.

    1995-03-01

    We discuss the direct measurement of the trilinear vector boson couplings in present and future collider experiments. The major goals of such experiments will be the confirmation of the Standard Model (SM) predictions and the search for signals of new physics. We review our current theoretical understanding of anomalous trilinear gauge-boson self-interactions. If the energy scale of the new physics is ~1 TeV, these low energy anomalous couplings are expected to be no larger than O(10⁻²). Constraints from high precision measurements at LEP and low energy charged and neutral current processes are critically reviewed.

  18. Higgs boson hunting

    SciTech Connect

    Dawson, S.; Haber, H.E.; Rindani, S.D.

    1989-05-01

    This is the summary report of the Higgs Boson Working Group. We discuss a variety of search techniques for a Higgs boson which is lighter than the Z. The processes K → πH, η′ → ηH, Υ → Hγ, and e⁺e⁻ → ZH are examined with particular attention paid to theoretical uncertainties in the calculations. We also briefly examine new features of Higgs phenomenology in a model which contains Higgs triplets as well as the usual doublet of scalar fields. 33 refs., 6 figs., 1 tab.

  19. A branch-and-cut approach to physical mapping of chromosomes by unique end-probes.

    PubMed

    Christof, T; Jünger, M; Kececioglu, J; Mutzel, P; Reinelt, G

    1997-01-01

    A fundamental problem in computational biology is the construction of physical maps of chromosomes from hybridization experiments between unique probes and clones of chromosome fragments in the presence of error. Alizadeh, Karp, Weisser and Zweig (Algorithmica 13:1/2, 52-76, 1995) first considered a maximum-likelihood model of the problem that is equivalent to finding an ordering of the probes that minimizes a weighted sum of errors and developed several effective heuristics. We show that by exploiting information about the end-probes of clones, this model can be formulated as a Weighted Betweenness Problem. This affords the significant advantage of allowing the well-developed tools of integer linear-programming and branch-and-cut algorithms to be brought to bear on physical mapping, enabling us for the first time to solve small mapping instances to optimality even in the presence of high error. We also show that by combining the optimal solution of many small overlapping Betweenness Problems, one can effectively screen errors from larger instances and solve the edited instance to optimality as a Hamming-Distance Traveling Salesman Problem. This suggests a new approach, a Betweenness-Traveling Salesman hybrid, for constructing physical maps.
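
    As a toy illustration of the weighted betweenness objective (brute force over orderings, nothing like the paper's branch-and-cut machinery, which scales to far larger instances): a constraint (a, b, c, w) charges weight w unless probe b lies between probes a and c in the ordering. The probe names and weights below are hypothetical.

```python
from itertools import permutations

# Weighted betweenness cost of a probe ordering: sum the weights of all
# violated constraints (a, b, c, w), i.e. those where b is NOT between
# a and c in the ordering.
def violation_cost(order, constraints):
    pos = {p: i for i, p in enumerate(order)}
    cost = 0.0
    for a, b, c, w in constraints:
        if not (pos[a] < pos[b] < pos[c] or pos[c] < pos[b] < pos[a]):
            cost += w
    return cost

# Exhaustive search over all orderings -- feasible only for tiny instances.
def best_ordering(probes, constraints):
    return min(permutations(probes), key=lambda o: violation_cost(o, constraints))

constraints = [("p1", "p2", "p3", 2.0), ("p2", "p3", "p4", 1.0)]
order = best_ordering(["p1", "p2", "p3", "p4"], constraints)
print(violation_cost(order, constraints))  # 0.0
```

The ILP formulation in the paper replaces this enumeration with betweenness variables and cutting planes, which is what makes provably optimal solutions possible in the presence of high error.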

  20. Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Herkenhoff, K.

    2010-01-01

    The 1:500K-scale geologic map of MTM 85200 - the Olympia Cavi region of Mars - has been submitted for peer review [1]. Physiographically, the quadrangle includes portions of Olympia Rupes, a set of sinuous scarps which elevate Planum Boreum 800 meters above Olympia Planum. The region includes the high-standing, spiral troughs of Boreales Scopuli, the rugged and deep depressions of Olympia Cavi, and the vast dune fields of Olympia Undae. Geologically, the mapped units and landforms reflect the recent history of repeated accumulation and degradation. The widespread occurrence of both weakly and strongly stratified units implicates the drape-like accumulation of ice, dust, and sand through climatic variations. Similarly, the occurrence of layer truncations, particularly at unit boundaries, implicates punctuated periods of both localized and regional erosion and surface deflation whereby underlying units were exhumed and their material transported and re-deposited. Herein, we focus on the iterative mapping approaches that not only accommodated the burgeoning variety and volume of data sets but also facilitated the efficient presentation of map information. Unit characteristics and their geologic history are detailed in past abstracts [2-3].

  1. A reciprocal space approach for locating symmetry elements in Patterson superposition maps

    SciTech Connect

    Hendrixson, T.

    1990-09-21

    A method for determining the location and possible existence of symmetry elements in Patterson superposition maps has been developed. A comparison of the original superposition map and a superposition map operated on by the symmetry element gives possible translations to the location of the symmetry element. A reciprocal space approach using structure factor-like quantities obtained from the Fourier transform of the superposition function is then used to determine the "best" location of the symmetry element. Constraints based upon the space group requirements are also used as a check on the locations. The locations of the symmetry elements are used to modify the Fourier transform coefficients of the superposition function to give an approximation of the structure factors, which are then refined using the EG relation. The analysis of several compounds using this method is presented. Reciprocal space techniques for locating multiple images in the superposition function are also presented, along with methods to remove the effect of multiple images in the Fourier transform coefficients of the superposition map. In addition, crystallographic studies of the extended chain structure of (NHC₅H₅)SbI₄ and of the twinning method of the orthorhombic form of the high-Tc superconductor YBa₂Cu₃O₇₋ₓ are presented. 54 refs.

  2. Mapping Variable Width Riparian Zones Utilizing Open Source Data: A Robust New Approach

    NASA Astrophysics Data System (ADS)

    Abood, S. A.; Maclean, A.

    2013-12-01

    Riparian buffers are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Previous approaches to riparian buffer delineation have primarily utilized fixed-width buffers. However, these methodologies take only the watercourse into consideration and ignore critical geomorphology, associated vegetation, and soil characteristics. Utilizing spatial data readily available from government agencies and geospatial clearinghouses, such as DEMs and the National Hydrography Dataset, the Riparian Buffer Delineation Model (RBDM) offers advantages by harnessing the geospatial modeling capabilities of ArcMap GIS, incorporating a statistically valid sampling technique along the watercourse to accurately map the critical 50-year flood plain, and delineating a variable-width riparian buffer. Options within the model allow incorporation of National Wetlands Inventory (NWI), Soil Survey Data (SSURGO), National Land Cover Data (NLCD), and/or Cropland Data Layer (CDL) data to improve the accuracy and utility of the riparian buffer attributes. This approach recognizes the dynamic and transitional nature of riparian buffers by accounting for hydrologic, geomorphic, and vegetation data as inputs into the delineation process. By allowing the incorporation of land cover data, decision makers acquire a useful tool to assist in managing riparian buffers. The model is formatted as an ArcMap toolbox for easy installation and does require a Spatial Analyst license. [Figure: variable-width riparian buffer utilizing the 50-year flood height and a 10 m DEM; RBDM inputs.]

  3. An optimization approach for mapping and measuring the divergence and correspondence between paths.

    PubMed

    Mueller, Shane T; Perelman, Brandon S; Veinott, Elizabeth S

    2016-03-01

    Many domains of empirical research produce or analyze spatial paths as a measure of behavior. Previously, approaches for measuring the similarity or deviation between two paths have either required timing information or used ad hoc or manual coding schemes. In this paper, we describe an optimization approach, which we call ALCAMP (Algorithm for finding the Least-Cost Areal Mapping between Paths), for robustly measuring the area-based deviation between two paths. ALCAMP measures the deviation between two paths and produces a mapping between corresponding points on the two paths. The method is robust to a number of aspects of real path data, such as crossovers, self-intersections, differences in path segmentation, and partial or incomplete paths. Unlike similar algorithms that produce distance metrics between trajectories (i.e., paths that include timing information), this algorithm uses only the order of observed path segments to determine the mapping. We describe the algorithm and show its results on a number of sample problems and data sets, and demonstrate its effectiveness for assessing human memory for paths. We also describe available software code written in the R statistical computing language that implements the algorithm to enable data analysis.
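
    ALCAMP itself handles crossovers, self-intersections, and partial paths. As a toy illustration of what an area-based deviation means in the simplest case (two paths that share endpoints and do not cross), the enclosed region is a simple polygon whose area the shoelace formula gives; this is not the ALCAMP algorithm.

```python
# Area enclosed between two non-crossing paths with shared endpoints:
# walk path A forward and path B backward to close a simple polygon,
# then apply the shoelace formula.
def enclosed_area(path_a, path_b):
    polygon = list(path_a) + list(reversed(path_b))[1:-1]  # drop shared endpoints
    s = 0.0
    for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

a = [(0, 0), (1, 0), (2, 0)]   # straight route
b = [(0, 0), (1, 1), (2, 0)]   # detour through (1, 1)
print(enclosed_area(a, b))     # 1.0
```

The optimization in ALCAMP generalizes this idea: it searches over point correspondences between the two paths so that the total mapped area is minimized even when the simple-polygon assumption fails.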

  4. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as subsequent multiattribute frontiers in dimensions of individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. PMID:23339716
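
    The successive-frontier ranking described above can be sketched directly: peel off the non-dominated map elements as frontier 1, then repeat on the remainder. The three criteria and the scores below are hypothetical; the only assumption encoded is that a higher value means higher risk on every criterion.

```python
# a dominates b if a is at least as risky on every criterion and
# strictly riskier on at least one.
def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Rank map cells by successive multiattribute (Pareto) frontiers:
# frontier 1 holds the non-dominated cells, frontier 2 the cells
# dominated only by frontier 1, and so on.
def frontier_ranks(cells):
    remaining = dict(enumerate(cells))
    ranks, rank = {}, 1
    while remaining:
        front = [i for i, c in remaining.items()
                 if not any(dominates(o, c) for j, o in remaining.items() if j != i)]
        for i in front:
            ranks[i] = rank
            del remaining[i]
        rank += 1
    return ranks

# Cells scored on (climatic suitability, host abundance, introduction potential).
cells = [(0.9, 0.8, 0.7), (0.5, 0.9, 0.6), (0.4, 0.4, 0.4), (0.9, 0.8, 0.8)]
print(frontier_ranks(cells))  # {1: 1, 3: 1, 0: 2, 2: 3}
```

Note that cell 1 sits on the first frontier despite a low first criterion: no other cell beats it on all criteria at once, which is exactly the tradeoff behavior that distinguishes frontier ranking from weighted averaging.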

  5. An expert knowledge-based approach to landslide susceptibility mapping using GIS and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Zhu, A.-Xing; Wang, Rongxun; Qiao, Jianping; Qin, Cheng-Zhi; Chen, Yongbo; Liu, Jing; Du, Fei; Lin, Yang; Zhu, Tongxin

    2014-06-01

    This paper presents an expert knowledge-based approach to landslide susceptibility mapping in an effort to overcome the deficiencies of data-driven approaches. The proposed approach consists of three generic steps: (1) extraction of knowledge on the relationship between landslide susceptibility and predisposing factors from domain experts, (2) characterization of predisposing factors using GIS techniques, and (3) prediction of landslide susceptibility under fuzzy logic. The approach was tested in two study areas in China - the Kaixian study area (about 250 km2) and the Three Gorges study area (about 4600 km2). The Kaixian study area was used to develop the approach and to evaluate its validity. The Three Gorges study area was used to test both the portability and the applicability of the developed approach for mapping landslide susceptibility over large study areas. Performance was evaluated by examining whether the mean of the computed susceptibility values at landslide sites was statistically different from that of the entire study area. A z-score test was used to examine the statistical significance of the difference. The computed z for the Kaixian area was 3.70 and the corresponding p-value was less than 0.001. This suggests that the computed landslide susceptibility values are good indicators of landslide occurrences. In the Three Gorges study area, the computed z was 10.75 and the corresponding p-value was less than 0.001. In addition, we divided the susceptibility value into four levels: low (0.0-0.25), moderate (0.25-0.5), high (0.5-0.75) and very high (0.75-1.0). No landslides were found in areas of low susceptibility. Landslide density was about three times higher in areas of very high susceptibility than in areas of moderate susceptibility, and more than twice as high as in areas of high susceptibility. The results from the Three Gorges study area suggest that the extracted expert knowledge can be extrapolated to another study area and the
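
    The significance test described above can be sketched as a one-sample z test comparing the mean susceptibility at landslide sites against the study-area mean. This is an illustrative reconstruction, assuming the study-area standard deviation is used as the population spread:

```python
import math

def z_score(site_values, area_mean, area_std):
    """One-sample z test: is the mean susceptibility at landslide sites
    different from the study-area mean? area_std is the standard
    deviation of susceptibility values over the whole study area."""
    n = len(site_values)
    site_mean = sum(site_values) / n
    return (site_mean - area_mean) / (area_std / math.sqrt(n))
```

    A large positive z (as in the 3.70 and 10.75 values reported above) indicates that landslides cluster in cells rated as more susceptible than the study-area average.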

  6. A direct approach to generalised multiple mapping conditioning for selected turbulent diffusion flame cases

    NASA Astrophysics Data System (ADS)

    Sundaram, Brruntha; Klimenko, Alexander Yuri; Cleary, Matthew John; Ge, Yipeng

    2016-07-01

    This work presents a direct and transparent interpretation of two concepts for modelling turbulent combustion: generalised Multiple Mapping Conditioning (MMC) and sparse-Lagrangian Large Eddy Simulation (LES). The MMC approach is presented as a hybrid between the Probability Density Function (PDF) method and approaches based on conditioning (e.g. Conditional Moment Closure, flamelet, etc.). The sparse-Lagrangian approach, which allows for a dramatic reduction of computational cost, is viewed as an alternative interpretation of the Filtered Density Function (FDF) methods. This work presents simulations of several turbulent diffusion flame cases and discusses the universality of the localness parameter between these cases and the universality of sparse-Lagrangian FDF methods with MMC.

  7. ConMap: Investigating new computer-based approaches to assessing conceptual knowledge structure in physics

    NASA Astrophysics Data System (ADS)

    Beatty, Ian D.

    2000-06-01

    There is a growing consensus among educational researchers that traditional problem-based assessments are not effective tools for diagnosing a student's knowledge state and for guiding pedagogical intervention, and that new tools grounded in the results of cognitive science research are needed. The ConMap (``Conceptual Mapping'') project, described in this dissertation, proposed and investigated some novel methods for assessing the conceptual knowledge structure of physics students. A set of brief computer-administered tasks for eliciting students' conceptual associations was designed. The basic approach of the tasks was to elicit spontaneous term associations from subjects by presenting them with a prompt term, or problem, or topic area, and having them type a set of response terms. Each response was recorded along with the time spent thinking of and typing it. Several studies were conducted in which data was collected on introductory physics students' performance on the tasks. A detailed statistical description of the data was compiled. Phenomenological characterization of the data (description and statistical summary of observed patterns) provided insight into the way students respond to the tasks, and discovered some notable features to guide modeling efforts. Possible correlations were investigated, some among different aspects of the ConMap data, others between aspects of the data and students' in-course exam scores. Several correlations were found which suggest that the ConMap tasks can successfully reveal information about students' knowledge structuring and level of expertise. Similarity was observed between data from one of the tasks and results from a traditional concept map task. Two rudimentary quantitative models for the temporal aspects of student performance on one of the tasks were constructed, one based on random probability distributions and the other on a detailed deterministic representation of conceptual knowledge structure. Both models were

  8. A GIS based method for soil mapping in Sardinia, Italy: a geomatic approach.

    PubMed

    Vacca, A; Loddo, S; Melis, M T; Funedda, A; Puddu, R; Verona, M; Fanni, S; Fantola, F; Madrau, S; Marrone, V A; Serra, G; Tore, C; Manca, D; Pasci, S; Puddu, M R; Schirru, P

    2014-06-01

    A new project was recently initiated for the realization of the "Land Unit and Soil Capability Map of Sardinia" at a scale of 1:50,000 to support land use planning. In this study, we outline the general structure of the project and the methods used in the activities conducted thus far. A GIS approach was used. We used the soil-landscape paradigm for the prediction of soil classes and their spatial distribution, or of soil properties, based on landscape features. The work is divided into two main phases. In the first phase, the available digital data on land cover, geology and topography were processed and classified according to their influence on weathering processes and soil properties. The methods used in the interpretation are based on consolidated and generalized knowledge about the influence of geology, topography and land cover on soil properties. The existing soil data (areal and point data) were collected, reviewed, validated and standardized according to international and national guidelines. Point data considered to be usable were input into a specific database created for the project. Using expert interpretation, all digital data were merged to produce a first draft of the Land Unit Map. During the second phase, this map will be refined using the existing soil data and verified in the field, with new soil data collected where needed, and the final Land Unit Map will be produced. The Land Unit and Soil Capability Map will then be produced by classifying the land units using a reference matching table of land capability classes created for this project. PMID:24315681

  9. A Probabilistic Approach to Receptive Field Mapping in the Frontal Eye Fields

    PubMed Central

    Mayo, J. Patrick; Morrison, Robert M.; Smith, Matthew A.

    2016-01-01

    Studies of the neuronal mechanisms of perisaccadic vision often lack the resolution needed to determine important changes in receptive field (RF) structure. Such limited analytical power can lead to inaccurate descriptions of visuomotor processing. To address this issue, we developed a precise, probabilistic technique that uses a generalized linear model (GLM) for mapping the visual RFs of frontal eye field (FEF) neurons during stable fixation (Mayo et al., 2015). We previously found that full-field RF maps could be obtained using 1–8 dot stimuli presented at frame rates of 10–150 ms. FEF responses were generally robust to changes in the number of stimuli presented or the rate of presentation, which allowed us to visualize RFs over a range of spatial and temporal resolutions. Here, we compare the quality of RFs obtained over different stimulus and GLM parameters to facilitate future work on the detailed mapping of FEF RFs. We first evaluate the interactions between the number of stimuli presented per trial, the total number of trials, and the quality of RF mapping. Next, we vary the spatial resolution of our approach to illustrate the tradeoff between visualizing RF sub-structure and sampling at high resolutions. We then evaluate local smoothing as a possible correction for situations where under-sampling occurs. Finally, we provide a preliminary demonstration of the usefulness of a probabilistic approach for visualizing full-field perisaccadic RF shifts. Our results present a powerful, and perhaps necessary, framework for studying perisaccadic vision that is applicable to FEF and possibly other visuomotor regions of the brain. PMID:27047352

  10. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process of flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D hydraulic-hydrodynamic models used were: HECRAS, MIKE11, LISFLOOD, XPSTORM. The 2D hydraulic-hydrodynamic models used were: MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were: HECRAS(1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform), XPSTORM(1D/2D). The validation of flood extent was carried out with 2x2 contingency tables between simulated and observed flooded areas for an extreme historical flash flood event, using the Critical Success Index skill score. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of performing sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
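
    The Critical Success Index used for validation is computed from the 2x2 contingency table of simulated versus observed flooded cells. A minimal sketch (cells passed as flattened boolean sequences; names are ours):

```python
def critical_success_index(sim, obs):
    """Skill score from a 2x2 contingency table between simulated and
    observed flooded cells: CSI = hits / (hits + misses + false_alarms).
    Correct negatives (dry in both) do not enter the score."""
    hits = sum(1 for s, o in zip(sim, obs) if s and o)
    misses = sum(1 for s, o in zip(sim, obs) if not s and o)
    false_alarms = sum(1 for s, o in zip(sim, obs) if s and not o)
    return hits / (hits + misses + false_alarms)
```

    CSI ranges from 0 (no overlap between simulated and observed flood extent) to 1 (perfect agreement).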

  11. Functional Connectivity-Based Parcellation of Amygdala Using Self-Organized Mapping: A Data Driven Approach

    PubMed Central

    Mishra, Arabinda; Rogers, Baxter P.; Chen, Li Min; Gore, John C.

    2013-01-01

    The overall goal of this work is to demonstrate how resting state functional magnetic resonance imaging (fMRI) signals may be used to objectively parcellate functionally heterogeneous subregions of the human amygdala into structures characterized by similar patterns of functional connectivity. We hypothesize that similarity of functional connectivity of subregions with other parts of the brain can be a potential basis to segment and cluster voxels using data driven approaches. In this work, a self-organizing map (SOM) was implemented to cluster the connectivity maps associated with each voxel of the human amygdala, thereby defining distinct subregions. The functional separation was optimized by evaluating the overall differences in functional connectivity between the subregions at the group level. Analysis of 25 resting state fMRI data sets suggests that SOM can successfully identify functionally independent nuclei based on differences in their inter-subregional functional connectivity, evaluated statistically at various confidence levels. Although the amygdala contains several nuclei whose distinct roles are implicated in various functions, our objective approach discerns at least two functionally distinct volumes comparable to previous parcellation results obtained using probabilistic tractography and cytoarchitectonic analysis. Association of these nuclei with various known functions and a quantitative evaluation of their differences in overall functional connectivity with the lateral orbital frontal cortex and temporal pole confirms the functional diversity of the amygdala. The data driven approach adopted here may be used as a powerful indicator of structure–function relationships in the amygdala and other functionally heterogeneous structures as well. PMID:23418140
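
    The core clustering step can be sketched with a minimal 1-D SOM over the rows of a voxel-by-target connectivity matrix. This is an illustrative toy implementation, not the authors' pipeline; parameters and names are ours:

```python
import numpy as np

def train_som(data, n_units=2, epochs=50, lr0=0.5):
    """Minimal 1-D self-organizing map. Each unit's weight vector is a
    prototype connectivity map; the winning unit and its neighbors are
    dragged toward each sample, with a neighborhood that shrinks over
    training. Returns the prototypes and each row's winning unit."""
    idx = np.linspace(0, len(data) - 1, n_units).round().astype(int)
    w = data[idx].astype(float).copy()  # spread initial prototypes over the data
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                      # decaying learning rate
        sigma = max(n_units / 2 * (1 - t / epochs), 0.5)  # shrinking neighborhood
        for x in data:
            bmu = int(np.argmin(((w - x) ** 2).sum(axis=1)))  # best-matching unit
            h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    labels = np.array([int(np.argmin(((w - x) ** 2).sum(axis=1))) for x in data])
    return w, labels
```

    Applied to voxel connectivity maps, the resulting labels partition the structure into subregions with similar whole-brain connectivity profiles.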

  12. An internal state variable mapping approach for Li-Plating diagnosis

    NASA Astrophysics Data System (ADS)

    Bai, Guangxing; Wang, Pingfeng

    2016-08-01

    Li-ion battery failure has become one of the major challenges for reliable battery applications, as it can have catastrophic consequences. Compared with capacity fading resulting from calendar effects, Li-plating induced battery failures are more difficult to identify, as they cause sudden capacity loss, leaving limited time for failure diagnosis. This paper presents a new internal state variable (ISV) mapping approach to identify values of immeasurable battery ISVs, considering changes of inherent parameters of battery system dynamics, for Li-plating diagnosis. Employing the developed ISV mapping approach, an explicit functional relationship model between measurable battery signals and immeasurable battery ISVs can be developed. The developed model can then be used to identify ISVs from an online battery system in order to detect the occurrence of Li-plating. Employing multiphysics-based simulation of Li-plating using COMSOL, the proposed diagnosis approach is implemented under different conditions in the case studies to demonstrate its efficacy in diagnosing Li-plating onset timings.

  13. Target fishing of glycopentalone using integrated inverse docking and reverse pharmacophore mapping approach.

    PubMed

    Gurung, A B; Ali, M A; Bhattacharjee, A; Al-Anazi, K M; Farah, M A; Al-Hemaid, F M; Abou-Tarboush, F M; Lee, J; Kim, S Y; Al-Anazi, F S M

    2016-01-01

    Glycopentalone, isolated from Glycosmis pentaphylla (family Rutaceae), has cytotoxic and apoptosis-inducing effects in various human cancer cell lines; however, its mode of action is not known. Therefore, target fishing using a combined inverse docking and reverse pharmacophore mapping approach was employed to identify potential targets of glycopentalone and gain insight into its binding modes against the selected molecular targets, viz., CDK-2, CDK-6, Topoisomerase I, Bcl-2, VEGFR-2, Telomere:G-quadruplex and Topoisomerase II. These targets were chosen based on their key roles in the progression of cancer via regulation of cell cycle and DNA replication. Molecular docking analysis revealed that glycopentalone displayed binding energies ranging from -6.38 to -8.35 kcal/mol and inhibition constants ranging from 0.758 to 20.90 μM. Further, the binding affinities of glycopentalone to the targets were in the order: Telomere:G-quadruplex > VEGFR-2 > CDK-6 > CDK-2 > Topoisomerase II > Topoisomerase I > Bcl-2. Binding mode analysis revealed critical hydrogen bonds as well as hydrophobic interactions with the targets. The targets were validated by reverse pharmacophore mapping of glycopentalone against a set of 2241 known human target proteins, which revealed CDK-2 and VEGFR-2 as the most favorable targets. Glycopentalone mapped well to CDK-2 and VEGFR-2, which involve six pharmacophore features (two hydrophobic centers and four hydrogen bond acceptors) and nine pharmacophore features (five hydrophobic, two hydrogen bond acceptors and two hydrogen bond donors), respectively. The present computational approach may aid in the rational identification of targets for small molecules against a large set of candidate macromolecules before bioassay validation. PMID:27525951
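
    The reported binding energies and inhibition constants are linked by the standard relation ΔG = RT ln Ki, the conversion AutoDock-style docking scores typically use. A quick consistency check, assuming T ≈ 298 K:

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)
T = 298.15    # assumed room temperature, K

def inhibition_constant(delta_g_kcal):
    # Ki (molar) from binding free energy: delta_G = R * T * ln(Ki)
    return math.exp(delta_g_kcal / (R * T))
```

    With these constants, -8.35 kcal/mol corresponds to roughly 0.76 μM and -6.38 kcal/mol to roughly 21 μM, consistent with the quoted range of 0.758-20.90 μM.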

  14. Expansion of a quantum degenerate boson-fermion mixture

    SciTech Connect

    Hu, Hui; Liu, Xia-Ji; Modugno, Michele

    2003-06-01

    We study the expansion of an ultracold boson-fermion mixture released from an elongated magnetic trap, by using a scaling approach. We discuss in detail the role of the boson-fermion interaction on the evolution of the radial-to-axial aspect ratio of the condensate, and show that the latter depends crucially on the relative dynamics of the condensate and degenerate Fermi gas in the radial direction, which is characterized by the ratio between the trapping frequencies for fermions and bosons. The numerical solution of the scaling equations provides a reasonable agreement with the recent experiment [G. Roati et al., Phys. Rev. Lett. 89, 150403 (2002)].

  15. Resummation of Goldstone boson contributions to the MSSM effective potential

    NASA Astrophysics Data System (ADS)

    Kumar, Nilanjana; Martin, Stephen P.

    2016-07-01

    We discuss the resummation of the Goldstone boson contributions to the effective potential of the minimal supersymmetric Standard Model. This eliminates the formal problems of spurious imaginary parts and logarithmic singularities in the minimization conditions when the tree-level Goldstone boson squared masses are negative or approach zero. The numerical impact of the resummation is shown to be almost always very small. We also show how to write the two-loop minimization conditions so that Goldstone boson squared masses do not appear at all, and so that they can be solved without iteration.

  16. Unitarity-controlled resonances after the Higgs boson discovery

    NASA Astrophysics Data System (ADS)

    Englert, Christoph; Harris, Philip; Spannowsky, Michael; Takeuchi, Michihisa

    2015-07-01

    If the recently discovered Higgs boson's couplings deviate from the Standard Model expectation, we may anticipate new resonant physics in the weak boson fusion channels resulting from high scale unitarity sum rules of longitudinal gauge boson scattering. Motivated by excesses in analyses of multi-leptons + missing energy + jets final states during run 1, we perform a phenomenological investigation of these channels at the LHC bounded by current Higgs coupling constraints. Such an approach constrains the prospects to observe such new physics at the LHC as a function of very few and generic parameters and allows the investigation of the strong requirement of probability conservation in the electroweak sector to high energies.

  17. A self organizing map approach to physiological data analysis for enhanced group performance.

    SciTech Connect

    Doser, Adele Beatrice; Merkle, Peter Benedict

    2004-10-01

    A Self Organizing Map (SOM) approach was used to analyze physiological data taken from a group of subjects participating in a cooperative video shooting game. The ultimate aim was to discover signatures of group cooperation, conflict, leadership, and performance. Such information could be fed back to participants in a meaningful way, and ultimately increase group performance in national security applications, where the consequences of a poor group decision can be devastating. Results demonstrated that a SOM can be a useful tool in revealing individual and group signatures from physiological data, and could ultimately be used to heighten group performance.

  18. Pervasive Radio Mapping of Industrial Environments Using a Virtual Reality Approach.

    PubMed

    Nedelcu, Adrian-Valentin; Machedon-Pisu, Mihai; Duguleana, Mihai; Talaba, Doru

    2015-01-01

    Wireless communications in industrial environments are seriously affected by reliability and performance issues, due to the multipath nature of obstacles within such environments. Special attention needs to be given to planning a wireless industrial network, so as to find the optimum spatial position for each of the nodes within the network, and especially for key nodes such as gateways or cluster heads. The aim of this paper is to present a pervasive radio mapping system which captures (senses) data regarding the radio spectrum, using low-cost wireless sensor nodes. This data is the input of radio mapping algorithms that generate electromagnetic propagation profiles. Such profiles are used for identifying obstacles within the environment and optimum propagation pathways. With the purpose of further optimizing the radio planning process, the authors propose a novel human-network interaction (HNI) paradigm that uses 3D virtual environments in order to display the radio maps in a natural, easy-to-perceive manner. The results of this approach illustrate its added value to the field of radio resource planning of industrial communication systems. PMID:26167533

  19. Large Deformation Multiresolution Diffeomorphic Metric Mapping for Multiresolution Cortical Surfaces: A Coarse-to-Fine Approach.

    PubMed

    Tan, Mingzhen; Qiu, Anqi

    2016-09-01

    Brain surface registration is an important tool for characterizing cortical anatomical variations and understanding their roles in normal cortical development and psychiatric diseases. However, surface registration remains challenging due to complicated cortical anatomy and its large differences across individuals. In this paper, we propose a fast coarse-to-fine algorithm for surface registration by adapting the large deformation diffeomorphic metric mapping (LDDMM) framework for surface mapping and show improvements in speed and accuracy via a multiresolution analysis of surface meshes and the construction of multiresolution diffeomorphic transformations. The proposed method constructs a family of multiresolution meshes that are used as natural sparse priors of the cortical morphology. At varying resolutions, these meshes act as anchor points where the parameterization of multiresolution deformation vector fields can be supported, allowing the construction of a bundle of multiresolution deformation fields, each originating from a different resolution. Using a coarse-to-fine approach, we show a potential reduction in computation cost along with improvements in sulcal alignment when compared with LDDMM surface mapping. PMID:27254865

  1. An entropy-driven matrix completion (E-MC) approach to complex network mapping

    NASA Astrophysics Data System (ADS)

    Koochakzadeh, Ali; Pal, Piya

    2016-05-01

    Mapping the topology of a complex network in a resource-efficient manner is a challenging problem with applications in internet mapping, social network inference, and so forth. We propose a new entropy-driven algorithm leveraging ideas from matrix completion, to map the network using monitors (or sensors) which, when placed on judiciously selected nodes, are capable of discovering their immediate neighbors. The main challenge is to maximize the portion of the network discovered using only a limited number of available monitors. To this end, (i) a new measure of entropy or uncertainty is associated with each node, in terms of the currently discovered edges incident on that node, and (ii) a greedy algorithm is developed to select a candidate node for monitor placement based on its entropy. Utilizing the fact that many complex networks of interest (such as social networks) have a low-rank adjacency matrix, a matrix completion algorithm, namely 1-bit matrix completion, is combined with the greedy algorithm to further boost its performance. The low-rank property of the network adjacency matrix can be used to extrapolate a portion of the missing edges and consequently update the node entropies, so as to efficiently guide the network discovery algorithm towards placing monitors on the nodes that are likely to be more informative. Simulations performed on a variety of real-world networks such as social networks and peer networks demonstrate the superior performance of the matrix-completion-guided approach in discovering the network topology.
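
    The greedy, entropy-guided part of the algorithm can be sketched as follows, omitting the 1-bit matrix completion step and counting each unresolved potential edge at a node as one bit of uncertainty (a simplification; names are ours):

```python
def discover(adj, budget):
    """Greedy entropy-guided monitor placement sketch. A monitor on a
    node reveals presence/absence of every edge incident on that node;
    at each step we monitor the unmonitored node with the most
    unresolved incident entries (highest uncertainty)."""
    n = len(adj)
    known = [[i == j for j in range(n)] for i in range(n)]  # resolved entries
    monitored = []
    for _ in range(budget):
        entropy = {i: sum(1 for j in range(n) if not known[i][j])
                   for i in range(n) if i not in monitored}
        node = max(entropy, key=entropy.get)
        monitored.append(node)
        for j in range(n):  # monitor resolves the node's whole row/column
            known[node][j] = known[j][node] = True
    discovered = {(i, j) for i in range(n) for j in range(i + 1, n)
                  if known[i][j] and adj[i][j]}
    return monitored, discovered
```

    In the full algorithm, edges extrapolated by matrix completion would additionally lower the entropy of the affected nodes before each greedy selection.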

  2. A Novel Approach on Designing Augmented Fuzzy Cognitive Maps Using Fuzzified Decision Trees

    NASA Astrophysics Data System (ADS)

    Papageorgiou, Elpiniki I.

    This paper proposes a new methodology for designing Fuzzy Cognitive Maps using crisp decision trees that have been fuzzified. A fuzzy cognitive map is a knowledge-based technique that works as an artificial cognitive network, inheriting the main aspects of cognitive maps and artificial neural networks. Decision trees, on the other hand, are well-known intelligent techniques that extract rules from both symbolic and numeric data. Fuzzy theoretical techniques are used to fuzzify crisp decision trees in order to soften decision boundaries at the decision nodes inherent in this type of tree. Comparisons between crisp decision trees and fuzzified decision trees suggest that the latter are significantly more robust and produce more balanced decision making. The approach proposed in this paper can incorporate any type of fuzzy decision tree. Through this methodology, new linguistic weights are determined for the FCM model, thus producing an augmented FCM tool. The framework consists of a new fuzzy algorithm that generates linguistic weights describing the cause-effect relationships among the concepts of the FCM model, from induced fuzzy decision trees.
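
    For context, a fuzzy cognitive map performs inference by repeatedly squashing weighted sums of concept activations. A minimal sketch of one common sigmoid formulation (not the paper's specific augmented-FCM algorithm; names are ours):

```python
import numpy as np

def fcm_step(state, weights, steepness=1.0):
    """One inference step of a fuzzy cognitive map: each concept's new
    activation is the sigmoid-squashed weighted sum of all concepts'
    activations; weights[i, j] is the influence of concept i on j."""
    return 1.0 / (1.0 + np.exp(-steepness * (state @ weights)))

def fcm_run(state, weights, steps=25):
    # Iterate toward a fixed point; real uses would test for convergence.
    for _ in range(steps):
        state = fcm_step(state, weights)
    return state
```

    The linguistic weights extracted from the fuzzified decision trees would populate the `weights` matrix in such a model.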

  3. A new approach to the statistical treatment of 2D-maps in proteomics using fuzzy logic.

    PubMed

    Marengo, Emilio; Robotti, Elisa; Gianotti, Valentina; Righetti, Pier Giorgio

    2003-01-01

    A new approach to the statistical treatment of 2D-maps has been developed. This method is based on the use of fuzzy logic and allows one to take into consideration the typically low reproducibility of 2D-maps. In this approach, the signal corresponding to the presence of proteins on the 2D-maps is substituted with probability functions centred on the signal itself. The standard deviation of the bidimensional gaussian probability function employed to blur the signal allows one to assign different uncertainties to the two electrophoretic dimensions. The effects of changing the standard deviation and the digitalisation resolution are investigated. PMID:12650579
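
    The blurring step described above can be sketched by replacing each spot with a bivariate Gaussian whose two standard deviations encode the different uncertainties of the two electrophoretic dimensions. An illustrative implementation (direct convolution, adequate for sparse spot maps; names are ours):

```python
import numpy as np

def fuzzify(gel, sigma_row, sigma_col):
    """Replace each spot on a 2D-map (nonzero grid entries) with a
    normalized bivariate Gaussian probability surface, using separate
    standard deviations for the two electrophoretic dimensions."""
    rows, cols = gel.shape
    r = np.arange(rows)[:, None]
    c = np.arange(cols)[None, :]
    out = np.zeros_like(gel, dtype=float)
    for (i, j) in zip(*np.nonzero(gel)):
        g = np.exp(-((r - i) ** 2) / (2 * sigma_row ** 2)
                   - ((c - j) ** 2) / (2 * sigma_col ** 2))
        out += gel[i, j] * g / g.sum()  # spot intensity spread as probability
    return out
```

    Choosing, say, a larger sigma along the first dimension expresses the higher positional uncertainty of that separation axis.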

  4. An Integrated Spin-Labeling/Computational-Modeling Approach for Mapping Global Structures of Nucleic Acids

    PubMed Central

    Tangprasertchai, Narin S.; Zhang, Xiaojun; Ding, Yuan; Tham, Kenneth; Rohs, Remo; Haworth, Ian S.; Qin, Peter Z.

    2015-01-01

    The technique of site-directed spin labeling (SDSL) provides unique information on biomolecules by monitoring the behavior of a stable radical tag (i.e., spin label) using electron paramagnetic resonance (EPR) spectroscopy. In this chapter, we describe an approach in which SDSL is integrated with computational modeling to map conformations of nucleic acids. This approach builds upon a SDSL tool kit previously developed and validated, which includes three components: (i) a nucleotide-independent nitroxide probe, designated as R5, which can be efficiently attached at defined sites within arbitrary nucleic acid sequences; (ii) inter-R5 distances in the nanometer range, measured via pulsed EPR; and (iii) an efficient program, called NASNOX, that computes inter-R5 distances on given nucleic acid structures. Following a general framework of data mining, our approach uses multiple sets of measured inter-R5 distances to retrieve “correct” all-atom models from a large ensemble of models. The pool of models can be generated independently without relying on the inter-R5 distances, thus allowing a large degree of flexibility in integrating the SDSL-measured distances with a modeling approach best suited for the specific system under investigation. As such, the integrative experimental/computational approach described here represents a hybrid method for determining all-atom models based on experimentally-derived distance measurements. PMID:26477260

  6. Flight investigation of helicopter IFR approaches to oil rigs using airborne weather and mapping radar

    NASA Technical Reports Server (NTRS)

    Bull, J. S.; Hegarty, D. M.; Phillips, J. D.; Sturgeon, W. R.; Hunting, A. W.; Pate, D. P.

    1979-01-01

Airborne weather and mapping radar is a near-term, economical method of providing 'self-contained' navigation information for approaches to offshore oil rigs, and its use has been expanding rapidly in recent years. A joint NASA/FAA flight test investigation of helicopter IFR approaches to offshore oil rigs in the Gulf of Mexico was initiated in June 1978 and conducted under contract to Air Logistics. Approximately 120 approaches were flown in a Bell 212 helicopter by 15 operational pilots during August and September 1978. The purpose of the tests was to collect data to (1) support development of advanced radar flight director concepts by NASA and (2) aid the establishment of Terminal Instrument Procedures (TERPS) criteria by the FAA. The flight test objectives were to develop airborne radar approach procedures, measure tracking errors, determine acceptable weather minimums, and determine pilot acceptability. The data obtained will contribute significantly to improved helicopter airborne radar approach capability and to the support of exploration, development, and utilization of the Nation's offshore oil supplies.

  7. Comparing Point Count System and physically-based approaches to map aquifer vulnerability

    NASA Astrophysics Data System (ADS)

    Lagomarsino, D.; Martina, M. L. V.; Todini, E.

    2009-04-01

Pollution vulnerability maps of aquifers are an important instrument for land and water management. These maps are generally based on simplified Point Count System Models (PCSM), such as DRASTIC or SINTACS, without the use of physically based groundwater models, which may provide more accurate results. The present research aims at finding a trade-off between the accuracy provided by a physically-based model, which inevitably involves higher computational complexity and data requirements, and the coarser, albeit simpler and easier to implement, approach provided by an indicator-based model such as one of the most widely used PCSMs, the DRASTIC model (Aller et al., 1987). The alluvial aquifer of the conoid of the Reno River, extending from the piedmont hills of the Apennines chain towards the Po plain, is one of the main sources of drinking water for the city of Bologna. The parameters considered by DRASTIC (Depth of water, net Recharge, Aquifer media, Soil media, Topography, Impact of vadose zone and hydraulic Conductivity) represent the main hydrogeological and environmental parameters that influence pollutant transport from the surface towards the groundwater. The real flow of the Reno aquifer was then simulated by means of a finite element model (FEFLOW) that takes into account the physical processes of water movement and the associated transport of contaminants in the environment. The results obtained by the model have been compared with the DRASTIC vulnerability map. A preliminary analysis of the vulnerability map, based on chemical analyses of water, reveals that the concentration of nitrates is indeed higher in those zones where higher vulnerability values were found.
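The point-count model at the heart of this comparison reduces to a weighted sum: each of the seven DRASTIC parameters receives a rating (1-10) per map cell, and the vulnerability index is the sum of rating times weight. The weights below are the standard DRASTIC weights from Aller et al. (1987); the example ratings are illustrative.

```python
# DRASTIC vulnerability index: a weighted sum of seven parameter ratings.
# Weights are the standard DRASTIC weights (Aller et al., 1987);
# the ratings for the example cell are made up for illustration.

DRASTIC_WEIGHTS = {
    "D": 5,  # Depth to water
    "R": 4,  # net Recharge
    "A": 3,  # Aquifer media
    "S": 2,  # Soil media
    "T": 1,  # Topography
    "I": 5,  # Impact of vadose zone
    "C": 3,  # hydraulic Conductivity
}

def drastic_index(ratings):
    """ratings: dict parameter letter -> rating (1..10) for one map cell."""
    return sum(DRASTIC_WEIGHTS[p] * r for p, r in ratings.items())

cell = {"D": 7, "R": 6, "A": 8, "S": 5, "T": 9, "I": 6, "C": 4}
print(drastic_index(cell))  # -> 144
```

Computing this index for every cell of a raster yields the vulnerability map that the physically-based FEFLOW simulation is compared against.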

  8. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping.

    PubMed

    Wallace, Chris; Cutler, Antony J; Pontikos, Nikolas; Pekalski, Marcin L; Burren, Oliver S; Cooper, Jason D; García, Arcadio Rubio; Ferreira, Ricardo C; Guo, Hui; Walker, Neil M; Smyth, Deborah J; Rich, Stephen S; Onengut-Gumuscu, Suna; Sawcer, Stephen J; Ban, Maria; Richardson, Sylvia; Todd, John A; Wicker, Linda S

    2015-06-01

Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell-specific chromatin annotation data to highlight disease causal mechanisms and cells, and for design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D-associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r² ≃ 0.3) and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data.

  9. The field line map approach for simulations of magnetically confined plasmas

    NASA Astrophysics Data System (ADS)

    Stegmeir, Andreas; Coster, David; Maj, Omar; Hallatschek, Klaus; Lackner, Karl

    2016-01-01

Prediction of plasma parameters in the edge and scrape-off layer of tokamaks is difficult since most modern tokamaks have a divertor, and the associated separatrix causes the usually employed field/flux-aligned coordinates to become singular on the separatrix/X-point. The presented field line map approach avoids such problems as it is based on a cylindrical grid: standard finite-difference methods can be used for the discretisation of perpendicular (w.r.t. magnetic field) operators, and the characteristic flute mode property (k∥ ≪ k⊥) of structures is exploited computationally via a field-line-following discretisation of parallel operators, which leads to grid sparsification in the toroidal direction. This paper is devoted to the discretisation of the parallel diffusion operator (the approach taken is very similar to the flux-coordinate independent (FCI) approach, which has already been applied to a hyperbolic problem (Ottaviani, 2011; Hariri, 2013)). Based on the support operator method, schemes are derived which maintain the self-adjointness property of the parallel diffusion operator on the discrete level. These methods have very low numerical perpendicular diffusion compared to a naive discretisation, which is a critical issue since magnetically confined plasmas exhibit a very strong anisotropy. Two different versions of the discrete parallel diffusion operator are derived: the first is based on interpolation, where the order of interpolation and therefore the numerical diffusion is adjustable; the second is based on integration and is advantageous in cases where the field line map is strongly distorted. The schemes are implemented in the new code GRILLIX, and extensive benchmarks and numerous examples are presented which show the validity of the approach in general and GRILLIX in particular.

  10. Non-invasive computation of aortic pressure maps: a phantom-based study of two approaches

    NASA Astrophysics Data System (ADS)

    Delles, Michael; Schalck, Sebastian; Chassein, Yves; Müller, Tobias; Rengier, Fabian; Speidel, Stefanie; von Tengg-Kobligk, Hendrik; Kauczor, Hans-Ulrich; Dillmann, Rüdiger; Unterhinninghofen, Roland

    2014-03-01

Patient-specific blood pressure values in the human aorta are an important parameter in the management of cardiovascular diseases. A direct measurement of these values is only possible by invasive catheterization at a limited number of measurement sites. To overcome these drawbacks, two non-invasive approaches of computing patient-specific relative aortic blood pressure maps throughout the entire aortic vessel volume are investigated by our group. The first approach uses computations from complete time-resolved, three-dimensional flow velocity fields acquired by phase-contrast magnetic resonance imaging (PC-MRI), whereas the second approach relies on computational fluid dynamics (CFD) simulations with ultrasound-based boundary conditions. A detailed evaluation of these computational methods under realistic conditions is necessary in order to investigate their overall robustness and accuracy as well as their sensitivity to certain algorithmic parameters. We present a comparative study of the two blood pressure computation methods in an experimental phantom setup, which mimics a simplified thoracic aorta. The comparative analysis includes the investigation of the impact of algorithmic parameters on the MRI-based blood pressure computation and the impact of extracting pressure maps in a voxel grid from the CFD simulations. Overall, a very good agreement between the results of the two computational approaches can be observed despite the fact that both methods used completely separate measurements as input data. Therefore, the comparative study of the presented work indicates that both non-invasive pressure computation methods show an excellent robustness and accuracy and can therefore be used for research purposes in the management of cardiovascular diseases.
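The MRI-based idea of deriving relative pressure from measured velocity can be reduced to a one-dimensional sketch: along a centreline, the steady, inviscid momentum balance gives dp/dx = -ρ v dv/dx, which can be integrated sample by sample. This is a drastic simplification of the actual method (which operates on full time-resolved 3-D PC-MRI fields and includes further terms); the density value and the velocity profile below are illustrative assumptions.

```python
# Minimal 1-D sketch: recover a *relative* pressure profile along a vessel
# centreline from velocity samples by integrating dp = -rho * v * dv
# (steady, inviscid assumption; the grid spacing cancels in this form).
# Density and the toy velocity profile are illustrative, not patient data.

RHO = 1060.0  # assumed blood density, kg/m^3

def relative_pressure(v):
    """v: velocity samples (m/s) along the centreline.
    Returns p - p[0] in Pa at each sample."""
    p = [0.0]
    for i in range(1, len(v)):
        dv = v[i] - v[i - 1]
        v_mid = 0.5 * (v[i] + v[i - 1])  # midpoint rule = exact Bernoulli step
        p.append(p[-1] - RHO * v_mid * dv)
    return p

# A mild narrowing: velocity rises, relative pressure drops, then recovers.
v = [1.0, 1.2, 1.5, 1.2]
print([round(pi, 1) for pi in relative_pressure(v)])  # ≈ [0.0, -233.2, -662.5, -233.2]
```

The midpoint update reproduces Bernoulli's ρ(v² - v₀²)/2 exactly, so the toy pressure recovers fully when the velocity returns to its inlet value.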

  11. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps.

    PubMed

    Santos, Diego G; Fernandes, Bruno J T; Bezerra, Byron L D

    2015-01-01

    The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision with practical applications, such as medical, games and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition still is an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect Sensor (Microsoft, Redmond, WA, USA) using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and Hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model overcomes other algorithms presented in the literature in hand gesture recognition tasks, achieving a classification rate of 97.49% in the MSRGesture3D dataset and 98.43% in the RPPDI dynamic gesture dataset. PMID:26569262
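Of the two classifiers in the HAGR-D hybrid, the dynamic-time-warping stage is compact enough to sketch. The following is the generic textbook DTW distance between two 1-D sequences, not the paper's exact implementation (the CIPBR feature extraction and the HMM stage are omitted).

```python
# Classic dynamic time warping: the minimal cumulative alignment cost
# between two sequences, allowing local stretching and compression.
# Generic textbook DTW, shown to illustrate the DTW half of HAGR-D.

def dtw_distance(a, b):
    """DTW alignment cost between two 1-D sequences a and b."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip in a
                                 cost[i][j - 1],      # skip in b
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Same gesture trajectory, one copy stretched in time: cost is zero.
print(dtw_distance([0, 1, 2, 1], [0, 1, 1, 2, 1]))  # -> 0.0
```

Because DTW tolerates differences in execution speed, it suits gesture trajectories whose temporal lengths vary between performers.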

  12. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    NASA Astrophysics Data System (ADS)

    Dhindsa, Harkirat S.; Makarimi-Kasim; Roger Anderson, O.

    2011-04-01

This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample of the study consisted of six classes (140 Form 3 students aged 13-15 years) selected from a typical coeducational school in Brunei. Three classes (40 boys and 30 girls) were taught using the TTA while three other classes (41 boys and 29 girls) used the CMA, enriched with PowerPoint presentations. After the interventions (lessons on magnetism), the students in both groups were asked to describe in writing their understanding of magnetism accrued from the lessons. Their written descriptions were analyzed using flow map analyses to assess their content knowledge and its organisation in memory as evidence of cognitive structure. The extent of CLE was measured using a published CLE survey. The results showed that the cognitive structures of the CMA students were more extensive, thematically organised and richer in interconnectedness of thoughts than those of TTA students. CMA students also perceived their classroom learning environment to be more constructivist than their counterparts did. It is, therefore, recommended that teachers consider using the CMA teaching technique to help students enrich their understanding, especially for more complex or abstract scientific content.

  13. HAGR-D: A Novel Approach for Gesture Recognition with Depth Maps

    PubMed Central

    Santos, Diego G.; Fernandes, Bruno J. T.; Bezerra, Byron L. D.

    2015-01-01

    The hand is an important part of the body used to express information through gestures, and its movements can be used in dynamic gesture recognition systems based on computer vision with practical applications, such as medical, games and sign language. Although depth sensors have led to great progress in gesture recognition, hand gesture recognition still is an open problem because of its complexity, which is due to the large number of small articulations in a hand. This paper proposes a novel approach for hand gesture recognition with depth maps generated by the Microsoft Kinect Sensor (Microsoft, Redmond, WA, USA) using a variation of the CIPBR (convex invariant position based on RANSAC) algorithm and a hybrid classifier composed of dynamic time warping (DTW) and Hidden Markov models (HMM), called the hybrid approach for gesture recognition with depth maps (HAGR-D). The experiments show that the proposed model overcomes other algorithms presented in the literature in hand gesture recognition tasks, achieving a classification rate of 97.49% in the MSRGesture3D dataset and 98.43% in the RPPDI dynamic gesture dataset. PMID:26569262

  14. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strength and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate a more realistic picture of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters, and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. PMID:25890086

  15. Coriolis effects on rotating Hele-Shaw flows: a conformal-mapping approach.

    PubMed

    Miranda, José A; Gadêlha, Hermes; Dorsey, Alan T

    2010-12-01

    The zero surface tension fluid-fluid interface dynamics in a radial Hele-Shaw cell driven by both injection and rotation is studied by a conformal-mapping approach. The situation in which one of the fluids is inviscid and has negligible density is analyzed. When Coriolis force effects are ignored, exact solutions of the zero surface tension rotating Hele-Shaw problem with injection reveal suppression of cusp singularities for sufficiently high rotation rates. We study how the Coriolis force affects the time-dependent solutions of the problem, and the development of finite time singularities. By employing Richardson's harmonic moments approach we obtain conformal maps which describe the time evolution of the fluid boundary. Our results demonstrate that the inertial Coriolis contribution plays an important role in determining the time for cusp formation. Moreover, it introduces a phase drift that makes the evolving patterns rotate. The Coriolis force acts against centrifugal effects, promoting (inhibiting) cusp breakdown if the more viscous and dense fluid lies outside (inside) the interface. Despite the presence of Coriolis effects, the occurrence of finger bending events has not been detected in the exact solutions.

  16. Mapping CO2 emission in highly urbanized region using standardized microbial respiration approach

    NASA Astrophysics Data System (ADS)

    Vasenev, V. I.; Stoorvogel, J. J.; Ananyeva, N. D.

    2012-12-01

Urbanization is a major recent land-use change pathway. Land conversion to urban use has a tremendous and still unclear effect on soil cover and functions. Urban soil can act as a carbon source, although its potential for CO2 emission is also very high. The main challenge in analyzing and mapping soil organic carbon (SOC) in the urban environment is its high spatial heterogeneity and temporal dynamics. The urban environment provides a number of specific features and processes that influence soil formation and functioning, resulting in a unique spatial variability of carbon stocks and fluxes over short distances. Soil sealing, functional zoning, and settlement age and size are the predominant factors distinguishing the heterogeneity of urban soil carbon. The combination of these factors creates a great number of contrasting clusters with abrupt borders, which is very difficult to account for in regional assessment and mapping of SOC stocks and soil CO2 emission. Most of the existing approaches to measuring CO2 emission in field conditions (eddy covariance, soil chambers) are very sensitive to soil moisture and temperature conditions. They require a long-term sampling campaign during the season in order to obtain relevant results, which makes them inapplicable for the analysis of CO2 emission spatial variability at the regional scale. Soil respiration (SR) measurement under standardized lab conditions makes it possible to overcome this difficulty. SR is the predominant outgoing carbon flux, including autotrophic respiration of plant roots and heterotrophic respiration of soil microorganisms. Microbiota are responsible for 50-80% of total soil carbon outflow. The microbial respiration (MR) approach provides an integral CO2 emission result, characterizing microbial CO2 production under optimal conditions and thus independent of initial differences in soil temperature and moisture. The current study aimed to combine digital soil mapping (DSM) techniques with standardized microbial respiration approach in order to analyse and

  17. Repelling Point Bosons

    NASA Astrophysics Data System (ADS)

    McGuire, J. B.

    2011-12-01

    There is a body of conventional wisdom that holds that a solvable quantum problem, by virtue of its solvability, is pathological and thus irrelevant. It has been difficult to refute this view owing to the paucity of theoretical constructs and experimental results. Recent experiments involving equivalent ions trapped in a spatial conformation of extreme anisotropic confinement (longitudinal extension tens, hundreds or even thousands of times transverse extension) have modified the view of relevancy, and it is now possible to consider systems previously thought pathological, in particular point Bosons that repel in one dimension. It has been difficult for the experimentalists to utilize existing theory, mainly due to long-standing theoretical misunderstanding of the relevance of the permutation group, in particular the non-commutativity of translations (periodicity) and transpositions (permutation). This misunderstanding is most easily rectified in the case of repelling Bosons.

  18. Higgs Boson Properties

    NASA Astrophysics Data System (ADS)

David, André; Dührssen, Michael

    2016-10-01

    This chapter presents an overview of the measured properties of the Higgs boson discovered in 2012 by the ATLAS and CMS collaborations at the CERN LHC. Searches for deviations from the properties predicted by the standard theory are also summarised. The present status corresponds to the combined analysis of the full Run 1 data sets of collisions collected at centre-of-mass energies of 7 and 8 TeV.

  19. An integrated approach for updating cadastral maps in Pakistan using satellite remote sensing data

    NASA Astrophysics Data System (ADS)

    Ali, Zahir; Tuladhar, Arbind; Zevenbergen, Jaap

    2012-08-01

Updating cadastral information is crucial for recording land ownership and property division changes in a timely fashion. In most cases, the existing cadastral maps do not provide up-to-date information on land parcel boundaries. Such a situation demands that all the cadastral data and parcel boundary information in these maps be updated promptly. The existing techniques for acquiring cadastral information are oriented toward individual disciplines such as geodesy, surveying, and photogrammetry. All these techniques require substantial manpower, time, and cost when they are carried out separately. There is a need to integrate these techniques for acquiring cadastral information to update the existing cadastral data and (re)produce cadastral maps in an efficient manner. To reduce the time and cost involved in cadastral data acquisition, this study develops an integrated approach combining global positioning system (GPS) data, remote sensing (RS) imagery, and existing cadastral maps. For this purpose, the panchromatic image with 0.6 m spatial resolution and the corresponding multi-spectral image with 2.4 m spatial resolution and 3 spectral bands from the QuickBird satellite were used. A digital elevation model (DEM) was extracted from SPOT-5 stereopairs, and some ground control points (GCPs) were also used for ortho-rectifying the QuickBird images. After ortho-rectifying these images and registering the multi-spectral image to the panchromatic image, the two were fused to obtain good-quality multi-spectral images of the two study areas with 0.6 m spatial resolution. Cadastral parcel boundaries were then identified on the QuickBird images of the two study areas via visual interpretation using the participatory-GIS (PGIS) technique. The regions of study are the urban and rural areas of Peshawar and Swabi districts in the Khyber Pakhtunkhwa province of Pakistan. The results are the creation of updated cadastral maps with a

  20. A stochastic approach to estimate the uncertainty of dose mapping caused by uncertainties in b-spline registration

    SciTech Connect

    Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P.

    2012-04-15

    Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated based on patient data that was deformed based on a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in context of other mapping tasks as well.
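To first order, the risk the authors quantify scales with the product of the local dose gradient and the local registration uncertainty: in flat dose regions even a large mapping error barely changes the accumulated dose, while in steep penumbra regions a small error matters. The sketch below illustrates that product on a 1-D profile; the actual method derives the uncertainty from b-spline coefficient and metric sensitivities rather than assuming a per-point value, and all numbers here are illustrative.

```python
# First-order sketch: dose-accumulation error ~ |dose gradient| * mapping
# uncertainty. The 1-D dose profile and the assumed 1 mm per-point
# registration uncertainty are illustrative, not the paper's data.

def dose_error_estimate(dose, spacing, sigma):
    """dose: 1-D dose samples (Gy), spacing: grid spacing (mm),
    sigma: per-point registration uncertainty (mm).
    Returns |dD/dx| * sigma per point (central differences)."""
    n = len(dose)
    err = []
    for i in range(n):
        lo, hi = max(i - 1, 0), min(i + 1, n - 1)
        grad = abs(dose[hi] - dose[lo]) / ((hi - lo) * spacing)
        err.append(grad * sigma[i])
    return err

dose = [60.0, 60.0, 55.0, 30.0, 5.0, 2.0]   # steep penumbra in the middle
sigma = [1.0] * 6                           # assumed 1 mm mapping uncertainty
print(dose_error_estimate(dose, 2.0, sigma))  # -> [0.0, 1.25, 7.5, 12.5, 7.0, 1.5]
```

The large values in the middle flag exactly the kind of region where the paper's approach warns that dose accumulation may be inaccurate.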

  1. Accurate multi-source forest species mapping using the multiple spectral-spatial classification approach

    NASA Astrophysics Data System (ADS)

    Stavrakoudis, Dimitris; Gitas, Ioannis; Karydas, Christos; Kolokoussis, Polychronis; Karathanassi, Vassilia

    2015-10-01

This paper proposes an efficient methodology for combining multiple remotely sensed imagery, in order to increase the classification accuracy in complex forest species mapping tasks. The proposed scheme follows a decision fusion approach, whereby each image is first classified separately by means of a pixel-wise Fuzzy-Output Support Vector Machine (FO-SVM) classifier. Subsequently, the multiple results are fused according to the so-called multiple spectral-spatial classifier using the minimum spanning forest (MSSC-MSF) approach, which constitutes an effective post-regularization procedure for enhancing the result of a single pixel-based classification. For this purpose, the original MSSC-MSF has been extended in order to handle multiple classifications. In particular, the fuzzy outputs of the pixel-based classifiers are stacked and used to grow the MSF, whereas the markers are also determined considering both classifications. The proposed methodology has been tested on a challenging forest species mapping task in northern Greece, considering a multispectral (GeoEye) and a hyper-spectral (CASI) image. The pixel-wise classifications resulted in overall accuracies (OA) of 68.71% for the GeoEye and 77.95% for the CASI images, respectively. Both of them are characterized by high levels of speckle noise. Applying the proposed multi-source MSSC-MSF fusion, the OA climbs to 90.86%, which is attributed both to the ability of MSSC-MSF to tackle the salt-and-pepper effect, as well as the fact that the fusion approach exploits the relative advantages of both information sources.
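The first step of the fusion, stacking the per-pixel fuzzy (soft) class memberships from each source's classifier, can be sketched simply. The real method feeds the stacked memberships into a minimum-spanning-forest regularisation; this sketch stops at averaging the memberships and taking the per-pixel arg-max, which already shows how the two sources are merged. Data and function names are illustrative.

```python
# Sketch of decision fusion over fuzzy classifier outputs: average the
# soft class memberships from two sources, then label each pixel by
# arg-max. (The paper's MSSC-MSF regularisation step is omitted.)

def fuse(memberships_a, memberships_b):
    """Each input: list of per-pixel membership vectors (one value per
    class). Returns the fused class label (index) per pixel."""
    labels = []
    for ma, mb in zip(memberships_a, memberships_b):
        avg = [(x + y) / 2.0 for x, y in zip(ma, mb)]
        labels.append(max(range(len(avg)), key=avg.__getitem__))
    return labels

# Two pixels, three classes: the sources disagree on pixel 0, and the
# more confident CASI-style output wins after averaging.
geoeye = [[0.6, 0.3, 0.1], [0.2, 0.2, 0.6]]
casi   = [[0.1, 0.8, 0.1], [0.1, 0.3, 0.6]]
print(fuse(geoeye, casi))  # -> [1, 2]
```

Averaging soft outputs, rather than hard labels, is what lets the fusion exploit each sensor's confidence instead of merely voting.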

  2. Mapping of Protein–Protein Interaction Sites by the ‘Absence of Interference’ Approach

    PubMed Central

    Dhayalan, Arunkumar; Jurkowski, Tomasz P.; Laser, Heike; Reinhardt, Richard; Jia, Da; Cheng, Xiaodong; Jeltsch, Albert

    2008-01-01

Protein–protein interactions are critical to most biological processes, and locating protein–protein interfaces on protein structures is an important task in molecular biology. We developed a new experimental strategy called the ‘absence of interference’ approach to determine surface residues involved in protein–protein interaction of established yeast two-hybrid pairs of interacting proteins. One of the proteins is subjected to high-level randomization by error-prone PCR. The resulting library is selected by yeast two-hybrid system for interacting clones that are isolated and sequenced. The interaction region can be identified by an absence or depletion of mutations. For data analysis and presentation, we developed a Web interface that analyzes the mutational spectrum and displays the mutational frequency on the surface of the structure (or a structural model) of the randomized protein. Additionally, this interface might be of use for the display of mutational distributions determined by other types of random mutagenesis experiments. We applied the approach to map the interface of the catalytic domain of the DNA methyltransferase Dnmt3a with its regulatory factor Dnmt3L. Dnmt3a was randomized with high mutational load. A total of 76 interacting clones were isolated and sequenced, and 648 mutations were identified. The mutational pattern allowed us to identify a unique interaction region on the surface of Dnmt3a, which comprises about 500-600 Å². The results were confirmed by site-directed mutagenesis and structural analysis. The absence-of-interference approach will allow high-throughput mapping of protein interaction sites suitable for functional studies and protein docking. PMID:18191145
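The analysis step behind the approach is a counting exercise: pool the mutated positions from all selected interacting clones, then flag residues that carry far fewer mutations than expected, since mutations there would have broken the interaction and removed the clone from the selection. The sketch below uses a simple depletion threshold relative to the mean; the threshold, function name, and toy data are illustrative, not the published Web interface's algorithm.

```python
# Sketch of the 'absence of interference' readout: tally mutations per
# residue across selected interacting clones and flag positions depleted
# of mutations, the signature of an interface residue. The depletion
# threshold (fraction of the mean count) is an illustrative assumption.

def depleted_positions(mutations, length, factor=0.25):
    """mutations: mutated positions pooled over all selected clones.
    Flags positions whose count falls below factor * mean count."""
    counts = [0] * length
    for pos in mutations:
        counts[pos] += 1
    mean = sum(counts) / float(length)
    return [i for i, c in enumerate(counts) if c < factor * mean]

# Ten-residue toy protein: positions 3 and 4 are never hit by the
# mutagenesis among surviving clones -> candidate interface patch.
muts = [0, 0, 1, 2, 2, 2, 5, 5, 6, 7, 7, 8, 9, 9, 9, 1]
print(depleted_positions(muts, 10))  # -> [3, 4]
```

In practice the counts would be normalized for per-position mutability of the error-prone PCR before applying any threshold.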

  3. Chiral anomaly, bosonization, and fractional charge

    SciTech Connect

    Mignaco, J.A.; Monteiro, M.A.R.

    1985-06-15

We present a method to evaluate the Jacobian of chiral rotations, regulating determinants through the proper-time method and using Seeley's asymptotic expansion. With this method we compute easily the chiral anomaly for ν = 4, 6 dimensions, discuss bosonization of some massless two-dimensional models, and handle the problem of charge fractionization. In addition, we comment on the general validity of Fujikawa's approach to regulate the Jacobian of chiral rotations with non-Hermitian operators.

  4. The Contribution of GIS in Flood Mapping: Two Approaches Using Open Source Grass GIS Software

    NASA Astrophysics Data System (ADS)

    Marzocchi, R.; Federici, B.; Cannata, M.; Cosso, T.; Syriou, A.

    2013-01-01

The first step of a risk assessment analysis is the evaluation of flood-prone areas. It is important both for managing and planning emergency activities, such as hydraulic risk reduction, and for town planning. Nowadays, using GIS technology for risk assessment analysis is very common; however, it is not widely used for delineating inundated areas. LiDAR-derived data, such as high-resolution Digital Elevation Models (DEMs), make GIS numerical models attractive methods for obtaining flooded areas automatically. Using GIS tools is beneficial for effective processing and accuracy assessment in comparison to traditional methods based on topographic maps and field surveys. A first approach (Federici and Sguerso, 2007; Marzocchi et al., 2009) is the use of a GIS module in order to create perifluvial flood maps, having as prerequisites (i) the conformation of the river floodplain given by a high-resolution DEM and (ii) a water surface profile along the river axis calculated for a given water discharge through a generic one-dimensional (1D) hydraulic model (HEC-RAS, Basement, MIKE 11, etc.). A second approach is the use of a 2D model embedded in GIS in order to map flooded areas due to a dam break (Cannata & Marzocchi, 2012). This module solves the conservative form of the 2D Shallow Water Equations (SWE) using a Finite Volume Method (FVM). The intercell flux is computed by a one-sided upwind conservative scheme extended to the 2D problem (Ying et al., 2004). The newly developed GIS module outputs maximum-intensity maps which can be directly used during the risk assessment process. Both models have been implemented in the GRASS GIS software (GRASS, 2013) as two new commands (r.inund.fluv and r.damflood). Both are available on the official GRASS website and are distributed under the terms of the GNU General Public License (GPL). In this work we present a comparison between the two models mentioned above. We analyse the
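The core test in the first (perifluvial) approach can be sketched in a few lines: a DEM cell is flagged as flooded when its elevation lies below the water-surface elevation computed by the 1D hydraulic model for the corresponding cross-section. This is a bare-bones illustration; the actual r.inund.fluv module also checks hydraulic connectivity to the channel, which this sketch omits, and the grid and water levels below are made up.

```python
# Sketch of the perifluvial flood-map test: flooded where DEM elevation
# is below the 1-D model's water-surface profile. Connectivity to the
# channel (checked by the real GRASS module) is deliberately omitted.

def flood_mask(dem_rows, water_level):
    """dem_rows: 2-D list of elevations (m), one row per cross-section;
    water_level: water-surface elevation (m) per cross-section.
    Returns 1 where flooded, 0 where dry."""
    return [[1 if z < wl else 0 for z in row]
            for row, wl in zip(dem_rows, water_level)]

dem = [[12.0, 10.5, 9.8, 11.0],
       [11.5, 10.0, 9.5, 10.2]]
levels = [10.4, 10.1]  # water surface at the two cross-sections
print(flood_mask(dem, levels))  # -> [[0, 0, 1, 0], [0, 1, 1, 0]]
```

Interpolating the water-surface profile between cross-sections before the comparison is what turns this cell test into a continuous flood map.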

  5. A Novel Chemical Biology Approach for Mapping of Polymyxin Lipopeptide Antibody Binding Epitopes.

    PubMed

    Velkov, Tony; Yun, Bo; Schneider, Elena K; Azad, Mohammad A K; Dolezal, Olan; Morris, Faye C; Nation, Roger L; Wang, Jiping; Chen, Ke; Yu, Heidi H; Wang, Lv; Thompson, Philip E; Roberts, Kade D; Li, Jian

    2016-05-13

    Polymyxins B and E (i.e., colistin) are a family of naturally occurring lipopeptide antibiotics that are our last line of defense against multidrug resistant (MDR) Gram-negative pathogens. Unfortunately, nephrotoxicity is a dose-limiting factor for polymyxins that limits their clinical utility. Our recent studies demonstrate that polymyxin-induced nephrotoxicity is a result of their extensive accumulation in renal tubular cells. The design and development of safer, novel polymyxin lipopeptides is hampered by our limited understanding of their complex structure-nephrotoxicity relationships. This is the first study to employ a novel targeted chemical biology approach to map the polymyxin recognition epitope of a commercially available polymyxin mAb and demonstrate its utility for mapping the kidney distribution of a novel, less nephrotoxic polymyxin lipopeptide. Eighteen novel polymyxin lipopeptide analogues were synthesized with modifications in the polymyxin core domains, namely, the N-terminal fatty acyl region, tripeptide linear segment, and cyclic heptapeptide. Surface plasmon resonance epitope mapping revealed that the monoclonal antibody (mAb) recognition epitope consisted of the hydrophobic domain (N-terminal fatty acyl and position 6/7) and diaminobutyric acid (Dab) residues at positions 3, 5, 8, and 9 of the polymyxin molecule. Structural diversity within the hydrophobic domains and Dab 3 position is tolerated. Enlightened with an understanding of the structure-binding relationships between the polymyxin mAb and the core polymyxin scaffold, we can now rationally employ the mAb to probe the kidney distribution of novel polymyxin lipopeptides. This information will be vital in the design of novel, safer polymyxins through chemical tailoring of the core scaffold and exploration of the elusive/complex polymyxin structure-nephrotoxicity relationships. PMID:27627202

  7. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-1D satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
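    The weighted-linear-combination stage with Monte Carlo weight sampling can be sketched as follows: weights are drawn from per-criterion probability density functions, each draw produces one susceptibility surface, and the per-cell spread across runs serves as an uncertainty map. The normal PDFs, rasters, and weight values are illustrative assumptions, not the study's calibrated inputs.

```python
import numpy as np

def wlc_monte_carlo(criteria, w_mean, w_sd, n_runs=500, seed=0):
    """Weighted linear combination of standardised criterion rasters with
    weights drawn from per-criterion normal PDFs; returns the per-cell mean
    susceptibility and its standard deviation as an uncertainty surface."""
    rng = np.random.default_rng(seed)
    runs = np.empty((n_runs,) + criteria.shape[1:])
    for i in range(n_runs):
        w = np.clip(rng.normal(w_mean, w_sd), 0.0, None)
        w /= w.sum()                      # weights must sum to one
        runs[i] = np.tensordot(w, criteria, axes=1)
    return runs.mean(axis=0), runs.std(axis=0)

slope = np.array([[0.9, 0.2], [0.4, 0.7]])   # illustrative rasters in [0, 1]
lith  = np.array([[0.5, 0.1], [0.8, 0.3]])
mean_map, sd_map = wlc_monte_carlo(np.stack([slope, lith]),
                                   w_mean=np.array([0.6, 0.4]),
                                   w_sd=np.array([0.05, 0.05]))
```

With zero weight variance the mean map collapses to the deterministic WLC result, which makes the simulation easy to sanity-check.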

  8. Interacting boson models for N≈Z nuclei

    SciTech Connect

    Van Isacker, P.

    2011-05-06

    This contribution discusses the use of boson models in the description of N≈Z nuclei. A brief review is given of earlier attempts, initiated by Elliott and co-workers, to extend the interacting boson model of Arima and Iachello by the inclusion of neutron-proton s and d bosons with T = 1 (IBM-3) as well as T = 0 (IBM-4). It is argued that for the N≈Z nuclei that are currently studied experimentally, a different approach is needed which invokes aligned neutron-proton pairs with angular momentum J = 2j and isospin T = 0. This claim is supported by an analysis of shell-model wave functions in terms of pair states. Results of this alternative version of the interacting boson model are compared with shell-model calculations in the 1g9/2 shell.

  9. Interacting boson models for N~Z nuclei

    NASA Astrophysics Data System (ADS)

    Van Isacker, P.

    2011-05-01

    This contribution discusses the use of boson models in the description of N~Z nuclei. A brief review is given of earlier attempts, initiated by Elliott and co-workers, to extend the interacting boson model of Arima and Iachello by the inclusion of neutron-proton s and d bosons with T = 1 (IBM-3) as well as T = 0 (IBM-4). It is argued that for the N~Z nuclei that are currently studied experimentally, a different approach is needed which invokes aligned neutron-proton pairs with angular momentum J = 2j and isospin T = 0. This claim is supported by an analysis of shell-model wave functions in terms of pair states. Results of this alternative version of the interacting boson model are compared with shell-model calculations in the 1g9/2 shell.

  10. Object-based approach to national land cover mapping using HJ satellite imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Xiaosong; Yuan, Quanzhi; Liu, Yu

    2014-01-01

    To meet the needs of carbon storage estimation in ecosystems for a national carbon strategy, we introduce a consistent land cover database of China. The Chinese Huan Jing (HJ) satellite has proven efficient in the cloud-free acquisition of seasonal image series in a monsoon region and in vegetation identification for mesoscale land cover mapping. Thirty-eight classes of level II land cover are generated based on the Land Cover Classification System of the United Nations Food and Agriculture Organization, which follows a standard and quantitative definition. Twenty-four layers of derivative spectral, environmental, and spatial features compose the classification database. An object-based approach characterizing additional nonspectral features is applied throughout the mapping, and multiscale segmentation is used to match object boundaries to real-world conditions. This method fully exploits spatial information, in addition to spectral characteristics, to improve classification accuracy. A hierarchical classification algorithm is employed, following step-by-step procedures that effectively control classification quality. The algorithm is divided into dual structures of universal and local trees: consistent universal trees suitable for most regions are applied first, followed by local trees that depend on the specific features of nine climate stratifications. Independent validation indicates that the overall accuracy reaches 86%.
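    The dual universal/local tree idea can be sketched as a two-stage rule cascade: universal rules separate broad classes valid in most regions, then a local tree keyed by climate stratification refines them. All feature names, zones, and thresholds below are illustrative assumptions, not the study's calibrated values.

```python
def classify(obj):
    """Hierarchical classification of one image object: a universal tree
    first, then a local tree selected by climate zone."""
    # universal tree: broad classes valid in most regions
    if obj["ndvi"] < 0.1:
        return "water" if obj["ndwi"] > 0.0 else "bare land"
    # local tree: per-climate-zone refinements
    if obj["zone"] == "humid":
        return "forest" if obj["ndvi"] > 0.6 else "cropland"
    return "grassland" if obj["ndvi"] > 0.3 else "sparse vegetation"

samples = [
    {"ndvi": 0.05, "ndwi": 0.4, "zone": "humid"},
    {"ndvi": 0.7, "ndwi": -0.2, "zone": "humid"},
    {"ndvi": 0.4, "ndwi": -0.3, "zone": "arid"},
]
labels = [classify(s) for s in samples]
```

Keeping the universal rules in one place and the regional rules behind a zone key is what makes the step-by-step quality control of such a cascade tractable.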

  11. An efficient unsupervised index based approach for mapping urban vegetation from IKONOS imagery

    NASA Astrophysics Data System (ADS)

    Anchang, Julius Y.; Ananga, Erick O.; Pu, Ruiliang

    2016-08-01

    Despite the increased availability of high resolution satellite image data, their operational use for mapping urban land cover in Sub-Saharan Africa continues to be limited by lack of computational resources and technical expertise. As such, there is need for simple and efficient image classification techniques. Using Bamenda in North West Cameroon as a test case, we investigated two completely unsupervised pixel-based approaches to extract tree/shrub (TS) and ground vegetation (GV) cover from an IKONOS derived soil adjusted vegetation index. These included: (1) a simple Jenks Natural Breaks classification and (2) a two-step technique that combined the Jenks algorithm with agglomerative hierarchical clustering. Both techniques were compared with each other and with a non-linear support vector machine (SVM) for classification performance. While overall classification accuracy was generally high for all techniques (>90%), One-Way Analysis of Variance tests revealed the two-step technique to outperform the simple Jenks classification in terms of predicting the GV class. It also outperformed the SVM in predicting the TS class. We conclude that the unsupervised methods are technically as good and practically superior for efficient urban vegetation mapping in budget and technically constrained regions such as Sub-Saharan Africa.
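    Jenks Natural Breaks is optimal 1D classification: it places class boundaries so that the within-class sum of squared deviations is minimised. A from-scratch dynamic-programming sketch (the Fisher-Jenks formulation, O(k·n²); real rasters would first be subsampled):

```python
import numpy as np

def jenks_breaks(values, n_classes):
    """Fisher-Jenks optimal 1D classification by dynamic programming;
    returns the upper break value of each class."""
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    # prefix sums give each interval's sum of squared deviations in O(1)
    s1 = np.concatenate(([0.0], np.cumsum(x)))
    s2 = np.concatenate(([0.0], np.cumsum(x * x)))

    def ssd(i, j):                    # SSD of x[i..j], inclusive
        m = j - i + 1
        s = s1[j + 1] - s1[i]
        return (s2[j + 1] - s2[i]) - s * s / m

    INF = float("inf")
    cost = [[INF] * n for _ in range(n_classes)]
    cut = [[0] * n for _ in range(n_classes)]
    for j in range(n):
        cost[0][j] = ssd(0, j)
    for k in range(1, n_classes):
        for j in range(k, n):
            for i in range(k, j + 1):
                c = cost[k - 1][i - 1] + ssd(i, j)
                if c < cost[k][j]:
                    cost[k][j], cut[k][j] = c, i
    # walk the cut table backwards to recover the class boundaries
    breaks, j = [], n - 1
    for k in range(n_classes - 1, 0, -1):
        i = cut[k][j]
        breaks.append(x[i - 1])       # upper bound of the class below the cut
        j = i - 1
    return sorted(breaks) + [x[-1]]

breaks = jenks_breaks([1, 2, 3, 10, 11, 12], 2)
```

For a vegetation index this would be applied to the pixel values, with the resulting breaks used to threshold the raster into TS/GV/other classes.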

  12. An Approach for Predicting Essential Genes Using Multiple Homology Mapping and Machine Learning Algorithms

    PubMed Central

    Hua, Hong-Li; Zhang, Fa-Zhan; Labena, Abraham Alemayehu; Dong, Chuan; Jin, Yan-Ting

    2016-01-01

    Investigation of essential genes is significant for comprehending the minimal gene sets of cells and discovering potential drug targets. In this study, a novel approach based on multiple homology mapping and machine learning methods was introduced to predict essential genes. We focused on 25 bacteria which have characterized essential genes. The predictions yielded the highest area under receiver operating characteristic (ROC) curve (AUC) of 0.9716 through a tenfold cross-validation test. Proper features were utilized to construct models to make predictions in distantly related bacteria. The accuracy of predictions was evaluated via the consistency of predictions and known essential genes of target species. The highest AUC of 0.9552 and average AUC of 0.8314 were achieved when making predictions across organisms. An independent dataset from Synechococcus elongatus, which was released recently, was obtained for further assessment of the performance of our model. The AUC score of predictions is 0.7855, which is higher than other methods. This research shows that features obtained by homology mapping alone can achieve results as good as or even better than those from integrated features. Meanwhile, the work indicates that machine learning-based methods can assign more efficient weight coefficients than an empirical formula based on biological knowledge. PMID:27660763
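    The AUC scores reported above have a simple rank-based definition: the probability that a randomly chosen essential gene is scored above a randomly chosen non-essential one, counting ties as one half. A minimal sketch (the gene labels and scores are made up for illustration):

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U)
    statistic: P(score of a random positive > score of a random negative),
    with ties counted as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

essential = [1, 1, 0, 0]          # 1 = essential gene, 0 = non-essential
scores = [0.92, 0.70, 0.65, 0.10]  # hypothetical model outputs
roc_auc = auc(essential, scores)
```

This pairwise form is equivalent to integrating the ROC curve, and is what a tenfold cross-validation loop would compute per fold.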

  14. Mapping the progress and impacts of public health approaches to palliative care: a scoping review protocol

    PubMed Central

    Archibald, Daryll; Patterson, Rebecca; Haraldsdottir, Erna; Hazelwood, Mark; Fife, Shirley; Murray, Scott A

    2016-01-01

    Introduction Public health palliative care is a term that can be used to encompass a variety of approaches that involve working with communities to improve people's experience of death, dying and bereavement. Recently, public health palliative care approaches have gained recognition and momentum within UK health policy and palliative care services. There is general consensus that public health palliative care approaches can complement and go beyond the scope of formal service models of palliative care. However, there is no clarity about how these approaches can be undertaken in practice or how evidence can be gathered relating to their effectiveness. Here we outline a scoping review protocol that will systematically map and categorise the variety of activities and programmes that could be classified under the umbrella term ‘public health palliative care’ and highlight the impact of these activities where measured. Methods and analysis This review will be guided by Arksey and O'Malley's scoping review methodology and incorporate insights from more recent innovations in scoping review methodology. Sensitive searches of 9 electronic databases from 1999 to 2016 will be supplemented by grey literature searches. Eligible studies will be screened independently by two reviewers using a data charting tool developed for this scoping review. Ethics and dissemination This scoping review will undertake a secondary analysis of data already collected and does not require ethical approval. The results will facilitate better understanding of the practical application of public health approaches to palliative care, the impacts these activities can have and how to build the evidence base for this work in future. The results will be disseminated through traditional academic routes such as conferences and journals and also policy and third sector seminars. PMID:27417201

  15. Contemporary approaches to neural circuit manipulation and mapping: focus on reward and addiction.

    PubMed

    Saunders, Benjamin T; Richard, Jocelyn M; Janak, Patricia H

    2015-09-19

    Tying complex psychological processes to precisely defined neural circuits is a major goal of systems and behavioural neuroscience. This is critical for understanding adaptive behaviour, and also how neural systems are altered in states of psychopathology, such as addiction. Efforts to relate psychological processes relevant to addiction to activity within defined neural circuits have been complicated by neural heterogeneity. Recent advances in technology allow for manipulation and mapping of genetically and anatomically defined neurons, which when used in concert with sophisticated behavioural models, have the potential to provide great insight into neural circuit bases of behaviour. Here we discuss contemporary approaches for understanding reward and addiction, with a focus on midbrain dopamine and cortico-striato-pallidal circuits.

  16. Mind-mapping for lung cancer: Towards a personalized therapeutics approach

    PubMed Central

    Mollberg, N; Surati, M; Demchuk, C; Fathi, R; Salama, AK; Husain, AN; Hensing, T; Salgia, R

    2011-01-01

    There will be over 220,000 people diagnosed with lung cancer and over 160,000 dying of lung cancer this year alone in the United States. In order to arrive at better control, prevention, diagnosis, and therapeutics for lung cancer, we must be able to personalize the approach towards lung cancer. Mind-mapping has existed for centuries for physicians to properly think about various “flows” of personalized medicine. We include here the epidemiology, diagnosis, histology, and treatment of lung cancer—specifically, non-small cell lung cancer. As we have new molecular signatures for lung cancer, this is further detailed. This review is not meant to be a comprehensive review, but rather its purpose is to highlight important aspects of lung cancer diagnosis, management, and personalized treatment options. PMID:21337123

  17. Contemporary approaches to neural circuit manipulation and mapping: focus on reward and addiction

    PubMed Central

    Saunders, Benjamin T.; Richard, Jocelyn M.; Janak, Patricia H.

    2015-01-01

    Tying complex psychological processes to precisely defined neural circuits is a major goal of systems and behavioural neuroscience. This is critical for understanding adaptive behaviour, and also how neural systems are altered in states of psychopathology, such as addiction. Efforts to relate psychological processes relevant to addiction to activity within defined neural circuits have been complicated by neural heterogeneity. Recent advances in technology allow for manipulation and mapping of genetically and anatomically defined neurons, which when used in concert with sophisticated behavioural models, have the potential to provide great insight into neural circuit bases of behaviour. Here we discuss contemporary approaches for understanding reward and addiction, with a focus on midbrain dopamine and cortico-striato-pallidal circuits. PMID:26240425

  18. A universal airborne LiDAR approach for tropical forest carbon mapping.

    PubMed

    Asner, Gregory P; Mascaro, Joseph; Muller-Landau, Helene C; Vieilledent, Ghislain; Vaudry, Romuald; Rasamoelina, Maminiaina; Hall, Jefferson S; van Breugel, Michiel

    2012-04-01

    Airborne light detection and ranging (LiDAR) is fast turning the corner from demonstration technology to a key tool for assessing carbon stocks in tropical forests. With its ability to penetrate tropical forest canopies and detect three-dimensional forest structure, LiDAR may prove to be a major component of international strategies to measure and account for carbon emissions from and uptake by tropical forests. To date, however, basic ecological information such as height-diameter allometry and stand-level wood density have not been mechanistically incorporated into methods for mapping forest carbon at regional and global scales. A better incorporation of these structural patterns in forests may reduce the considerable time needed to calibrate airborne data with ground-based forest inventory plots, which presently necessitate exhaustive measurements of tree diameters and heights, as well as tree identifications for wood density estimation. Here, we develop a new approach that can facilitate rapid LiDAR calibration with minimal field data. Throughout four tropical regions (Panama, Peru, Madagascar, and Hawaii), we were able to predict aboveground carbon density estimated in field inventory plots using a single universal LiDAR model (r² = 0.80, RMSE = 27.6 Mg C ha⁻¹). This model is comparable in predictive power to locally calibrated models, but relies on limited inputs of basal area and wood density information for a given region, rather than on traditional plot inventories. With this approach, we propose to radically decrease the time required to calibrate airborne LiDAR data and thus increase the output of high-resolution carbon maps, supporting tropical forest conservation and climate mitigation policy.
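    A universal model of this kind is typically a power law in canopy height, basal area, and wood density, which becomes linear in log space and can be fitted by ordinary least squares. The sketch below assumes a plausible model form, ACD = a · TCH^b1 · BA^b2 · WD^b3, and uses synthetic plot data generated from known exponents so the fit can be checked; it is not the paper's calibrated model.

```python
import numpy as np

# Synthetic calibration plots (assumed, for illustration): LiDAR
# top-of-canopy height TCH (m), regional basal area BA (m^2/ha),
# basal-area-weighted wood density WD (g/cm^3); carbon density ACD
# (Mg C/ha) is generated from a known power law.
TCH = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
BA  = np.array([20.0, 25.0, 28.0, 32.0, 35.0])
WD  = np.array([0.55, 0.58, 0.60, 0.62, 0.65])
ACD = 2.0 * TCH**1.0 * BA**0.5 * WD**0.8

# ACD = a * TCH^b1 * BA^b2 * WD^b3 is linear in log space,
# so OLS on the logs recovers the exponents.
X = np.column_stack([np.ones_like(TCH), np.log(TCH), np.log(BA), np.log(WD)])
coef, *_ = np.linalg.lstsq(X, np.log(ACD), rcond=None)
pred = np.exp(X @ coef)
```

In practice only TCH varies per pixel; BA and WD enter as regional constants, which is what lets one model travel between regions with minimal field data.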

  19. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    NASA Astrophysics Data System (ADS)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β- radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L., and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and after one week with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β- radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18 - 22% of the 137Cs and about 30 - 40% of 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable to map the root system by 137Cs and to obtain root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. 
Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts
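    The dual-isotope imaging step described above reduces to an image subtraction: the shielded exposure records only the energetic 137Cs betas, so subtracting it from the unshielded exposure leaves the 14C signal, and co-located activity flags root channels being reused. The count values below are illustrative.

```python
import numpy as np

def separate_isotopes(img_total, img_shielded):
    """img_total records 137Cs + 14C beta counts; img_shielded repeats the
    exposure with plastic film blocking the weak 14C betas, leaving 137Cs
    only. Subtraction recovers the 14C image; small negatives from
    counting noise are clipped to zero."""
    return np.clip(img_total - img_shielded, 0.0, None)

total = np.array([[5.0, 9.0], [2.0, 0.0]])   # unshielded exposure
cs137 = np.array([[5.0, 3.0], [0.0, 0.0]])   # shielded exposure
c14 = separate_isotopes(total, cs137)

# cells with both isotopes mark roots reusing old 137Cs-labelled channels;
# 14C-only cells mark roots growing through bulk soil
reused = (cs137 > 0) & (c14 > 0)
```

On real imaging-plate data a background threshold would replace the simple `> 0` tests, but the channel-reuse logic is the same.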

  20. An Improved Approach for Mapping Quantitative Trait Loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Wullschleger, Stan D; Wu, Song; Wu, Rongling; Yang, Jie; Li, Yao; Yin, Tongming; Tuskan, Gerald A

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data borrowed from the inbred backcross design can only detect those QTLs that are heterozygous only in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits, but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  1. An Improved Approach for Mapping Quantitative Trait Loci in a Pseudo-Testcross: Revisiting a Poplar Mapping Study

    SciTech Connect

    Tuskan, Gerald A; Yin, Tongming; Wullschleger, Stan D; Yang, Jie; Huang, Youjun; Li, Yao; Wu, Rongling

    2010-01-01

    A pseudo-testcross pedigree is widely used for mapping quantitative trait loci (QTL) in outcrossing species, but the model for analyzing pseudo-testcross data borrowed from the inbred backcross design can only detect those QTLs that are heterozygous only in one parent. In this study, an intercross model that incorporates the high heterozygosity and phase uncertainty of outcrossing species was used to reanalyze a published data set on QTL mapping in poplar trees. Several intercross QTLs that are heterozygous in both parents were detected, which are responsible not only for biomass traits, but also for their genetic correlations. This study provides a more complete identification of QTLs responsible for economically important biomass traits in poplars.

  2. Multi-boson production

    SciTech Connect

    Mastrandrea, Paolo; /Fermilab

    2010-09-01

    The studies of the diboson production in pp̄ collisions at 1.96 TeV performed by CDF and D0 collaborations at the Tevatron collider are reported in this paper. The diboson events are identified by means of both leptonic and semi-leptonic final states. The presented results use different statistical samples collected by the Tevatron up to 4.8 fb⁻¹. Measured production cross sections are in good agreement with Standard Model predictions and the limits on the anomalous triple gauge boson couplings are competitive with those measured by experiments at the Large Electron-Positron collider (LEP).

  3. Approximate gauge symmetry of composite vector bosons

    SciTech Connect

    Suzuki, Mahiko

    2010-06-01

    It can be shown in a solvable field theory model that the couplings of the composite vector mesons made of a fermion pair approach the gauge couplings in the limit of strong binding. Although this phenomenon may appear accidental and special to the vector bosons made of a fermion pair, we extend it to the case of bosons being constituents and find that the same phenomenon occurs in a more intriguing way. The functional formalism not only facilitates computation but also provides us with a better insight into the generating mechanism of approximate gauge symmetry, in particular, how the strong binding and global current conservation conspire to generate such an approximate symmetry. Remarks are made on its possible relevance or irrelevance to electroweak and higher symmetries.

  4. Interhemispheric transfalcine approach and awake cortical mapping for resection of peri-atrial gliomas associated with the central lobule.

    PubMed

    Malekpour, Mahdi; Cohen-Gadol, Aaron A

    2015-02-01

    Medial posterior frontal and parietal gliomas extending to the peri-atrial region are difficult to reach surgically because of the working angle required to expose the lateral aspect of the tumor and the proximity of the tumor to the sensorimotor lobule; retraction of the sensorimotor cortex may lead to morbidity. The interhemispheric transfalcine approach is favorable and safe for resection of medial hemispheric tumors adjacent to the falx cerebri, but the literature on this approach is scarce. Awake cortical mapping using this operative route for tumors associated with the sensorimotor cortex has not been previously reported to our knowledge. We present the first case of a right medial posterior frontoparietal oligoastrocytoma that was resected through the interhemispheric transfalcine approach using awake cortical and subcortical mapping. Through a contralateral frontoparietal craniotomy, we excised a section of the falx and exposed the contralateral medial hemisphere. Cortical stimulation allowed localization of the supplementary motor cortex, and suprathreshold stimulation mapping excluded the primary motor cortex corresponding to the leg area. Gross total tumor resection was accomplished without any intraoperative or postoperative deficits. Awake cortical mapping using the contralateral transfalcine approach allows a "cross-court" operative route to map functional cortices and resect peri-atrial low-grade gliomas. This technique can minimize the otherwise necessary retraction on the ipsilateral hemisphere through an ipsilateral craniotomy.

  5. A Stochastic Simulation Framework for the Prediction of Strategic Noise Mapping and Occupational Noise Exposure Using the Random Walk Approach

    PubMed Central

    Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high accuracy of prediction results together with a majority of absolute differences of less than 2 dBA; also, the predicted noise doses were mostly in the range of measurement. Therefore, the random walk approach was effective in dealing with environmental noises. It could predict strategic noise mapping to facilitate noise monitoring and noise control in the workplaces. PMID:25875019
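    Two building blocks of such a noise simulation are worth making concrete: sound levels from several sources combine on an energy basis, not arithmetically, and worker positions can be sampled as a random walk. The sketch below assumes simple free-field point-source decay (20·log10(r)); it is an illustration of the approach, not the RW-eNMS implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def combined_level(levels_dba):
    """Energy summation of sound levels: L = 10*log10(sum 10^(Li/10)).
    Two equal sources raise the level by ~3 dB, not by a factor of two."""
    return 10.0 * np.log10(np.sum(10.0 ** (np.asarray(levels_dba) / 10.0)))

def noise_at(point, sources):
    """Each source is (x, y, level near the machine). Free-field decay of
    20*log10(r) is assumed; reflections and barriers are ignored, and
    distances under 1 m are clamped."""
    levels = [lw - 20.0 * np.log10(max(np.hypot(point[0] - x,
                                                point[1] - y), 1.0))
              for x, y, lw in sources]
    return combined_level(levels)

# a worker's random-walk path across the shop floor, one exposure per step
steps = rng.integers(-1, 2, size=(100, 2))
path = np.cumsum(steps, axis=0)
sources = [(0.0, 0.0, 95.0), (10.0, 5.0, 90.0)]
exposure = [noise_at(p, sources) for p in path]
```

Averaging the per-step energies over a shift would give the predicted noise dose that the study compares against measurements.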

  6. A stochastic simulation framework for the prediction of strategic noise mapping and occupational noise exposure using the random walk approach.

    PubMed

    Han, Lim Ming; Haron, Zaiton; Yahya, Khairulzan; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri

    2015-01-01

    Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise mapping in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. Also, it assesses the impact of noise on the workers and the surrounding environment. For data validation, three case studies were conducted to check the accuracy of the prediction data and to determine the efficiency and effectiveness of this approach. The results showed high accuracy of prediction results together with a majority of absolute differences of less than 2 dBA; also, the predicted noise doses were mostly in the range of measurement. Therefore, the random walk approach was effective in dealing with environmental noises. It could predict strategic noise mapping to facilitate noise monitoring and noise control in the workplaces.

  7. A new approach of mapping soils in the Alps - Challenges of deriving soil information and creating soil maps for sustainable land use. An example from South Tyrol (Italy)

    NASA Astrophysics Data System (ADS)

    Baruck, Jasmin; Gruber, Fabian E.; Geitner, Clemens

    2015-04-01

    Nowadays sustainable land use management is gaining importance because intensive land use leads to increasing soil degradation. Sustainable land use management is especially important in mountainous regions like the Alps, where topography limits land use. A database containing detailed information on soil characteristics is therefore required; however, information on soil properties is far from comprehensive. The project "ReBo - Terrain classification based on airborne laser scanning data to support soil mapping in the Alps", funded by the Autonomous Province of Bolzano, aims at developing a methodical framework for obtaining soil data. The approach combines geomorphometric analysis and soil mapping to generate modern medium-scale soil maps in a time- and cost-efficient way. In this study the open source GRASS GIS extension module r.geomorphon (Jasiewicz and Stepinski, 2013) is used to derive topographically homogeneous landform units from high-resolution DTMs at a scale of 1:5,000. Furthermore, for terrain segmentation and classification we additionally use medium-scale data sets (geology, parent material, land use etc.). As the Alps are characterized by great variety in topography and parent material and a wide range of moisture regimes, obtaining reliable soil data is difficult. Additionally, geomorphic activity (debris flows, landslides etc.) leads to natural disturbances. Thus, soil properties are highly diverse and largely scale dependent. Obtaining soil information on anthropogenically influenced soils is an added challenge: owing to intensive cultivation techniques, the natural link between the soil-forming factors is often broken. In South Tyrol we find the largest pome-producing area in Europe. Normally, the annual precipitation is not enough for intensive orcharding. Thus, irrigation strategies are in use. However, as knowledge about the small-scale heterogeneous soil properties is mostly lacking, overwatering and modifications of the

  8. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches

    PubMed Central

    Ma, Ying; Shaik, Mohammed A.; Kozberg, Mariel G.; Thibodeaux, David N.; Zhao, Hanzhi T.; Yu, Hang

    2016-01-01

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light emitting diodes and sensitive and high-speed digital cameras have driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574312

  9. Identifying Potential Areas for Siting Interim Nuclear Waste Facilities Using Map Algebra and Optimization Approaches

    SciTech Connect

    Omitaomu, Olufemi A; Liu, Cheng; Cetiner, Sacit M; Belles, Randy; Mays, Gary T; Tuttle, Mark A

    2013-01-01

    The renewed interest in siting new nuclear power plants in the United States has brought to center stage the need to site interim facilities for long-term management of spent nuclear fuel (SNF). In this paper, a two-stage approach for identifying potential areas for siting interim SNF facilities is presented. In the first stage, the land area is discretized into grids of uniform size (e.g., 100 m × 100 m). For the continental United States, this process resulted in a data matrix of about 700 million cells. Each cell of the matrix is then characterized as a binary decision variable to indicate whether an exclusion criterion is satisfied or not. A binary data matrix is created for each of the 25 siting criteria considered in this study. Using a map algebra approach, cells that satisfy all criteria are clustered and regarded as potential siting areas. In the second stage, an optimization problem is formulated as a p-median problem on a rail network such that the sum of the shortest distances between nuclear power plants with SNF and the potential storage sites from the first stage is minimized. The implications of the obtained results for energy policies are presented and discussed.
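The two stages reduce to simple operations: a cell-wise logical AND across binary criterion layers, then a facility-location problem. A toy sketch follows; the 3×3 layers, the example criteria, Euclidean distances standing in for rail-network distances, and brute-force enumeration are all illustrative assumptions (the paper works with ~700 million cells and a real rail network):

```python
import itertools
import numpy as np

# Stage 1: map algebra. One binary layer per siting criterion
# (1 = criterion satisfied); a cell survives only if every layer is 1.
criteria = np.array([
    [[1, 1, 0], [1, 1, 1], [0, 1, 1]],   # e.g. away from fault lines
    [[1, 0, 0], [1, 1, 1], [1, 1, 1]],   # e.g. low population density
    [[1, 1, 1], [0, 1, 1], [0, 1, 1]],   # e.g. acceptable slope
])
candidates = np.logical_and.reduce(criteria)       # cell-wise AND
sites = list(zip(*np.nonzero(candidates)))         # candidate cells

# Stage 2: p-median. Pick p sites minimizing total shortest distance
# from the plants (Euclidean here; the paper uses rail-network distance).
plants = [(0, 0), (2, 2)]

def total_dist(chosen):
    return sum(min(np.hypot(px - sx, py - sy) for sx, sy in chosen)
               for px, py in plants)

best = min(itertools.combinations(sites, 1), key=total_dist)  # p = 1
```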

  10. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches.

    PubMed

    Ma, Ying; Shaik, Mohammed A; Kim, Sharon H; Kozberg, Mariel G; Thibodeaux, David N; Zhao, Hanzhi T; Yu, Hang; Hillman, Elizabeth M C

    2016-10-01

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light emitting diodes and sensitive and high-speed digital cameras have driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue 'Interpreting BOLD: a dialogue between cognitive and cellular neuroscience'. PMID:27574312

  12. Multistate boson stars

    SciTech Connect

    Bernal, A.; Barranco, J.; Alic, D.; Palenzuela, C.

    2010-02-15

    Motivated by the increasing interest in models which consider scalar fields as viable dark matter candidates, we have constructed a generalization of relativistic boson stars (BS) composed of two coexisting states of the scalar field, the ground state and the first excited state. We have studied the dynamical evolution of these multistate boson stars (MSBS) under radial perturbations, using numerical techniques. We show that stable MSBS can be constructed, when the number of particles in the first excited state, N^(2), is smaller than the number of particles in the ground state, N^(1). On the other hand, when N^(2) > N^(1), the configurations are initially unstable. However, they evolve and settle down into stable configurations. In the stabilization process, the initially ground state is excited and ends in a first excited state, whereas the initially first excited state ends in a ground state. During this process, both states emit scalar field radiation, decreasing their number of particles. This behavior shows that even though BS in the first excited state are intrinsically unstable under finite perturbations, the configuration resulting from the combination of this state with the ground state produces stable objects. Finally we show in a qualitative way, that stable MSBS could be realistic models of dark matter galactic halos, as they produce rotation curves that are flatter at large radii than the rotation curves produced by BS with only one state.

  13. Geometric phases and quantum correlations dynamics in spin-boson model

    SciTech Connect

    Wu, Wei; Xu, Jing-Bo

    2014-01-28

    We explore the dynamics of the spin-boson model for an Ohmic bath by employing the master equation approach and obtain an explicit expression for the reduced density matrix. We also calculate the geometric phases of the spin-boson model by making use of the analytical results and discuss how the dissipative bosonic environment affects the geometric phases. Furthermore, we investigate the dynamics of quantum discord and entanglement of two qubits, each locally interacting with its own independent bosonic environment. It is found that the decay properties of quantum discord and entanglement are sensitive to the choice of the initial state's parameters and the system-bath coupling strength.

  14. Sequencing the Pig Genome Using a Mapped BAC by BAC Approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We have generated a highly contiguous physical map covering >98% of the pig genome in just 176 contigs. The map is localised to the genome through integration with the UIUC RH map as well BAC end sequence alignments to the human genome. Over 265k HindIII restriction digest fingerprints totalling 1...

  15. The influence of mapped hazards on risk beliefs: A proximity-based modeling approach

    PubMed Central

    Severtson, Dolores J.; Burt, James E.

    2013-01-01

    Interview findings suggest perceived proximity to mapped hazards influences risk beliefs when people view environmental hazard maps. For dot maps, four attributes of mapped hazards influenced beliefs: hazard value, proximity, prevalence, and dot patterns. In order to quantify the collective influence of these attributes for viewers' perceived or actual map locations, we present a model to estimate proximity-based hazard or risk (PBH) and share study results that indicate how modeled PBH and map attributes influenced risk beliefs. The randomized survey study among 447 university students assessed risk beliefs for 24 dot maps that systematically varied by the four attributes. Maps depicted water test results for a fictitious hazardous substance in private residential wells and included a designated “you live here” location. Of the nine variables that assessed risk beliefs, the numerical susceptibility variable was most consistently and strongly related to map attributes and PBH. Hazard value, location in or out of a clustered dot pattern, and distance had the largest effects on susceptibility. Sometimes, hazard value interacted with other attributes, e.g. distance had stronger effects on susceptibility for larger than smaller hazard values. For all combined maps, PBH explained about the same amount of variance in susceptibility as did attributes. Modeled PBH may have utility for studying the influence of proximity to mapped hazards on risk beliefs, protective behavior, and other dependent variables. Further work is needed to examine these influences for more realistic maps and representative study samples. PMID:22053748
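The abstract does not specify the PBH formula, so the following is only a guessed-at illustration of a proximity-based score: inverse-distance weighting of dot hazard values at the viewer's map location. Prevalence and cluster membership, which the authors also model, are omitted, and the function name and numbers are invented:

```python
import math

def pbh(viewer, dots, decay=1.0):
    """Hypothetical proximity-based hazard score: each dot contributes
    its hazard value weighted by inverse distance to the viewer's map
    location, so nearer, higher-valued hazards dominate."""
    score = 0.0
    for x, y, value in dots:
        d = max(math.hypot(x - viewer[0], y - viewer[1]), 0.1)
        score += value / d ** decay
    return score

near = pbh((0.0, 0.0), [(1.0, 0.0, 10.0)])   # same hazard, close by
far = pbh((0.0, 0.0), [(5.0, 0.0, 10.0)])    # same hazard, farther away
```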

  16. Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...

  17. Mapping quantitative trait loci in selected breeding populations: A segregation distortion approach.

    PubMed

    Cui, Y; Zhang, F; Xu, J; Li, Z; Xu, S

    2015-12-01

    Quantitative trait locus (QTL) mapping is often conducted in line-crossing experiments where a sample of individuals is randomly selected from a pool of all potential progeny. QTLs detected from such an experiment are important for understanding the genetic mechanisms governing a complex trait, but may not be directly relevant to plant breeding if they are not detected from the population that selection targets. QTLs segregating in one population may not necessarily segregate in another population. To facilitate marker-assisted selection, QTLs must be detected from the very population that selection targets. However, selected breeding populations often have depleted genetic variation and small population sizes, resulting in low power to detect useful QTLs. On the other hand, if selection is effective, loci controlling the selected trait will deviate from the expected Mendelian segregation ratio. In this study, we proposed to detect QTLs in selected breeding populations via the detection of marker segregation distortion, in either a single population or multiple populations under the same selection scheme. Simulation studies showed that QTLs can be detected in strongly selected populations with population sizes as small as 25 plants. We applied the new method to detect QTLs in two breeding populations of rice selected for high grain yield. Seven QTLs were identified, four of which have been validated in advanced generations in a follow-up study. Cloned genes in the vicinity of the four QTLs have also been reported in the literature. This mapping-by-selection approach provides a new avenue for breeders to improve breeding progress. The new method can be applied to breeding programs not only in rice but also in other agricultural species, including crops, trees and animals.
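The statistical core of mapping-by-selection is a test of each marker's genotype counts against the expected Mendelian ratio. A minimal sketch, where the genotype counts and the 1:2:1 F2 expectation are made-up illustrations rather than data from the paper:

```python
def segregation_chi2(observed, ratio):
    """Chi-square statistic for deviation of marker genotype counts
    from an expected Mendelian ratio (e.g. 1:2:1 in an F2)."""
    n, total = sum(observed), sum(ratio)
    return sum((obs - n * r / total) ** 2 / (n * r / total)
               for obs, r in zip(observed, ratio))

# Invented counts (AA, Aa, aa) for 100 plants after selection:
distorted = segregation_chi2([40, 45, 15], [1, 2, 1])  # near a selected QTL
neutral = segregation_chi2([26, 49, 25], [1, 2, 1])    # unlinked marker
# Markers exceeding the chi-square critical value (5.99 at df = 2,
# alpha = 0.05, before multiple-test correction) flag putative QTLs.
```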

  18. Towards a virtual hub approach for landscape assessment and multimedia ecomuseum using multitemporal-maps

    NASA Astrophysics Data System (ADS)

    Brumana, R.; Santana Quintero, M.; Barazzetti, L.; Previtali, M.; Banfi, F.; Oreni, D.; Roels, D.; Roncoroni, F.

    2015-08-01

    Landscapes are dynamic entities, stretching and transforming across space and time, and need to be safeguarded as living places for the future, with interaction of human, social and economic dimensions. A comprehensive landscape evaluation requires several open data sets, each characterized by its own protocol and service interface, which limits or impedes interoperability and integration. Indeed, nowadays the development of websites targeted at landscape assessment and touristic purposes requires many resources in terms of time, cost and IT skills, so such applications are limited to a few cases, mainly focusing on world-famous touristic sites. The ability to spread the development of web-based multimedia virtual museums based on geospatial data will depend on the possibility of discovering the needed geospatial data through a single point of access in a homogeneous way. The innovative approach proposed in this paper may facilitate access to open data in a homogeneous way by means of specific components (the brokers) performing the interoperability actions required to interconnect heterogeneous data sources. In the specific case study analysed here, an interface was implemented to migrate a geo-swat chart based on local and regional geographic information into a user-friendly Google Earth©-based infrastructure, integrating ancient cadastres and modern cartography, accessible by professionals and tourists via the web and also via portable devices like tablets and smartphones. The general aim of this work on the case study of the Lake of Como (Tremezzina municipality) is to boost the integration of assessment methodologies with digital geo-based technologies of map correlation for the multimedia ecomuseum system accessible via the web. The developed WebGIS system integrates multi-scale and multi-temporal maps with different information (cultural, historical, landscape levels

  19. Evolution of syncarpy and other morphological characters in African Annonaceae: a posterior mapping approach.

    PubMed

    Couvreur, T L P; Richardson, J E; Sosef, M S M; Erkens, R H J; Chatrou, L W

    2008-04-01

    The congenital fusion of carpels, or syncarpy, is considered a key innovation as it is found in more than 80% of angiosperms. Within the magnoliids however, syncarpy has rarely evolved. Two alternative evolutionary origins of syncarpy were suggested in order to explain the evolution of this feature: multiplication of a single carpel vs. fusion of a moderate number of carpels. The magnoliid family Annonaceae provides an ideal situation to test these hypotheses as two African genera, Isolona and Monodora, are syncarpous in an otherwise apocarpous family with multicarpellate and unicarpellate genera. In addition to syncarpy, the evolution of six other morphological characters was studied. Well-supported phylogenetic relationships of African Annonaceae and in particular those of Isolona and Monodora were reconstructed. Six plastid regions were sequenced and analyzed using maximum parsimony and Bayesian inference methods. The Bayesian posterior mapping approach to study character evolution was used as it accounts for both mapping and phylogenetic uncertainty, and also allows multiple state changes along the branches. Our phylogenetic analyses recovered a fully resolved clade comprising twelve genera endemic to Africa, including Isolona and Monodora, which was nested within the so-called long-branch clade. This is the largest and most species-rich clade of African genera identified to date within Annonaceae. The two syncarpous genera were inferred with maximum support to be sister to a clade characterized by genera with multicarpellate apocarpous gynoecia, supporting the hypothesis that syncarpy arose by fusion of a moderate number of carpels. This hypothesis was also favoured when studying the floral anatomy of both genera. Annonaceae provide the only case of a clear evolution of syncarpy within an otherwise apocarpous magnoliid family. 
The results presented here offer a better understanding of the evolution of syncarpy in Annonaceae and within angiosperms in general.

  20. Mapping behavioral landscapes for animal movement: a finite mixture modeling approach

    USGS Publications Warehouse

    Tracey, Jeff A.; Zhu, Jun; Boydston, Erin E.; Lyren, Lisa M.; Fisher, Robert N.; Crooks, Kevin R.

    2013-01-01

    Because of its role in many ecological processes, movement of animals in response to landscape features is an important subject in ecology and conservation biology. In this paper, we develop models of animal movement in relation to objects or fields in a landscape. We take a finite mixture modeling approach in which the component densities are conceptually related to different choices for movement in response to a landscape feature, and the mixing proportions are related to the probability of selecting each response as a function of one or more covariates. We combine particle swarm optimization and an Expectation-Maximization (EM) algorithm to obtain maximum likelihood estimates of the model parameters. We use this approach to analyze data for movement of three bobcats in relation to urban areas in southern California, USA. A behavioral interpretation of the models revealed similarities and differences in bobcat movement response to urbanization. All three bobcats avoided urbanization by moving either parallel to urban boundaries or toward less urban areas as the proportion of urban land cover in the surrounding area increased. However, one bobcat, a male with a dispersal-like large-scale movement pattern, avoided urbanization at lower densities and responded strictly by moving parallel to the urban edge. The other two bobcats, which were both residents and occupied similar geographic areas, avoided urban areas using a combination of movements parallel to the urban edge and movement toward areas of less urbanization. However, the resident female appeared to exhibit greater repulsion at lower levels of urbanization than the resident male, consistent with empirical observations of bobcats in southern California. Using the parameterized finite mixture models, we mapped behavioral states to geographic space, creating a representation of a behavioral landscape. 
This approach can provide guidance for conservation planning based on analysis of animal movement data using
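The estimation machinery the authors combine with particle swarm optimization is the standard EM algorithm for finite mixtures. As a toy stand-in, here is EM for a two-component 1D Gaussian mixture; the paper's components are movement-response densities with covariate-dependent mixing proportions, and all data below are synthetic:

```python
import numpy as np

def em_gmm(x, iters=50):
    """Minimal EM for a two-component 1D Gaussian mixture, a toy
    stand-in for the paper's movement mixture (whose components are
    movement-response densities with covariate-dependent weights)."""
    mu = np.array([x.min(), x.max()], dtype=float)
    sigma = np.array([x.std(), x.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        dens = (pi / (sigma * np.sqrt(2 * np.pi)) *
                np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    return pi, mu, sigma

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(8.0, 1.0, 300)])
pi, mu, sigma = em_gmm(x)   # weights near 0.5/0.5, means near 0 and 8
```

Mapping each observation to its highest-responsibility component is what turns the fitted mixture into the "behavioral landscape" of the title.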

  1. Mapping Natural Terroir Units using a multivariate approach and legacy data

    NASA Astrophysics Data System (ADS)

    Priori, Simone; Barbetti, Roberto; L'Abate, Giovanni; Bucelli, Piero; Storchi, Paolo; Costantini, Edoardo A. C.

    2014-05-01

    was then subdivided into 9 NTUs, statistically differentiated by the variables used. The study demonstrated the strength of a multivariate approach for NTU mapping at the province scale (1:125,000) using viticultural legacy data. Identification and mapping of terroir diversity within the DOC and DOCG at the province scale suggest the adoption of viticultural subzones. The subzones, based on the NTUs, could lead to different wine-production systems that enhance the peculiarities of the terroir.

  2. Use of mapping and spatial and space-time modeling approaches in operational control of Aedes aegypti and dengue.

    PubMed

    Eisen, Lars; Lozano-Fuentes, Saul

    2009-01-01

    The aims of this review paper are to 1) provide an overview of how mapping and spatial and space-time modeling approaches have been used to date to visualize and analyze mosquito vector and epidemiologic data for dengue; and 2) discuss the potential for these approaches to be included as routine activities in operational vector and dengue control programs. Geographical information system (GIS) software is becoming more user-friendly and is now complemented by free mapping software that provides access to satellite imagery and basic feature-making tools and can generate static maps as well as dynamic time-series maps. Our challenge is now to move beyond the research arena by transferring mapping and GIS technologies and spatial statistical analysis techniques in user-friendly packages to operational vector and dengue control programs. This will enable control programs to, for example, generate risk maps for exposure to dengue virus, develop Priority Area Classifications for vector control, and explore socioeconomic associations with dengue risk. PMID:19399163

  3. Discovering Higgs Bosons of the MSSM using Jet Substructure

    SciTech Connect

    Kribs, Graham D.; Martin, Adam; Roy, Tuhin S.; Spannowsky, Michael

    2010-06-01

    We present a qualitatively new approach to discover Higgs bosons of the MSSM at the LHC using jet substructure techniques applied to boosted Higgs decays. These techniques are ideally suited to the MSSM, since the lightest Higgs boson overwhelmingly decays to b b̄ throughout the entire parameter space, while the heavier neutral Higgs bosons, if light enough to be produced in a cascade, also predominantly decay to b b̄. The Higgs production we consider arises from superpartner production where superpartners cascade decay into Higgs bosons. We study this mode of Higgs production for several superpartner hierarchies: m_q̃, m_g̃ > m_W̃, m_B̃ > m_h + μ; m_q̃, m_g̃ > m_W̃, m_B̃ > m_{h,H,A} + μ; and m_q̃, m_g̃ > m_W̃ > m_h + μ with m_B̃ ≈ μ. In these cascades, the Higgs bosons are boosted, with p_T > 200 GeV a large fraction of the time. Since Higgs bosons appear in cascades originating from squarks and/or gluinos, the cross section for events with at least one Higgs boson can be the same order as squark/gluino production. Given 10 fb⁻¹ of 14 TeV LHC data, with m_q̃ ≲ 1 TeV, and one of the above superpartner mass hierarchies, our estimate of S/√B of the Higgs signal is sufficiently high that the b b̄ mode can become the discovery mode of the lightest Higgs boson of the MSSM.

  4. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    PubMed Central

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map. PMID:26742857
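The key reduction is that ordering markers from deletion patterns is a travelling-salesman problem: find the marker order that minimizes presence/absence mismatches between neighbouring markers. A minimal sketch with invented terminal-deletion mutants and exhaustive enumeration (the paper's method uses a proper TSP formulation, scales to many markers, and tolerates noisy data):

```python
from itertools import permutations

# Presence (1) / absence (0) of five Y markers in four invented
# terminal-deletion mutants: each mutant retains a prefix of the order.
deletions = {
    "d1": {"A": 1, "B": 0, "C": 0, "D": 0, "E": 0},
    "d2": {"A": 1, "B": 1, "C": 0, "D": 0, "E": 0},
    "d3": {"A": 1, "B": 1, "C": 1, "D": 0, "E": 0},
    "d4": {"A": 1, "B": 1, "C": 1, "D": 1, "E": 0},
}

def hamming(a, b):
    """Mismatches between two markers' retention patterns."""
    return sum(deletions[d][a] != deletions[d][b] for d in deletions)

def tour_cost(order):
    """TSP objective: mismatches summed over adjacent marker pairs."""
    return sum(hamming(x, y) for x, y in zip(order, order[1:]))

markers = ["A", "B", "C", "D", "E"]
best = min(permutations(markers), key=tour_cost)  # exact for 5 markers
```

Because each deletion removes a contiguous block of the true order, the minimum-cost tour recovers that order (up to reversal).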

  5. A new physical mapping approach refines the sex-determining gene positions on the Silene latifolia Y-chromosome

    NASA Astrophysics Data System (ADS)

    Kazama, Yusuke; Ishii, Kotaro; Aonuma, Wataru; Ikeda, Tokihiro; Kawamoto, Hiroki; Koizumi, Ayako; Filatov, Dmitry A.; Chibalina, Margarita; Bergero, Roberta; Charlesworth, Deborah; Abe, Tomoko; Kawano, Shigeyuki

    2016-01-01

    Sex chromosomes are particularly interesting regions of the genome for both molecular genetics and evolutionary studies; yet, for most species, we lack basic information, such as the gene order along the chromosome. Because they lack recombination, Y-linked genes cannot be mapped genetically, leaving physical mapping as the only option for establishing the extent of synteny and homology with the X chromosome. Here, we developed a novel and general method for deletion mapping of non-recombining regions by solving “the travelling salesman problem”, and evaluate its accuracy using simulated datasets. Unlike the existing radiation hybrid approach, this method allows us to combine deletion mutants from different experiments and sources. We applied our method to a set of newly generated deletion mutants in the dioecious plant Silene latifolia and refined the locations of the sex-determining loci on its Y chromosome map.

  6. Non-linear dynamics of operant behavior: a new approach via the extended return map.

    PubMed

    Li, Jay-Shake; Huston, Joseph P

    2002-01-01

    Previous efforts to apply non-linear dynamic tools to the analysis of operant behavior revealed some promise for this kind of approach, but also some doubts, since the complexity of animal behavior seemed to be beyond the analyzing ability of the available tools. Here we outline a series of studies based on a novel approach. We modified the so-called 'return map' and developed a new method, the 'extended return map' (ERM), to extract information from highly irregular time-series data: the inter-response times (IRTs) generated by Skinner-box experiments. We applied the ERM to operant lever pressing data from rats using the four fundamental reinforcement schedules: fixed interval (FI), fixed ratio (FR), variable interval (VI) and variable ratio (VR). Our results revealed interesting patterns in all experiment groups. In particular, the FI and VI groups exhibited well-organized clusters of data points. We calculated the fractal dimension of these patterns and compared experimental data with surrogate data sets that were generated by randomly shuffling the sequential order of the original IRTs. This comparison supported the finding that patterns in the ERM reflect the dynamics of the operant behaviors under study. We then built two models to simulate the functional mechanisms of the FI schedule. Both models can produce similar distributions of IRTs and the stereotypical 'scalloped' curve characteristic of FI responding. However, they differ in one important feature of their formulation: while one model uses a continuous function to describe the probability of occurrence of an operant behavior, the other employs an abrupt switch of behavioral state. Comparison of ERMs showed that only the latter was able to produce patterns similar to the experimental results, indicative of the operation of an abrupt switch from one behavioral state to another over the course of the inter-reinforcement period. 
This example demonstrated the ERM to be a useful tool for the analysis of
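The starting point of the ERM is the classic return map, which pairs each inter-response time with its successor; the surrogate comparison shuffles away the sequential order while keeping the IRT distribution. A sketch (the IRT values are invented, and the 'extended' construction itself is not reproduced here):

```python
import random

def return_map_points(irts, lag=1):
    """Classic return map: pair each inter-response time (IRT) with
    its lag-th successor; the ERM builds on this construction."""
    return list(zip(irts, irts[lag:]))

def shuffled_surrogate(irts, seed=0):
    """Surrogate series: identical IRT distribution, sequential
    structure destroyed, as used for the comparison in the paper."""
    s = list(irts)
    random.Random(seed).shuffle(s)
    return s

# Invented IRTs with an FI-like pattern of short bursts and long pauses:
irts = [0.5, 0.6, 0.5, 2.1, 0.4, 0.5, 2.3, 0.5]
pts = return_map_points(irts)       # 7 points (IRT_n, IRT_n+1)
```

Clusters in the point cloud that vanish for the surrogate indicate genuine sequential structure rather than a distributional artifact.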

  7. Mapping irrigation potential from renewable groundwater in Africa - a quantitative hydrological approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Y.; Villholth, K. G.

    2015-02-01

Groundwater provides an important buffer to climate variability in Africa. Yet, groundwater irrigation contributes only a relatively small share of cultivated land, approximately 1% (about 2 × 10⁶ hectares), as compared to 14% in Asia. While groundwater is over-exploited for irrigation in many parts of Asia, previous assessments indicate an underutilized potential in parts of Africa. As opposed to previous country-based estimates, this paper derives a continent-wide, distributed (0.5° spatial resolution) map of groundwater irrigation potential, indicated in terms of fractions of cropland potentially irrigable with renewable groundwater. The method builds on an annual groundwater balance approach using 41 years of hydrological data, allocating only that fraction of groundwater recharge that is in excess after satisfying other present human needs and environmental requirements, while disregarding socio-economic and physical constraints on access to the resource. Due to the high uncertainty in environmental groundwater needs, three scenarios, leaving 30, 50 and 70% of recharge for the environment, were implemented. Currently dominant crops and cropping rotations and their associated irrigation requirements were applied in a zonal approach in order to convert excess recharge to potential irrigated cropland. Results show an inhomogeneously distributed groundwater irrigation potential across the continent, even within individual countries, mainly reflecting recharge patterns and the presence or absence of cultivated cropland. Results further show that average annual renewable groundwater availability for irrigation ranges from 692 to 1644 km³, depending on the scenario. The total area of cropland irrigable with renewable groundwater ranges from 44.6 to 105.3 × 10⁶ ha, corresponding to 20.5 to 48.6% of the cropland over the continent. In particular, significant potential exists in the semi-arid Sahel and eastern African regions, which could support poverty alleviation if developed.
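The per-cell allocation logic of such an annual balance can be sketched in a few lines. This is a simplified reading of the approach, with all numbers hypothetical: reserve an environmental fraction of recharge, subtract other human uses, and convert the remaining depth to irrigable area via a crop water requirement:

```python
def irrigable_area_km2(recharge_mm, cell_area_km2, env_fraction,
                       other_uses_mm, crop_req_mm):
    """Cropland area in a grid cell irrigable with renewable groundwater.
    env_fraction is the share of recharge reserved for the environment
    (0.3, 0.5 or 0.7 in the paper's three scenarios)."""
    # Depth of recharge left after environmental and other human needs.
    available_mm = max(0.0, recharge_mm * (1 - env_fraction) - other_uses_mm)
    # Spreading that depth at the crop irrigation requirement gives the
    # fraction of the cell that can be irrigated (capped at the whole cell).
    fraction = min(1.0, available_mm / crop_req_mm)
    return fraction * cell_area_km2

# Hypothetical cell: 100 mm/yr recharge, 50% reserved for the environment,
# 10 mm/yr other uses, 300 mm/yr net irrigation requirement, 2500 km2 cell.
area = irrigable_area_km2(100, 2500, 0.5, 10, 300)
```

Running the three environmental scenarios simply means sweeping `env_fraction` over 0.3, 0.5 and 0.7 for every cell.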

  8. Crater Mapping in the Pluto-Charon System: Considerations, Approach, and Progress

    NASA Astrophysics Data System (ADS)

    Robbins, S. J.; Singer, K. N.; Bray, V. J.; Schenk, P.; Zangari, A. M.; McKinnon, W. B.; Young, L. A.; Runyon, K. D.; Beyer, R. A.; Porter, S.; Lauer, T.; Weaver, H. A., Jr.; Olkin, C.; Ennico Smith, K.; Stern, A.

    2015-12-01

NASA's New Horizons mission successfully made its closest approach to Pluto on July 14, 2015, at 11:49 UTC. The flyby nature of the mission, the distance to the system, and the multiple planetary bodies to observe with a diverse instrument set required a complex imaging campaign marked by numerous trade-offs; these led to a more complicated crater population mapping effort than for a basic orbital mission. The Pluto and Charon imaging campaigns were full-disk or mosaics of the full disk until ≈3.5 hrs before closest approach, when the pixel scale was 0.9 km/px. After this, several LORRI-specific imaging campaigns were conducted of the partial disk and later the full crescent, while additional strips were ride-alongs with other instruments. These should supply partial coverage at up to 70-80 m/px for Pluto and 160 m/px for Charon. The LORRI coverage at ≈0.4 km/px does not cover the entire encounter hemisphere, but the MVIC instrument provided comparable full-disk coverage (0.5 km/px) and partial-disk coverage at 0.3 km/px. The best images of the non-encounter hemispheres of Pluto and Charon are ≈21 km/px (taken midnight July 10-11). As with any single flyby mission, we are constrained by the best pixel scales and incidence angles at which images were taken during the flyby. While most high-resolution imaging by quantity has been done over areas of variable solar incidence as the spacecraft passed by Pluto and Charon, these images cover a relatively small fraction of the bodies, and most coverage has been at near-noon sun, which makes crater identification difficult. Numerous team members are independently using a variety of crater mapping tools and image products, which will be reconciled and merged to make a more robust final database. We will present our consensus crater database to-date of both plutonian and charonian impact craters as well as correlations with preliminary geologic units. We will also discuss how the crater population compares with predictions and modeled Kuiper belt populations.

  9. An approach for mapping large-area impervious surfaces: Synergistic use of Landsat-7 ETM+ and high spatial resolution imagery

    USGS Publications Warehouse

    Yang, L.; Huang, C.; Homer, C.G.; Wylie, B.K.; Coan, M.J.

    2003-01-01

    A wide range of urban ecosystem studies, including urban hydrology, urban climate, land use planning, and resource management, require current and accurate geospatial data of urban impervious surfaces. We developed an approach to quantify urban impervious surfaces as a continuous variable by using multisensor and multisource datasets. Subpixel percent impervious surfaces at 30-m resolution were mapped using a regression tree model. The utility, practicality, and affordability of the proposed method for large-area imperviousness mapping were tested over three spatial scales (Sioux Falls, South Dakota, Richmond, Virginia, and the Chesapeake Bay areas of the United States). Average error of predicted versus actual percent impervious surface ranged from 8.8 to 11.4%, with correlation coefficients from 0.82 to 0.91. The approach is being implemented to map impervious surfaces for the entire United States as one of the major components of the circa 2000 national land cover database.
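The regression-tree model above maps spectral predictors to subpixel percent imperviousness. As a toy illustration of how such a tree chooses its splits (a real application would use a full tree library and many predictors; data here are invented), a single-split "stump" minimizing squared error:

```python
def best_stump(x, y):
    """Fit a one-split regression tree (stump): choose the threshold on x
    that minimizes squared error of piecewise-constant predictions."""
    best = None
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        if not left or not right:
            continue  # a split must leave data on both sides
        ml = sum(left) / len(left)    # leaf prediction on the left
        mr = sum(right) / len(right)  # leaf prediction on the right
        sse = (sum((yi - ml) ** 2 for yi in left)
               + sum((yi - mr) ** 2 for yi in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    return best  # (sse, threshold, mean_left, mean_right)

# Hypothetical training pairs: a spectral index vs. percent impervious.
x = [0.1, 0.2, 0.3, 0.6, 0.7, 0.8]
y = [5, 8, 10, 70, 75, 80]
sse, t, lo, hi = best_stump(x, y)
```

A full regression tree recurses this split selection on each leaf; the paper's model applies the same principle to produce continuous 0-100% imperviousness estimates per 30-m pixel.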

  10. Higgs in bosonic channels (CMS)

    NASA Astrophysics Data System (ADS)

    Gori, Valentina

    2015-05-01

The main Higgs boson decays into bosonic channels will be considered, presenting and discussing results from the latest reprocessing of data collected by the CMS experiment at the LHC, using the full dataset recorded at centre-of-mass energies of 7 and 8 TeV. For this purpose, results from the final Run-I papers for the H → ZZ → 4ℓ, H → γγ and H → WW analyses are presented, focusing on the Higgs boson properties, such as the mass, the signal strength, the couplings to fermions and vector bosons, and the spin and parity properties. Furthermore, the Higgs boson width measurement exploiting the on-shell versus the off-shell cross section (in the H → ZZ → 4ℓ and H → ZZ → 2ℓ2ν decay channels) will be shown. All the investigated properties are found to be fully consistent with the SM predictions: the signal strength and the signal strength modifiers are consistent with unity in all the bosonic channels considered; the hypothesis of a scalar particle is strongly favored over the pseudoscalar, vector/pseudovector, and spin-2 boson hypotheses (all excluded at 99% CL or higher in the H → ZZ → 4ℓ channel). The Higgs boson mass measurement from the combination of the H → ZZ → 4ℓ and H → γγ channels gives a value mH = 125.03 +0.26/−0.27 (stat.) +0.13/−0.15 (syst.) GeV. An upper limit ΓH < 22 MeV can be placed on the Higgs boson width thanks to the new indirect method.

  11. Agricultural Land Use mapping by multi-sensor approach for hydrological water quality monitoring

    NASA Astrophysics Data System (ADS)

    Brodsky, Lukas; Kodesova, Radka; Kodes, Vit

    2010-05-01

The main objective of this study is to demonstrate the potential for operational use of high and medium resolution remote sensing data in hydrological water quality monitoring, by mapping agricultural intensity and crop structures; in particular, the use of remote sensing mapping for the optimization of pesticide monitoring. The agricultural mapping task is tackled by means of medium spatial and high temporal resolution ESA Envisat MERIS FR images together with a single high spatial resolution IRS AWiFS image covering the whole area of interest (the Czech Republic). High resolution data (e.g. SPOT, ALOS, Landsat) are often used for agricultural land use classification, but usually only at regional or local level due to data availability and financial constraints. AWiFS data (nominal spatial resolution 56 m), thanks to the wide satellite swath, seem more suitable for use at national level. Nevertheless, one of the critical issues for such a classification is to have sufficient image acquisitions over the whole vegetation period to describe crop development in an appropriate way. ESA MERIS medium-resolution data were used in several studies for crop classification. The high temporal and also spectral resolution of MERIS data has an indisputable advantage for crop classification. However, the spatial resolution of 300 m results in a mixed signal within a single pixel. AWiFS-MERIS data synergy brings new perspectives to agricultural Land Use mapping. Also, the developed methodology is fully compatible with future use of ESA (GMES) Sentinel satellite images. The applied methodology, a hybrid multi-sensor approach, consists of these main stages: a/ parcel segmentation and spectral pre-classification of the high resolution image (AWiFS); b/ ingestion of medium resolution (MERIS) vegetation spectro-temporal features; c/ vegetation signature unmixing; and d/ semantic object-oriented classification of vegetation classes into the final classification scheme. These crop groups were selected to be
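Stage c/, signature unmixing, addresses the mixed 300-m MERIS pixels. A minimal sketch of linear unmixing with two endmembers (closed-form least squares; the signatures below are invented, and the study's actual unmixing may differ):

```python
def unmix_two(pixel, em_a, em_b):
    """Linear unmixing of one mixed pixel into two endmember fractions.
    Solves least squares for f in pixel ≈ f*em_a + (1-f)*em_b over all
    bands, clipping the result to the physical range [0, 1]."""
    num = sum((p - b) * (a - b) for p, a, b in zip(pixel, em_a, em_b))
    den = sum((a - b) ** 2 for a, b in zip(em_a, em_b))
    f = num / den
    return max(0.0, min(1.0, f))

# Hypothetical reflectance signatures (e.g. crop vs. bare soil) in 3 bands.
crop = [0.05, 0.40, 0.30]
soil = [0.20, 0.25, 0.35]
mixed = [0.125, 0.325, 0.325]  # exactly a 50/50 mixture for this toy pixel
f_crop = unmix_two(mixed, crop, soil)
```

With more endmembers this becomes a constrained least-squares problem per pixel, with the AWiFS pre-classification supplying candidate endmember classes.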

  12. The Effect of Concept Mapping-Guided Discovery Integrated Teaching Approach on Chemistry Students' Achievement and Retention

    ERIC Educational Resources Information Center

    Fatokun, K. V. F.; Eniayeju, P. A.

    2014-01-01

    This study investigates the effects of Concept Mapping-Guided Discovery Integrated Teaching Approach on the achievement and retention of chemistry students. The sample comprised 162 Senior Secondary two (SS 2) students drawn from two Science Schools in Nasarawa State, Central Nigeria with equivalent mean scores of 9.68 and 9.49 in their pre-test.…

  13. Stimulating Graphical Summarization in Late Elementary Education: The Relationship between Two Instructional Mind-Map Approaches and Student Characteristics

    ERIC Educational Resources Information Center

    Merchie, Emmelien; Van Keer, Hilde

    2016-01-01

    This study examined the effectiveness of two instructional mind-mapping approaches to stimulate fifth and sixth graders' graphical summarization skills. Thirty-five fifth- and sixth-grade teachers and 644 students from 17 different elementary schools participated. A randomized quasi-experimental repeated-measures design was set up with two…

  14. Mapping Trends in Pedagogical Approaches and Learning Technologies: Perspectives from the Canadian, International, and Military Education Contexts

    ERIC Educational Resources Information Center

    Scoppio, Grazia; Covell, Leigha

    2016-01-01

    Increased technological advances, coupled with new learners' needs, have created new realities for higher education contexts. This study explored and mapped trends in pedagogical approaches and learning technologies in postsecondary education and identified how these innovations are affecting teaching and learning practices in higher education…

  15. Quartic gauge boson couplings

    NASA Astrophysics Data System (ADS)

    He, Hong-Jian

    1998-08-01

We review the recent progress in studying the anomalous electroweak quartic gauge boson couplings (QGBCs) at the LHC and the next generation of high energy e+e- linear colliders (LCs). The main focus is on the strong electroweak symmetry breaking scenario, in which non-decoupling guarantees sizable new physics effects in the QGBCs. After commenting upon the current low energy indirect bounds and summarizing the theoretical patterns of QGBCs predicted by typical resonance/non-resonance models, we review our systematic model-independent analysis of bounding them via WW-fusion and WWZ/ZZZ-production. The interplay of the two production mechanisms and the important role of beam polarization at the LCs are emphasized. The same physics may be similarly, and better, studied at a high-luminosity multi-TeV muon collider.

  16. Dark light Higgs bosons.

    SciTech Connect

    Draper, P.; Liu, T.; Wagner, C. E. M.; Wang, L.-T.; Zhang, H.

    2011-03-24

We study a limit of the nearly Peccei-Quinn-symmetric next-to-minimal supersymmetric standard model possessing novel Higgs and dark matter (DM) properties. In this scenario, there naturally coexist three light singletlike particles: a scalar, a pseudoscalar, and a singlinolike DM candidate, all with masses of order 0.1-10 GeV. The decay of a standard model-like Higgs boson to pairs of the light scalars or pseudoscalars is generically suppressed, avoiding constraints from collider searches for these channels. For a certain parameter window, annihilation into the light pseudoscalar and exchange of the light scalar with nucleons allow the singlino to achieve the correct relic density and, simultaneously, a large direct-detection cross section consistent with the preferred regions of the DM direct-detection experiments CoGeNT and DAMA/LIBRA. This parameter space is consistent with experimental constraints from LEP, the Tevatron, Υ, and flavor physics.

  17. Mapping Agricultural Fields in Sub-Saharan Africa with a Computer Vision Approach

    NASA Astrophysics Data System (ADS)

    Debats, S. R.; Luo, D.; Estes, L. D.; Fuchs, T.; Caylor, K. K.

    2014-12-01

    Sub-Saharan Africa is an important focus for food security research, because it is experiencing unprecedented population growth, agricultural activities are largely dominated by smallholder production, and the region is already home to 25% of the world's undernourished. One of the greatest challenges to monitoring and improving food security in this region is obtaining an accurate accounting of the spatial distribution of agriculture. Households are the primary units of agricultural production in smallholder communities and typically rely on small fields of less than 2 hectares. Field sizes are directly related to household crop productivity, management choices, and adoption of new technologies. As population and agriculture expand, it becomes increasingly important to understand both the distribution of field sizes as well as how agricultural communities are spatially embedded in the landscape. In addition, household surveys, a common tool for tracking agricultural productivity in Sub-Saharan Africa, would greatly benefit from spatially explicit accounting of fields. Current gridded land cover data sets do not provide information on individual agricultural fields or the distribution of field sizes. Therefore, we employ cutting edge approaches from the field of computer vision to map fields across Sub-Saharan Africa, including semantic segmentation, discriminative classifiers, and automatic feature selection. Our approach aims to not only improve the binary classification accuracy of cropland, but also to isolate distinct fields, thereby capturing crucial information on size and geometry. Our research focuses on the development of descriptive features across scales to increase the accuracy and geographic range of our computer vision algorithm. Relevant data sets include high-resolution remote sensing imagery and Landsat (30-m) multi-spectral imagery. Training data for field boundaries is derived from hand-digitized data sets as well as crowdsourcing.

  18. Factors Influencing Seasonal Influenza Vaccination Uptake in Emergency Medical Services Workers: A Concept Mapping Approach.

    PubMed

    Subramaniam, Dipti P; Baker, Elizabeth A; Zelicoff, Alan P; Elliott, Michael B

    2016-08-01

Seasonal influenza has serious impacts on morbidity and mortality and takes a significant economic toll through lost workforce time and strain on the health system. Health workers, particularly emergency medical services (EMS) workers, have the potential to transmit influenza to those in their care, yet little is known of the factors that influence EMS workers' decisions regarding seasonal influenza vaccination (SIV) uptake, a key factor in reducing the potential for transmitting disease. This study utilizes a modified Theory of Planned Behavior (TPB) model as a guiding framework to explore the factors that influence SIV uptake in EMS workers. Concept mapping, a six-stage method (preparation, generation, structuring, representation, interpretation, and utilization) combining quantitative and qualitative approaches, was used to identify participants' perspectives towards SIV. This study identified nine EMS-conceptualized factors that influence EMS workers' vaccination intent and behavior. The EMS-conceptualized factors align with the modified TPB model and suggest the need to consider community-wide approaches that were not initially conceptualized in the model. Additionally, the expansion of non-pharmaceutical measures went above and beyond the original conceptualization. Overall, this study demonstrates the need to develop customized interventions, such as messages highlighting the importance of EMS workers receiving the SIV as the optimum solution. EMS workers who do not intend to receive the SIV should be provided with accurate information on the SIV to dispel misconceptions. Finally, EMS workers should also receive interventions which promote voluntary vaccination, encouraging them to be proactive in the health decisions they make for themselves. PMID:26721630

  19. Modeling and mapping potential distribution of Crimean juniper (Juniperus excelsa Bieb.) using correlative approaches.

    PubMed

    Özkan, Kürşad; Şentürk, Özdemir; Mert, Ahmet; Negiz, Mehmet Güvenç

    2015-01-01

Modeling and mapping potential distribution of living organisms has become an important component of conservation planning and ecosystem management in recent years. Various correlative and mechanistic methods can be applied to build predictive distributions of living organisms in terrestrial and marine ecosystems. Correlative methods used to predict species' potential distribution have been described as either group discrimination techniques or profile techniques. We attempted to determine whether group discrimination techniques could perform as well as profile techniques for predicting species potential distributions, using elevation (ELVN), parent material (ROCK), slope (SLOP), radiation index (RI) and topographic position index (TPI) as explanatory variables. We compared potential distribution predictions made for Crimean juniper (Juniperus excelsa Bieb.) in the Yukan Gokdere forest district of the Mediterranean region, Turkey, applying four group discrimination techniques (discriminant analysis (DA), logistic regression analysis (LR), generalized additive model (GAM) and classification tree technique (CT)) and two profile techniques (a maximum entropy approach to species distribution modeling (MAXENT), the genetic algorithm for rule-set prediction (GARP)). Visual assessments of the potential distribution probability of the applied models for Crimean juniper were performed by using geographical information systems (GIS). Receiver-operating characteristic (ROC) curves were used to objectively assess model performance. The results suggested that group discrimination techniques are better than profile techniques and, among the group discrimination techniques, GAM indicated the best performance. PMID:26591876
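The ROC-based comparison reduces to computing an area under the curve (AUC) per model. A minimal sketch via the rank-sum identity, with invented suitability scores (the study's actual evaluation data are not reproduced):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the probability that a random presence site scores higher than a
    random absence site, counting ties as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical model outputs: suitability at presence vs. absence sites.
presence = [0.9, 0.8, 0.7, 0.4]
absence = [0.6, 0.3, 0.2, 0.1]
score = auc(presence, absence)
```

An AUC of 0.5 means no discrimination and 1.0 means perfect ranking, which is the scale on which GAM outperformed the other five techniques.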

  1. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M.

    2006-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.
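The heart of DAMAS is solving that linear system for source strengths under a physical nonnegativity constraint. A sketch of the kind of constrained Gauss-Seidel iteration involved (a toy 2×2 system with invented cross-talk values, not the published algorithm verbatim):

```python
def damas_like_solve(A, y, iterations=200):
    """Iteratively solve A x = y for nonnegative source strengths x:
    sweep the equations Gauss-Seidel style, clipping each update at zero."""
    n = len(y)
    x = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            # Residual of equation i with the current other components.
            r = y[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = max(0.0, r / A[i][i])  # nonnegative source strength
    return x

# Toy "beamform response" matrix: each row couples a survey point to its
# neighbor, mimicking the reciprocal influence the abstract describes.
A = [[1.0, 0.3],
     [0.3, 1.0]]
true_x = [2.0, 1.0]  # assumed source strengths
y = [A[0][0] * true_x[0] + A[0][1] * true_x[1],
     A[1][0] * true_x[0] + A[1][1] * true_x[1]]
x = damas_like_solve(A, y)
```

In the real method the matrix encodes the array's point-spread response over the whole survey grid, so the system is much larger but structurally the same.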

  2. High Detailed Debris Flows Hazard Maps by a Cellular Automata Approach

    NASA Astrophysics Data System (ADS)

    Lupiano, V.; Lucà, F.; Robustelli, G.; Rongo, R.; D'Ambrosio, D.; Spataro, W.; Avolio, M. V.

    2012-04-01

The identification of areas that are most likely to be affected by new debris flows, in regions particularly exposed to such phenomena, is of fundamental relevance for mitigating possible consequences, both in terms of loss of human lives and of material properties. Here we show the adaptation of a recent methodology, already successfully applied to lava flows, for defining flexible, high-detail and reliable hazard maps. The methodology relies on an adequate knowledge of the study area, assessed by an accurate analysis of its past behavior, together with a reliable numerical model for simulating debris flows on present topographic data (the Cellular Automata model SCIDDICA, in the present case). Furthermore, High Performance Parallel Computing is employed to increase computational efficiency, due to the great number of simulations of hypothetical events required for characterizing the susceptibility to flow invasion of the study area. The application of the presented methodology to the case of Gragnano (Italy) demonstrated the validity of the proposed approach, suggesting its appropriateness for land use planning and Civil Defense applications.

  3. Fractionation profiling: a fast and versatile approach for mapping vesicle proteomes and protein–protein interactions

    PubMed Central

    Borner, Georg H. H.; Hein, Marco Y.; Hirst, Jennifer; Edgar, James R.; Mann, Matthias; Robinson, Margaret S.

    2014-01-01

    We developed “fractionation profiling,” a method for rapid proteomic analysis of membrane vesicles and protein particles. The approach combines quantitative proteomics with subcellular fractionation to generate signature protein abundance distribution profiles. Functionally associated groups of proteins are revealed through cluster analysis. To validate the method, we first profiled >3500 proteins from HeLa cells and identified known clathrin-coated vesicle proteins with >90% accuracy. We then profiled >2400 proteins from Drosophila S2 cells, and we report the first comprehensive insect clathrin-coated vesicle proteome. Of importance, the cluster analysis extends to all profiled proteins and thus identifies a diverse range of known and novel cytosolic and membrane-associated protein complexes. We show that it also allows the detailed compositional characterization of complexes, including the delineation of subcomplexes and subunit stoichiometry. Our predictions are presented in an interactive database. Fractionation profiling is a universal method for defining the clathrin-coated vesicle proteome and may be adapted for the analysis of other types of vesicles and particles. In addition, it provides a versatile tool for the rapid generation of large-scale protein interaction maps. PMID:25165137
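The clustering in fractionation profiling rests on comparing abundance profiles across fractions. A minimal sketch of that similarity step using Pearson correlation (protein names and profile values are invented; the paper's full pipeline involves quantitative proteomics and multivariate clustering):

```python
def pearson(a, b):
    """Correlation between two protein abundance profiles across fractions."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Invented profiles over three fractions: two co-fractionating coat
# proteins and one cytosolic contaminant with a different distribution.
coat_a = [0.10, 0.70, 0.20]
coat_b = [0.15, 0.65, 0.20]
contaminant = [0.60, 0.20, 0.20]
r_ab = pearson(coat_a, coat_b)       # high: these cluster together
r_ac = pearson(coat_a, contaminant)  # low/negative: separate cluster
```

Proteins whose profiles correlate strongly across fractions are candidates for membership in the same vesicle type or complex, which is how the method recovers known coat proteins and predicts new associations.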

  4. Multimodality approach to optical early detection and mapping of oral neoplasia

    NASA Astrophysics Data System (ADS)

    Ahn, Yeh-Chan; Chung, Jungrae; Wilder-Smith, Petra; Chen, Zhongping

    2011-07-01

Early detection of cancer remains the best way to ensure patient survival and quality of life. Squamous cell carcinoma is usually preceded by dysplasia presenting as white, red, or mixed red and white epithelial lesions on the oral mucosa (leukoplakia, erythroplakia). Dysplastic lesions in the form of erythroplakia can carry a risk of malignant conversion of 90%. A noninvasive diagnostic modality would enable monitoring of these lesions at regular intervals and detection of treatment needs at a very early, relatively harmless stage. The specific aim of this work was to test a multimodality approach [three-dimensional optical coherence tomography (OCT) and polarimetry] to noninvasive diagnosis of oral premalignancy and malignancy using the hamster cheek pouch model (nine hamsters). The results were compared to tissue histopathology. During carcinogenesis, epithelial downgrowth, eventual loss of basement membrane integrity, and subepithelial invasion were clearly visible with OCT. Polarimetry techniques identified a four to five times increased retardance in sites with squamous cell carcinoma, and a two to three times greater retardance in dysplastic sites than in normal tissues. These techniques were particularly useful for mapping areas of field cancerization with multiple lesions, as well as lesion margins.

5. Multimodality approach to optical early detection and mapping of oral neoplasia

    PubMed Central

    Ahn, Yeh-Chan; Chung, Jungrae; Wilder-Smith, Petra; Chen, Zhongping

    2011-01-01

Early detection of cancer remains the best way to ensure patient survival and quality of life. Squamous cell carcinoma is usually preceded by dysplasia presenting as white, red, or mixed red and white epithelial lesions on the oral mucosa (leukoplakia, erythroplakia). Dysplastic lesions in the form of erythroplakia can carry a risk of malignant conversion of 90%. A noninvasive diagnostic modality would enable monitoring of these lesions at regular intervals and detection of treatment needs at a very early, relatively harmless stage. The specific aim of this work was to test a multimodality approach [three-dimensional optical coherence tomography (OCT) and polarimetry] to noninvasive diagnosis of oral premalignancy and malignancy using the hamster cheek pouch model (nine hamsters). The results were compared to tissue histopathology. During carcinogenesis, epithelial downgrowth, eventual loss of basement membrane integrity, and subepithelial invasion were clearly visible with OCT. Polarimetry techniques identified a four to five times increased retardance in sites with squamous cell carcinoma, and a two to three times greater retardance in dysplastic sites than in normal tissues. These techniques were particularly useful for mapping areas of field cancerization with multiple lesions, as well as lesion margins. PMID:21806268

  6. A Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) Determined from Phased Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Brooks, Thomas F.; Humphreys, William M., Jr.

    2004-01-01

    Current processing of acoustic array data is burdened with considerable uncertainty. This study reports an original methodology that serves to demystify array results, reduce misinterpretation, and accurately quantify position and strength of acoustic sources. Traditional array results represent noise sources that are convolved with array beamform response functions, which depend on array geometry, size (with respect to source position and distributions), and frequency. The Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) method removes beamforming characteristics from output presentations. A unique linear system of equations accounts for reciprocal influence at different locations over the array survey region. It makes no assumption beyond the traditional processing assumption of statistically independent noise sources. The full rank equations are solved with a new robust iterative method. DAMAS is quantitatively validated using archival data from a variety of prior high-lift airframe component noise studies, including flap edge/cove, trailing edge, leading edge, slat, and calibration sources. Presentations are explicit and straightforward, as the noise radiated from a region of interest is determined by simply summing the mean-squared values over that region. DAMAS can fully replace existing array processing and presentations methodology in most applications. It appears to dramatically increase the value of arrays to the field of experimental acoustics.

  7. A hybrid model for mapping simplified seismic response via a GIS-metamodel approach

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Bonito, L.; Revellino, P.; Guerriero, L.; Guadagno, F. M.

    2014-07-01

    In earthquake-prone areas, site seismic response due to lithostratigraphic sequence plays a key role in seismic hazard assessment. A hybrid model, consisting of GIS and metamodel (model of model) procedures, was introduced aimed at estimating the 1-D spatial seismic site response in accordance with spatial variability of sediment parameters. Inputs and outputs are provided and processed by means of an appropriate GIS model, named GIS Cubic Model (GCM). This consists of a block-layered parametric structure aimed at resolving a predicted metamodel by means of pixel to pixel vertical computing. The metamodel, opportunely calibrated, is able to emulate the classic shape of the spectral acceleration response in relation to the main physical parameters that characterize the spectrum itself. Therefore, via the GCM structure and the metamodel, the hybrid model provides maps of normalized acceleration response spectra. The hybrid model was applied and tested on the built-up area of the San Giorgio del Sannio village, located in a high-risk seismic zone of southern Italy. Efficiency tests showed a good correspondence between the spectral values resulting from the proposed approach and the 1-D physical computational models. Supported by lithology and geophysical data and corresponding accurate interpretation regarding modelling, the hybrid model can be an efficient tool in assessing urban planning seismic hazard/risk.

  8. Visualization pipeline for GIS-based planetary mapping - Cartographic approaches for geological and geomorphological interpretation results

    NASA Astrophysics Data System (ADS)

    Nass, A.; van Gasselt, S.; Roatsch, T.; Hauber, E.; Jaumann, R.

    2012-09-01

    A number of international research institutes and groups predominantly incorporate remote-sensing data for geological and geomorphological interpretations of planetary surfaces. By employing state-of-the-art Geographic Information Systems (GISs) and technologies with tools for data processing, management and visualization, the results are represented in thematic maps with different topical foci. In order to combine different digital maps and to manage different interpretation results in one single database, a streamlined and homogenized method of GIS-based mapping is required.

  9. Quantitative urban climate mapping based on a geographical database: A simulation approach using Hong Kong as a case study

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Ng, Edward

    2011-08-01

    The urban environment has been dramatically changed by artificial constructions. How the modified urban geometry affects the urban climate and therefore human thermal comfort has become a primary concern for urban planners. The present study takes a simulation approach to analyze the influence of urban geometry on the urban climate and maps this climatic understanding from a quantitative perspective. A geographical building database is used to characterize two widely discussed aspects: urban heat island effect (UHI) and wind dynamics. The parameters of the sky view factor (SVF) and the frontal area density (FAD) are simulated using ArcGIS-embedded computer programs to link urban geometry with the UHI and wind dynamic conditions. The simulated results are synergized and classified to evaluate different urban climatic conditions based on thermal comfort consideration. A climatic map is then generated implementing the classification. The climatic map shows reasonable agreement with thermal comfort understanding, as indicated by the biometeorological index of the physiological equivalent temperature (PET) obtained in an earlier study. The proposed climate mapping approach can provide both quantitative and visual evaluation of the urban environment for urban planners with climatic concerns. The map could be used as a decision support tool in planning and policy-making processes. An urban area in Hong Kong is used as a case study.
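One of the two geometry parameters, the frontal area density (FAD), is in essence the wind-facing building facade area per unit plan area of an analysis cell. A toy computation for a single cell (the building dimensions and cell size are invented; the paper's ArcGIS-embedded implementation also accounts for wind direction and grid aggregation):

```python
# Frontal area density (lambda_f): total building frontal area facing the
# wind, divided by the plan area of the analysis cell. Values are invented.
buildings = [
    # (width_m, height_m) of each facade facing the assumed wind direction
    (20.0, 50.0),
    (30.0, 25.0),
    (15.0, 80.0),
]
cell_area = 100.0 * 100.0   # 1-hectare analysis cell (m^2)

frontal_area = sum(w * h for w, h in buildings)
lambda_f = frontal_area / cell_area
print(round(lambda_f, 3))
```

Higher lambda_f indicates stronger obstruction of the approaching wind, which in the paper's classification contributes to poorer ventilation and thus worse thermal comfort.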

  10. A Two-Layers Based Approach of an Enhanced-Map for Urban Positioning Support

    PubMed Central

    Piñana-Díaz, Carolina; Toledo-Moreo, Rafael; Toledo-Moreo, F. Javier; Skarmeta, Antonio

    2012-01-01

    This paper presents a two-layer based enhanced map that can support navigation in urban environments. One layer is dedicated to describe the drivable road with a special focus on the accurate description of its bounds. This feature can support positioning and advanced map-matching when compared with standard polyline-based maps. The other layer depicts building heights and locations, thus enabling the detection of non-line-of-sight signals coming from GPS satellites not in direct view. Both the concept and the methodology for creating these enhanced maps are shown in the paper. PMID:23202172

  11. Mapping the World - a New Approach for Volunteered Geographic Information in the Cloud

    NASA Astrophysics Data System (ADS)

    Moeller, M. S.; Furhmann, S.

    2015-05-01

    The OSM project provides a geodata basis for the entire world under the CC-SA licence agreement. However, some parts of the world are mapped much more densely than others, and many less developed countries lack valid geo-information. Africa, for example, is a sparsely mapped continent. During the large Ebola outbreak in 2014, this lack of data became apparent. Aid organizations such as the American Red Cross and the Humanitarian OpenStreetMap Team organized mapping campaigns to fill the gaps with valid OSM geodata. This paper gives a short introduction to this mapping activity.

  12. A hybrid wetland map for China: a synergistic approach using census and spatially explicit datasets.

    PubMed

    Ma, Kun; You, Liangzhi; Liu, Junguo; Zhang, Mingxiang

    2012-01-01

    Wetlands play important ecological, economic, and cultural roles in societies around the world. However, wetland degradation has become a serious ecological issue, raising global sustainability concerns. An accurate wetland map is essential for wetland management. Here we used a fuzzy method to create a hybrid wetland map for China through the combination of five existing wetland datasets, including four spatially explicit wetland distribution datasets and one wetland census. Our results show the total wetland area is 384,864 km², 4.08% of China's national surface area. The hybrid wetland map also shows the spatial distribution of wetlands at a spatial resolution of 1 km. The reliability of the map is demonstrated by comparing it with spatially explicit datasets on lakes and reservoirs. The hybrid wetland map is the first wetland map that is consistent with the statistical data at the national and provincial levels in China. It provides a benchmark map for research on wetland protection and management. The method presented here is applicable not only for wetland mapping but also for other thematic mapping in China and beyond.
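The fuzzy combination step can be pictured as a weighted agreement among the input layers, followed by keeping the highest-membership cells until the census area is matched. A toy sketch on a synthetic grid (the layers, weights and census total are all invented placeholders, not the paper's datasets or calibration):

```python
import numpy as np

# Four hypothetical binary wetland layers on a shared 10 x 10 grid
rng = np.random.default_rng(1)
layers = rng.random((4, 10, 10)) > 0.6
weights = np.array([0.3, 0.3, 0.2, 0.2])   # assumed per-dataset reliabilities

# Fuzzy membership: weighted agreement among the datasets, in [0, 1]
membership = np.tensordot(weights, layers.astype(float), axes=1)

# Keep the highest-membership cells until the area matches a census total
census_cells = 25                          # stand-in for the census wetland area
order = np.argsort(membership, axis=None)[::-1]
hybrid = np.zeros(membership.size, dtype=bool)
hybrid[order[:census_cells]] = True
hybrid = hybrid.reshape(membership.shape)
print(int(hybrid.sum()))  # equals the census area, by construction
```

Tying the selection threshold to the census total is what makes the resulting map consistent with the statistical data, the property the abstract emphasizes.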

  13. A Hybrid Wetland Map for China: A Synergistic Approach Using Census and Spatially Explicit Datasets

    PubMed Central

    Ma, Kun; You, Liangzhi; Liu, Junguo; Zhang, Mingxiang

    2012-01-01

    Wetlands play important ecological, economic, and cultural roles in societies around the world. However, wetland degradation has become a serious ecological issue, raising global sustainability concerns. An accurate wetland map is essential for wetland management. Here we used a fuzzy method to create a hybrid wetland map for China through the combination of five existing wetland datasets, including four spatially explicit wetland distribution datasets and one wetland census. Our results show the total wetland area is 384,864 km², 4.08% of China’s national surface area. The hybrid wetland map also shows the spatial distribution of wetlands at a spatial resolution of 1 km. The reliability of the map is demonstrated by comparing it with spatially explicit datasets on lakes and reservoirs. The hybrid wetland map is the first wetland map that is consistent with the statistical data at the national and provincial levels in China. It provides a benchmark map for research on wetland protection and management. The method presented here is applicable not only for wetland mapping but also for other thematic mapping in China and beyond. PMID:23110105

  14. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach.

    PubMed

    Nardi, Pierfrancesco; Di Matteo, Giovanni; Palahi, Marc; Scarascia Mugnozza, Giuseppe

    2016-01-01

    This study aims at conducting the first science mapping analysis of Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from Elsevier's Scopus database over the period 1980-2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980-1989 and reached 3% (N = 69) in the time slice 1990-1994. Since 1995, the number of publications has increased exponentially, reaching 55% (N = 1,476) during the period 2010-2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications, followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications on Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research on Mediterranean forests. During the period 1980-1994, the research topics were poorly characterized, but they became better defined during the time slice 1995-1999. Since the 2000s, the clusters have become well defined by research topics. The current status of Mediterranean forest research (2009-2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land-use and degradation, climate change effects on ecophysiological responses, and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted through ecophysiological research. Applied research was mainly represented by the land-use and degradation, biodiversity and conservation, and fire research topics. The citation analyses revealed highly

  15. Unravelling the impact of inheritance within the Wilson Cycle: a combined mapping and numerical modelling approach

    NASA Astrophysics Data System (ADS)

    Chenin, Pauline; Manatschal, Gianreto; Lavier, Luc

    2015-04-01

    Our study aims to unravel how structural, lithological and thermal heterogeneities may influence both orogenic and rift systems within the Wilson Cycle. To do this, we map first-order rift structural domains, the timing of the main rift events, and major heterogeneities and structures inherited from previous orogenies. In addition, we design numerical modelling experiments to investigate the relationships highlighted by the comparison of these maps. We apply this approach to the North Atlantic region, which underwent two major orogenic phases during the Palaeozoic: (1) the Caledonian orogeny - now extending from the United Kingdom to northern Norway and Eastern Greenland - resulted from the Late Ordovician closure of the large Iapetus Ocean (> 2 000 km) and the smaller Tornquist Seaway. It was followed by purely mechanical extensional orogenic collapse; (2) the Variscides of Southwestern Europe were essentially built from the Devono-Carboniferous suturing of several small oceanic basins (< 200 km) in addition to the large Rheic Ocean. The subsequent orogenic collapse was accompanied by significant magmatic activity, which resulted in mafic underplating and associated mantle depletion over the whole orogenic area. Our study is twofold: on the one hand, we investigate how the size and maturity of the intervening oceanic basins affect subduction and orogeny, considering two end-members: (a) immature oceanic basins, defined as hyperextended rift systems that never achieved steady-state seafloor spreading; and (b) mature oceans, characterized by a self-sustained magmatic system forming homogeneous oceanic crust. On the other hand, we study how post-orogenic collapse-related underplating and associated mantle depletion may impact subsequent rifting depending on the thermal state (e.g. the duration of the relaxation time between the magmatic episode and the onset of rifting).
Our results highlight a very different behaviour of the North Atlantic rift with respect to the Caledonian and

  16. Assessment of Social Vulnerability Identification at Local Level around Merapi Volcano - A Self Organizing Map Approach

    NASA Astrophysics Data System (ADS)

    Lee, S.; Maharani, Y. N.; Ki, S. J.

    2015-12-01

    The application of the Self-Organizing Map (SOM) to analyze social vulnerability and recognize the resilience within sites is a challenging task. The aim of this study is to propose a computational method to identify sites according to their similarity and to determine the most relevant variables characterizing social vulnerability in each cluster. For this purpose, the SOM is considered an effective platform for the analysis of high-dimensional data. By considering the cluster structure, the social vulnerability characteristics of the identified sites can be fully understood. In this study, social vulnerability is constructed from 17 variables, i.e. 12 independent variables representing socio-economic concepts and 5 dependent variables representing the damage and losses due to the Merapi eruption in 2010. These variables collectively represent the local situation of the study area, based on fieldwork conducted in September 2013. By using both independent and dependent variables, we can identify whether social vulnerability is reflected in the actual situation, in this case the 2010 Merapi eruption. However, social vulnerability analysis in local communities involves a number of variables representing their socio-economic condition, and some of the variables employed in this study might be more or less redundant. Therefore, the SOM is used to reduce the redundant variable(s) by selecting representative variables using the component planes and the correlation coefficients between variables in order to find an effective sample size. The selected dataset was then clustered according to its similarities. Finally, this approach can produce reliable estimates of clustering, recognize the most significant variables, and could be useful for social vulnerability assessment, especially for stakeholders as decision makers. This research was supported by a grant 'Development of Advanced Volcanic Disaster Response System considering
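A Self-Organizing Map projects high-dimensional site vectors onto a small 2-D grid of prototypes so that similar sites land on nearby nodes. A minimal NumPy sketch on synthetic data (the grid size, learning schedule and two-cluster data are illustrative choices, not the study's configuration or its 17 variables):

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map: fit a small 2-D grid of prototype
    vectors to the data. An illustrative sketch, not the study's setup."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h * w, data.shape[1]))
    # Grid coordinates of each node, used by the neighbourhood function
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighbourhood
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            influence = np.exp(-d2 / (2 * sigma ** 2))
            weights += lr * influence[:, None] * (x - weights)
    return weights

# Two well-separated synthetic clusters of "site" vectors (5 variables each)
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.2, 0.05, (20, 5)),
                  rng.normal(0.8, 0.05, (20, 5))])
som = train_som(data)

# Quantization error: mean distance from each site to its best-matching unit
qe = float(np.mean([np.min(np.linalg.norm(som - x, axis=1)) for x in data]))
print(round(qe, 3))
```

After training, inspecting which nodes each site maps to reveals the cluster structure, and per-variable "component planes" (one grid slice of `som` per input variable) support the redundancy analysis the abstract describes.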

  17. Structure and Evolution of Mediterranean Forest Research: A Science Mapping Approach

    PubMed Central

    Nardi, Pierfrancesco; Di Matteo, Giovanni; Palahi, Marc; Scarascia Mugnozza, Giuseppe

    2016-01-01

    This study aims at conducting the first science mapping analysis of Mediterranean forest research in order to elucidate its research structure and evolution. We applied a science mapping approach based on co-term and citation analyses to a set of scientific publications retrieved from Elsevier’s Scopus database over the period 1980–2014. The Scopus search retrieved 2,698 research papers and reviews published by 159 peer-reviewed journals. The total number of publications was around 1% (N = 17) during the period 1980–1989 and reached 3% (N = 69) in the time slice 1990–1994. Since 1995, the number of publications has increased exponentially, reaching 55% (N = 1,476) during the period 2010–2014. Within the thirty-four years considered, the retrieved publications were published by 88 countries. Among them, Spain was the most productive country, publishing 44% (N = 1,178) of total publications, followed by Italy (18%, N = 482) and France (12%, N = 336). These countries also host the ten most productive scientific institutions in terms of number of publications on Mediterranean forest subjects. Forest Ecology and Management and Annals of Forest Science were the most active journals in publishing research on Mediterranean forests. During the period 1980–1994, the research topics were poorly characterized, but they became better defined during the time slice 1995–1999. Since the 2000s, the clusters have become well defined by research topics. The current status of Mediterranean forest research (2009–2014) was represented by four clusters, in which different research topics such as biodiversity and conservation, land-use and degradation, climate change effects on ecophysiological responses, and soil were identified. Basic research in Mediterranean forest ecosystems is mainly conducted through ecophysiological research. Applied research was mainly represented by the land-use and degradation, biodiversity and conservation, and fire research topics. The citation analyses

  18. Mapping soil vulnerability to floods under varying land use and climate: A new approach

    NASA Astrophysics Data System (ADS)

    Alaoui, Abdallah; Spiess, Pascal; Beyeler, Marcel

    2016-04-01

    the hydrological connectivity between zones of various predisposition to excess surface runoff under different land uses. These promising results indicate that the approach is suited for mapping soil vulnerability to floods under varying land use and climate at any scale.

  19. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. 
Results strongly implied that the significant change of students' attitudes towards disaster preparedness occurred not by the lectures of scientific knowledge, but

  20. Applying the spatial mapping approach to 231Pa/230Th as an overturning proxy

    NASA Astrophysics Data System (ADS)

    Bradtmiller, L. I.; McManus, J. F.; Robinson, L. F.

    2008-12-01

    The use of the 231Pa/230Th ratio in deep-sea sediments has been developed and used over the last decade as a proxy for the rate of Atlantic meridional overturning circulation (AMOC). The proxy is based on the known ratio of 231Pa and 230Th production by uranium decay in the ocean, and on the different rates of removal to the sediment of the two isotopes. North Atlantic climate and AMOC are believed to be closely related, and so the 231Pa/230Th proxy has most often been applied to North Atlantic sediments over the past glacial cycle, particularly during periods of abrupt climate change such as the Heinrich 1 (H1) iceberg discharge event. Recent studies have used high-resolution downcore records to interpret AMOC circulation at a single location. Although powerful, this approach cannot always rule out local changes in sediment composition, particle rain rate or other factors influencing the 231Pa/230Th ratio, and therefore may not necessarily reflect the mean behavior of AMOC. Here we combine new and existing 231Pa/230Th data from the Atlantic basin to apply the spatial mapping approach to the 231Pa/230Th proxy. Instead of attempting to reconstruct AMOC at a single site, we use weighted averages of spatially distributed data from the last glacial maximum, H1 and the Holocene in an attempt to examine these three key time periods with respect to the average behavior of the AMOC. This approach greatly decreases the likelihood that the results are biased by variations in factors other than the AMOC, allowing us to examine 231Pa/230Th through time as well as in three-dimensional space. Compilation of existing data highlights key gaps in the spatial coverage and is complicated by the challenge of identifying H1 in all cores. Nevertheless we are able to determine broad spatial patterns and calculate 231Pa budgets where suitable data exists.
We show that the minimum net export of 231Pa from the North Atlantic by the AMOC occurred during relatively brief intervals such as H1
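The weighted-average step behind this spatial approach can be sketched in a few lines. The production ratio of 231Pa to 230Th from uranium decay in seawater is approximately 0.093; basin-mean sediment ratios below that value imply net export of 231Pa from the basin. The core values and weights below are invented placeholders, not data from the study:

```python
import numpy as np

# Production ratio of 231Pa to 230Th from uranium decay in seawater (~0.093)
PRODUCTION_RATIO = 0.093

# Hypothetical core-top values: (231Pa/230Th ratio, weight ~ area represented)
cores = np.array([
    [0.060, 0.4],
    [0.055, 0.3],
    [0.080, 0.2],
    [0.093, 0.1],
])
ratios, weights = cores[:, 0], cores[:, 1]
basin_mean = np.average(ratios, weights=weights)

# Ratios below the production value imply net export of 231Pa from the basin
print(round(basin_mean, 4), bool(basin_mean < PRODUCTION_RATIO))
```

Averaging over spatially distributed cores, rather than reading a single downcore record, is what dilutes the influence of local sedimentary effects on the inferred overturning signal.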

  1. Chiral Bosonization of Superconformal Ghosts

    NASA Technical Reports Server (NTRS)

    Shi, Deheng; Shen, Yang; Liu, Jinling; Xiong, Yongjian

    1996-01-01

    We explain the difference of the Hilbert space of the superconformal ghosts (beta,gamma) system from that of its bosonized fields phi and chi. We calculate the chiral correlation functions of phi, chi fields by inserting appropriate projectors.

  2. What is a Higgs Boson?

    SciTech Connect

    Lincoln, Don

    2011-07-07

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  3. What is a Higgs Boson?

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Fermilab scientist Don Lincoln describes the nature of the Higgs boson. Several large experimental groups are hot on the trail of this elusive subatomic particle which is thought to explain the origins of particle mass.

  4. Ontology Mapping Neural Network: An Approach to Learning and Inferring Correspondences among Ontologies

    ERIC Educational Resources Information Center

    Peng, Yefei

    2010-01-01

    An ontology mapping neural network (OMNN) is proposed in order to learn and infer correspondences among ontologies. It extends the Identical Elements Neural Network (IENN)'s ability to represent and map complex relationships. The learning dynamics of simultaneous (interlaced) training of similar tasks interact at the shared connections of the…

  5. A Constructivist Approach to Designing Computer Supported Concept-Mapping Environment

    ERIC Educational Resources Information Center

    Cheung, Li Siu

    2006-01-01

    In the past two decades, there has been a proliferation of research activities on exploring the use of concept maps to support teaching and learning of various knowledge disciplines which range from science to language subjects. MindNet, which is a collaborative concept mapping environment that supports both synchronous and asynchronous modes of…

  6. An Innovative Concept Map Approach for Improving Students' Learning Performance with an Instant Feedback Mechanism

    ERIC Educational Resources Information Center

    Wu, Po-Han; Hwang, Gwo-Jen; Milrad, Marcelo; Ke, Hui-Ru; Huang, Yueh-Min

    2012-01-01

    Concept maps have been widely employed for helping students organise their knowledge as well as evaluating their knowledge structures in a wide range of subject matters. Although researchers have recognised concept maps as being an important educational tool, past experiences have also revealed the difficulty of evaluating the correctness of a…

  7. Discovery Maps: A Student-Centered Approach To Reinforcing the Curriculum.

    ERIC Educational Resources Information Center

    Chase, Patricia A.; Franson, Kari L.; An, Ariane

    2001-01-01

    Describes a project which involved first- and second-year pharmacy students in making Discovery Maps focused on specific diseases. The maps were intended to help students remember and relate key pieces of information from all their courses. Feedback from students and faculty was very positive. (EV)

  8. Concept Mapping: An Approach for Evaluating a Public Alternative School Program

    ERIC Educational Resources Information Center

    Streeter, Calvin L.; Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2011-01-01

    This article describes how concept mapping techniques were applied to evaluate the development of a solution-focused, public alternative school program. Concept Systems software was used to create 15 cluster maps based on statements generated from students, teachers, and school staff. In addition, pattern matches were analyzed to examine the…

  9. Mapping and monitoring cropland burning in European Russia: a multi-sensor approach

    NASA Astrophysics Data System (ADS)

    Hall, J.; Loboda, T. V.; Mccarty, G.; McConnell, L.; Woldemariam, T.

    2013-12-01

    Short-lived aerosols and pollutants transported from high northern latitudes have amplified the short-term warming in the Arctic region. Specifically, black carbon (BC) is recognized as the second most important human emission with regard to climate forcing, behind carbon dioxide, with a total climate forcing of +1.1 W m-2. Early studies have suggested that cropland burning may be a major contributor to the BC emissions that are directly deposited above the Arctic Circle. However, accurate monitoring of cropland burning from existing active fire and burned area products is limited. Most existing algorithms are focused on mapping hotter and larger wildfire events. The timing of cropland burning differs from wildfire events, and their transient nature adds a further challenge to product development. In addition, an analysis of multi-year cloud cover over Russian croplands, using Moderate Resolution Imaging Spectroradiometer (MODIS) daily surface reflectance data, showed that on average early afternoon observations from MODIS/Aqua provided 68 clear views per growing period (defined as 1st March 2003 - 30th November 2012) with a range from 30 to 101 clear views, whereas MODIS/Terra provided 75 clear views per growing period (defined as 1st March 2001 - 30th November 2012) with a range from 37 to 113 clear views. Here we present a new approach to burned area mapping in croplands from satellite imagery. Our algorithm is designed to detect burned area only within croplands and is not required to perform well outside them. The algorithm focuses on tracking the natural intra-annual development curve specific to crops rather than natural vegetation, and works by identifying the subtle spectral nuances between varieties of cropland field categories.
Using a combination of the high visual accuracy from very high resolution (VHR, defined as spatial resolution < 5m) imagery and the temporal trend of MODIS data, we are able to differentiate between burned and plowed
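The trajectory idea rests on the NDVI, computed per pixel as (NIR − Red)/(NIR + Red), and on flagging abrupt departures from the smooth seasonal curve. A toy sketch with an invented single-pixel reflectance series (not MODIS data; the detection rule is a deliberately crude stand-in for the algorithm's spectral analysis):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

# Hypothetical 8-day reflectance series for one pixel over a growing season
nir = np.array([0.30, 0.35, 0.45, 0.55, 0.50, 0.20, 0.32])
red = np.array([0.10, 0.09, 0.07, 0.05, 0.06, 0.15, 0.09])
series = ndvi(nir, red)

# A sudden drop in an otherwise smooth trajectory can flag a burn/plough event
drops = np.diff(series)
event_index = int(np.argmin(drops)) + 1
print(event_index, round(float(series[event_index]), 3))
```

The abrupt fall-and-recovery shape is what separates transient cropland burns from the slower senescence of natural vegetation.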

  10. An integrated approach to flood hazard assessment on alluvial fans using numerical modeling, field mapping, and remote sensing

    USGS Publications Warehouse

    Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.

    2005-01-01

    Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans, where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method, and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on a typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In
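Manning's equation gives the mean flow velocity from roughness, hydraulic radius and slope: V = (1/n) R^(2/3) S^(1/2) in SI units. A minimal evaluation with representative (not the study's) parameter values:

```python
def manning_velocity(n, hydraulic_radius, slope):
    """Mean flow velocity (m/s) from Manning's equation, SI units."""
    return (1.0 / n) * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Wide, shallow sheet flow on an alluvial fan: R is close to the flow depth
v = manning_velocity(n=0.035, hydraulic_radius=0.5, slope=0.01)
q_unit = v * 0.5          # discharge per unit width (m^2/s)
print(round(v, 3), round(q_unit, 3))
```

The paper's raster model applies this relation in each DEM cell while enforcing continuity, which is what lets it route flow through the many small distributary channels.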

  11. Bedding control on landslides: a methodological approach for computer-aided mapping analysis

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Revellino, P.; Donnarumma, A.; Guadagno, F. M.

    2011-05-01

    Litho-structural control on the spatial and temporal evolution of landslides is one of the most typical aspects of slopes composed of structurally complex sequences. Focusing mainly on instabilities of the earth-flow type, a semi-quantitative analysis has been developed with the purpose of identifying and characterizing the litho-structural control exerted by bedding on slopes and its effects on landsliding. In quantitative terms, a technique for azimuth data interpolation, the Non-continuous Azimuth Distribution Methodological Approach (NADIA), is presented by means of a GIS software application. In addition, two indexes processed by NADIA have been determined: (i) Δ, aimed at defining the relationship between the orientation of geological bedding planes and slope aspect, and (ii) C, which recognizes localized slope sectors in which the stony component of structurally complex formations is abundant and therefore exerts an evolutive control on landslide masses. Furthermore, some Litho-Structural Models (LSMs) of slopes are proposed, aiming at characterizing recurrent forms of structural control in the source, channel and deposition areas of gravitational movements. In order to elaborate evolutive models controlling landslide scenarios, the LSMs were qualitatively related to and compared with the Δ and C quantitative indexes. The methodological procedure has been applied to a litho-structurally complex area of Southern Italy where azimuth measurements and landslide mapping were available. It was found that the proposed methodology enables the recognition of typical control conditions on landslides in relation to the LSMs. Different control patterns on landslide shape and on the style and distribution of activity resulted for each LSM. This makes possible a first-order identification of the spatial evolution of landslide bodies.
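The Δ index hinges on comparing two circular quantities: the bedding dip direction and the slope aspect. A small sketch of that angular comparison (the function name and the classification comment are illustrative, not NADIA's exact definition):

```python
def azimuth_delta(bedding_dip_direction, slope_aspect):
    """Smallest angular difference (0-180 degrees) between the bedding dip
    direction and the slope aspect; a simple stand-in for the Delta index."""
    d = abs(bedding_dip_direction - slope_aspect) % 360
    return min(d, 360 - d)

# Delta near 0 suggests bedding dipping with the slope (dip-slope setting);
# Delta near 180 suggests bedding dipping into the slope.
print(azimuth_delta(120, 130), azimuth_delta(10, 350), azimuth_delta(90, 270))
```

Taking the modular difference is essential: a naive subtraction would report 340 degrees between aspects of 10 and 350, where the true angular separation is 20.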

  12. A Geospatial Approach to Mapping Bioenergy Potential of Perennial Crops in North American Tallgrass Prairie

    NASA Astrophysics Data System (ADS)

    Wang, S.; Fritschi, F. B.; Stacy, G.

    2009-12-01

    Biomass is the largest source of renewable energy in the United States and is expected to replace 30% of the domestic petroleum consumption by 2030. Corn ethanol currently constitutes 99% of the country’s biofuels. Extended annual crop planting for biofuel production, however, has raised concerns about long-term environmental, ecological and socio-economic consequences. More sustainable bioenergy resources might therefore be developed to meet energy demand, food security and climate policy. The DOE has identified switchgrass (Panicum virgatum L.) as a model bioenergy crop. Switchgrass, along with other warm-season grasses, is native to the pre-colonial tallgrass prairie of North America. This study maps the spatial distributions of prairie grasses and marginal croplands in the tallgrass prairie with remote sensing and GIS techniques. For 2000-2008, 8-day composite MODIS imagery was downloaded to calculate the normalized difference vegetation index (NDVI). With pixel-level temporal trajectories of NDVI, time-series trend analysis was performed to identify native prairie grasses based on their phenological uniqueness. In a case study in southwest Missouri, this trajectory approach distinguished more than 80% of warm-season prairie grasslands from row crops and cool-season pastures (Figure 1). Warm-season grasses dominated the 19 public prairies in the study area, in a range of 45-98%. This study explores the geographic context of current and potential perennial bioenergy supplies in the tallgrass prairie. Beyond the current findings, it holds promise for further investigations to provide quantitative economic and environmental information to assist bioenergy policy decision-making. Figure 1: The distribution of grasslands in the study area. "WSG", "CSG" and "non-grass" represent warm-season prairie grasses, introduced cool-season grasses and crops, and other non-grasses.

  13. Coastal system mapping: a new approach to formalising and conceptualising the connectivity of large-scale coastal systems

    NASA Astrophysics Data System (ADS)

    French, J.; Burningham, H.; Whitehouse, R.

    2010-12-01

    The concept of the coastal sediment cell has proved invaluable as a basis for estimating sediment budgets and as a framework for coastal management. However, whilst coastal sediment cells are readily identified on compartmentalised coastlines dominated by beach-grade material, the cell concept is less suited to handling broader linkages between estuarine, coastal and offshore systems, and for incorporating longer-range suspended sediment transport. We present a new approach to the conceptualisation of large-scale coastal geomorphic systems based on a hierarchical classification of component landforms and management interventions and mapping of the interactions between them. Coastal system mapping is founded on a classification that identifies high-level landform features, low-level landform elements and engineering interventions. Geomorphic features define the large-scale organisation of a system and include landforms that define gross coastal configuration (e.g. headland, bay) as well as fluvial, estuarine and offshore sub-systems that exchange sediment with and influence the open coast. Detailed system structure is mapped out with reference to a larger set of geomorphic elements (e.g. cliff, dune, beach ridge). Element-element interactions define cross-shore linkages (conceptualised as hinterland, backshore and foreshore zones) and alongshore system structure. Both structural and non-structural engineering interventions are also represented at this level. Element-level mapping is rationalised to represent alongshore variation using as few elements as possible. System linkages include both sediment transfer pathways and influences not associated with direct mass transfer (e.g. effect of a jetty at an inlet). A formal procedure for capturing and graphically representing coastal system structure has been developed around free concept mapping software, CmapTools (http://cmap.ihmc.us). Appended meta-data allow geographic coordinates, data, images and literature

  14. De-pinning of disordered bosonic chains

    NASA Astrophysics Data System (ADS)

    Vogt, N.; Cole, J. H.; Shnirman, A.

    2016-05-01

    We consider the onset of transport (de-pinning) in one-dimensional bosonic chains with a repulsive boson–boson interaction that decays exponentially on large length-scales. Our study is relevant for (i) de-pinning of Cooper pairs in Josephson junction arrays; (ii) de-pinning of magnetic flux quanta in quantum-phase-slip ladders, i.e. arrays of superconducting wires in a ladder configuration that allow for the coherent tunneling of flux quanta. In the low-frequency, long-wavelength regime these chains can be mapped onto an effective model of a one-dimensional elastic field in a disordered potential. The standard de-pinning theories address infinitely long systems in two limiting cases: (a) uncorrelated disorder (zero correlation length); (b) long-range power-law correlated disorder (infinite correlation length). In this paper we numerically study chains of finite length in the intermediate case of a long but finite disorder correlation length. This regime is of relevance for, e.g., the experimental systems mentioned above. We study the interplay of three length scales: the system length, the interaction range, and the correlation length of disorder. In particular, we observe the crossover from the solitonic onset of transport in arrays shorter than the disorder correlation length to the onset of transport by de-pinning in longer arrays.

  15. Analytic boosted boson discrimination

    NASA Astrophysics Data System (ADS)

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff

    2016-05-01

    Observables which discriminate boosted topologies from massive QCD jets are of great importance for the success of the jet substructure program at the Large Hadron Collider. Such observables, while both widely and successfully used, have been studied almost exclusively with Monte Carlo simulations. In this paper we present the first all-orders factorization theorem for a two-prong discriminant based on a jet shape variable, D2, valid for both signal and background jets. Our factorization theorem simultaneously describes the production of both collinear and soft subjets, and we introduce a novel zero-bin procedure to correctly describe the transition region between these limits. By proving an all-orders factorization theorem, we enable a systematically improvable description, and allow for precision comparisons between data, Monte Carlo, and first principles QCD calculations for jet substructure observables. Using our factorization theorem, we present numerical results for the discrimination of a boosted Z boson from massive QCD background jets. We compare our results with Monte Carlo predictions which allows for a detailed understanding of the extent to which these generators accurately describe the formation of two-prong QCD jets, and informs their usage in substructure analyses. Our calculation also provides considerable insight into the discrimination power and calculability of jet substructure observables in general.

  16. Analytic boosted boson discrimination

    DOE PAGES Beta

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff

    2016-05-20

    Observables which discriminate boosted topologies from massive QCD jets are of great importance for the success of the jet substructure program at the Large Hadron Collider. Such observables, while both widely and successfully used, have been studied almost exclusively with Monte Carlo simulations. In this paper we present the first all-orders factorization theorem for a two-prong discriminant based on a jet shape variable, D2, valid for both signal and background jets. Our factorization theorem simultaneously describes the production of both collinear and soft subjets, and we introduce a novel zero-bin procedure to correctly describe the transition region between these limits. By proving an all orders factorization theorem, we enable a systematically improvable description, and allow for precision comparisons between data, Monte Carlo, and first principles QCD calculations for jet substructure observables. Using our factorization theorem, we present numerical results for the discrimination of a boosted Z boson from massive QCD background jets. We compare our results with Monte Carlo predictions which allows for a detailed understanding of the extent to which these generators accurately describe the formation of two-prong QCD jets, and informs their usage in substructure analyses. In conclusion, our calculation also provides considerable insight into the discrimination power and calculability of jet substructure observables in general.

  17. A National Approach for Mapping and Quantifying Habitat-based Biodiversity Metrics Across Multiple Spatial Scales

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, and environmental decision analysis. Mapping and quantifying ecosystem services have become strategic national inte...

  18. A National Approach to Quantify and Map Biodiversity Conservation Metrics within an Ecosystem Services Framework

    EPA Science Inventory

    Ecosystem services, i.e., "services provided to humans from natural systems," have become a key issue of this century in resource management, conservation planning, human well-being, and environmental decision analysis. Mapping and quantifying ecosystem services have be...

  19. Use of linkage disequilibrium approaches to map genes for bipolar disorder in the Costa Rican population

    SciTech Connect

    Escamilla, M.A.; Reus, V.I.; Smith, L.B.; Freimer, N.B.

    1996-05-31

    Linkage disequilibrium (LD) analysis provides a powerful means for screening the genome to map the location of disease genes, such as those for bipolar disorder (BP). As described in this paper, the population of the Central Valley of Costa Rica, which is descended from a small number of founders, should be suitable for LD mapping; this assertion is supported by reconstruction of extended haplotypes shared by distantly related individuals in this population suffering low-frequency hearing loss (LFHL1), which has previously been mapped by linkage analysis. A sampling strategy is described for applying LD methods to map genes for BP, and clinical and demographic characteristics of an initially collected sample are discussed. This sample will provide a complement to a previously collected set of Costa Rican BP families which is under investigation using standard linkage analysis. 42 refs., 4 figs., 2 tabs.

  20. Collaborative and Multilingual Approach to Learn Database Topics Using Concept Maps

    PubMed Central

    Calvo, Iñaki

    2014-01-01

    The authors report on a study using the concept mapping technique in computer engineering education for learning theoretical introductory database topics. In addition, the learning of multilingual technical terminology by means of the collaborative drawing of a concept map is also pursued in this experiment. The main characteristics of a study carried out in the database subject at the University of the Basque Country during the 2011/2012 academic year are described. This study contributes to the field of concept mapping, as these kinds of cognitive tools have proved valid for supporting learning in computer engineering education. It contributes to the field of computer engineering education by providing a technique that can be incorporated with several educational purposes within the discipline. Results reveal the potential that a collaborative concept map editor offers to fulfil the above-mentioned objectives. PMID:25538957

  1. The cancer experience map: an approach to including the patient voice in supportive care solutions.

    PubMed

    Hall, Leslie Kelly; Kunz, Breanne F; Davis, Elizabeth V; Dawson, Rose I; Powers, Ryan S

    2015-05-28

    The perspective of the patient, also called the "patient voice", is an essential element in materials created for cancer supportive care. Identifying that voice, however, can be a challenge for researchers and developers. A multidisciplinary team at a health information company tasked with addressing this issue created a representational model they call the "cancer experience map". This map, designed as a tool for content developers, offers a window into the complex perspectives inside the cancer experience. Informed by actual patient quotes, the map shows common overall themes for cancer patients, concerns at key treatment points, strategies for patient engagement, and targeted behavioral goals. In this article, the team members share the process by which they created the map as well as its first use as a resource for cancer support videos. The article also addresses the broader policy implications of including the patient voice in supportive cancer content, particularly with regard to mHealth apps.

  2. Two-dimensional thermofield bosonization II: Massive fermions

    SciTech Connect

    Amaral, R.L.P.G.

    2008-11-15

    We consider the perturbative computation of the N-point function of chiral densities of massive free fermions at finite temperature within the thermofield dynamics approach. The infinite series in the mass parameter for the N-point functions are computed in the fermionic formulation and compared with the corresponding perturbative series in the interaction parameter in the bosonized thermofield formulation. Thereby we establish in thermofield dynamics the formal equivalence of the massive free fermion theory with the sine-Gordon thermofield model for a particular value of the sine-Gordon parameter. We extend the thermofield bosonization to include the massive Thirring model.

  3. Quantum entanglement enhances the capacity of bosonic channels with memory

    SciTech Connect

    Cerf, Nicolas J.; Clavareau, Julien; Roland, Jeremie

    2005-10-15

    The bosonic quantum channels have recently attracted a growing interest, motivated by the hope that they open a tractable approach to the generally hard problem of evaluating quantum channel capacities. These studies, however, have always been restricted to memoryless channels. Here, it is shown that the classical capacity of a bosonic Gaussian channel with memory can be significantly enhanced if entangled symbols are used instead of product symbols. For example, the capacity of a photonic channel with 70%-correlated thermal noise of one-third the shot noise is enhanced by about 11% when using 3.8-dB entangled light with a modulation variance equal to the shot noise.

  4. Mapping land cover from satellite images: A basic, low cost approach

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barney, T. W.; Barr, D. J.; Johannsen, C. J.

    1978-01-01

    Simple, inexpensive methodologies developed for mapping general land cover and land use categories from LANDSAT images are reported. One methodology, a stepwise, interpretive, direct tracing technique was developed through working with university students from different disciplines with no previous experience in satellite image interpretation. The technique results in maps that are very accurate in relation to actual land cover and relative to the small investment in skill, time, and money needed to produce the products.

  5. An organized approach to the localization, mapping, and ablation of outflow tract ventricular arrhythmias.

    PubMed

    Hutchinson, Mathew D; Garcia, Fermin C

    2013-10-01

    The outflow tract (OT) regions of the right and left ventricles, common sites of origin for idiopathic ventricular arrhythmias (VA), have complex three-dimensional anatomical relationships. The understanding of in situ or "attitudinal" relationships not only informs the electrocardiographic interpretation of VA site of origin, but also facilitates their catheter-based mapping and ablation strategies. By viewing each patient as his or her own "control," the expected changes in ECG morphology (i.e., frontal plane QRS axis and precordial transition) between adjacent intracardiac structures (e.g., RVOT and aortic root) can be reliably predicted. Successful mapping of OT VAs involves a combination of activation and pace-mapping guided by fluoroscopy, electroanatomical mapping, and intracardiac echocardiography. The purpose of this manuscript is to provide a simple, reliable strategy for catheter-based mapping and ablation of OT VAs. We also discuss two specific challenges in OT VA mapping: (1) differentiating posterior RVOT from right coronary cusp VA origin; and (2) mapping VAs originating from the LV summit.

  6. Measure of tripartite entanglement in bosonic and fermionic systems

    SciTech Connect

    Buscemi, Fabrizio

    2011-08-15

    We describe an efficient theoretical criterion suitable for the evaluation of the tripartite entanglement of any mixed three-boson or three-fermion state, based on the notion of the entanglement of particles for bipartite systems of identical particles. Our approach allows one to quantify the accessible number of quantum correlations in the systems without any violation of the local particle number superselection rule. A generalization of the tripartite negativity is here applied to some correlated systems including the continuous-time quantum walks of identical particles (for both bosons and fermions) and compared with other criteria recently proposed in the literature. Our results show the dependence of the entanglement dynamics upon the quantum statistics: The bosonic bunching results in a low number of quantum correlations while Fermi-Dirac statistics allows for higher values of the entanglement.

  7. Decision and function problems based on boson sampling

    NASA Astrophysics Data System (ADS)

    Nikolopoulos, Georgios M.; Brougham, Thomas

    2016-07-01

    Boson sampling is a mathematical problem that is strongly believed to be intractable for classical computers, whereas passive linear interferometers can produce samples efficiently. So far, the problem remains a computational curiosity, and the possible usefulness of boson-sampling devices is mainly limited to the proof of quantum supremacy. The purpose of this work is to investigate whether boson sampling can be used as a resource for decision and function problems that are computationally hard, and may thus have cryptographic applications. After defining a rather general theoretical framework for the design of such problems, we discuss their solution by means of a brute-force numerical approach, as well as by means of non-boson samplers. Moreover, we estimate the sample sizes required for their solution by passive linear interferometers, and show that they are independent of the size of the Hilbert space.
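For context on why boson sampling is believed to be classically hard: the probability of each output photon configuration is proportional to the squared modulus of a matrix permanent, and no efficient classical algorithm for the permanent is known. Below is a minimal, illustrative sketch (not from the paper) of the exact exponential-time computation via Ryser's formula.

```python
from itertools import combinations

def permanent(A):
    """Permanent of an n x n matrix via Ryser's formula, O(2^n * n) terms.
    In boson sampling, |perm(U_S)|^2 (suitably normalized) gives the
    probability of an output photon configuration S of the interferometer
    unitary U."""
    n = len(A)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# Sanity check: perm([[a, b], [c, d]]) = a*d + b*c, unlike the determinant.
```

The 2^n scaling of this brute-force evaluation is exactly what makes even modest photon numbers out of reach for classical simulation.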

  8. Spin models and boson sampling

    NASA Astrophysics Data System (ADS)

    Garcia Ripoll, Juan Jose; Peropadre, Borja; Aspuru-Guzik, Alan

    Aaronson & Arkhipov showed that predicting the measurement statistics of random linear optics circuits (i.e. boson sampling) is a classically hard problem for highly non-classical input states. A typical boson-sampling circuit requires N single photon emitters and M photodetectors, and it is a natural idea to rely on few-level systems for both tasks. Indeed, we show that 2M two-level emitters at the input and output ports of a general M-port interferometer interact via an XY-model with collective dissipation and a large number of dark states that could be used for quantum information storage. More important is the fact that, when we neglect dissipation, the resulting long-range XY spin-spin interaction is equivalent to boson sampling under the same conditions that make boson sampling efficient. This allows efficient implementations of boson sampling using quantum simulators & quantum computers. We acknowledge support from Spanish Mineco Project FIS2012-33022, CAM Research Network QUITEMAD+ and EU FP7 FET-Open Project PROMISCE.

  9. Isotopomer mapping approach to determine N2O production pathways and N2O reduction

    NASA Astrophysics Data System (ADS)

    Lewicka-Szczebak, Dominika; Well, Reinhard; Cardenas, Laura; Bol, Roland

    2016-04-01

    Stable isotopomer analyses of soil-emitted N2O (δ15N, δ18O and SP = 15N site preference within the linear N2O molecule) may help to distinguish N2O production pathways and to quantify N2O reduction to N2. Different N2O-forming processes are characterised by distinct isotopic characteristics. Bacterial denitrification shows significantly lower SP and δ18O values when compared to fungal denitrification and nitrification processes. But SP and δ18O values are also altered during N2O reduction to N2, when the residual N2O is enriched in 18O and centrally located 15N, resulting in increased δ18O and SP values. Hence, the interpretation of these isotope characteristics is not straightforward, because higher δ18O and SP values may be due to admixture of N2O from fungal denitrification or nitrification, or due to N2O reduction to N2. One of these processes, either admixture or reduction, can be quite well quantified if the other one is determined with independent methods. But usually both processes are unknown, and the ability to estimate both of them simultaneously would be very beneficial. Here we present an attempt to determine both the admixture and the reduction simultaneously using isotopomer mapping, i.e. the relation between δ18O and SP. The measured sample points are typically situated between two lines: a reduction line with a typical slope of about 0.35 and a mixing line with a higher slope of about 0.8. Combining the reduction and mixing vectors allows for the determination of both processes based on the location of the sample point between the lines. We tested this new approach on laboratory incubation studies, where a reference method for N2O reduction quantification was applied, i.e. the 15N gas flux method or incubations in a He atmosphere. This allowed us to check how well the calculated amounts of N2O reduction agree with the results provided by the reference method. The general trend was quite well reflected in our calculated results, however, quite
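The two-vector decomposition described above reduces to a small linear solve: a measured (δ18O, SP) point is written as a bacterial-denitrification endmember shifted along a mixing vector (toward fungal/nitrification values) plus a reduction vector of slope ~0.35 (ΔSP/Δδ18O). The endmember values below are invented round numbers for illustration; only the solve structure reflects the approach.

```python
def decompose(d18o, sp,
              bac=(20.0, -5.0),    # assumed bacterial endmember (δ18O, SP)
              fun=(45.0, 30.0),    # assumed fungal/nitrification endmember
              red_slope=0.35):     # typical reduction slope ΔSP/Δδ18O
    """Solve the 2x2 linear system
         d18o = d_bac  + f*(d_fun - d_bac)   + delta
         sp   = sp_bac + f*(sp_fun - sp_bac) + red_slope*delta
    for f (fungal/nitrification admixture fraction) and delta
    (δ18O enrichment caused by N2O reduction to N2)."""
    a11, a12 = fun[0] - bac[0], 1.0
    a21, a22 = fun[1] - bac[1], red_slope
    b1, b2 = d18o - bac[0], sp - bac[1]
    det = a11 * a22 - a12 * a21        # Cramer's rule on the 2x2 system
    f = (b1 * a22 - a12 * b2) / det
    delta = (a11 * b2 - b1 * a21) / det
    return f, delta
```

A forward check: a sample produced with 40% fungal admixture and a 6 per-mil reduction shift lands at (36, 11.1) under these endmembers, and `decompose(36, 11.1)` recovers both unknowns.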

  10. Binding Ensemble PROfiling with Photoaffinity Labeling (BEProFL) Approach: Mapping the Binding Poses of HDAC8 Inhibitors

    PubMed Central

    He, Bai; Velaparthi, Subash; Pieffet, Gilles; Pennington, Chris; Mahesh, Aruna; Holzle, Denise L.; Brunsteiner, Michael; van Breemen, Richard; Blond, Sylvie Y.; Petukhov, Pavel A.

    2009-01-01

    A Binding Ensemble PROfiling with Photoaffinity Labeling (BEProFL) approach is described that utilizes photolabeling of HDAC8 with a probe containing a UV-activated aromatic azide, mapping of the covalent modifications by liquid chromatography-tandem mass spectrometry, and a computational method to characterize the multiple binding poses of the probe. Using the BEProFL approach, two distinct binding poses of the HDAC8 probe were identified. The data also suggest that an "upside-down" pose, with the surface binding group of the probe bound in an alternative pocket near the catalytic site, may contribute to the binding. PMID:19886628

  11. Tagging b jets associated with heavy neutral MSSM Higgs bosons

    NASA Astrophysics Data System (ADS)

    Heikkinen, A.; Lehti, S.

    2006-04-01

    Since a neural network (NN) approach has been shown to be applicable to the problem of Higgs boson detection at the LHC [I. Iashvili, A. Kharchilava, CMS TN-1996/100; M. Mjahed, Nucl. Phys. B 140 (2005) 799], we study the use of NNs for tagging b jets in pp → bb̄H_SUSY, H_SUSY → ττ in the Compact Muon Solenoid experiment [F. Hakl, et al., Nucl. Instr. and Meth. A 502 (2003) 489; S. Lehti, CMS NOTE-2001/019; G. Segneri, F. Palla, CMS NOTE-2002/046]. B tagging is an important tool for separating Higgs events with associated b jets from the Drell-Yan background Z,γ* → ττ, for which the associated jets are mostly light-quark and gluon jets. We train multi-layer perceptrons (MLPs) available in the object-oriented data analysis framework ROOT [ROOT—An Object Oriented Data Analysis Framework, in: Proceedings of the AIHENP'96 Workshop, Lausanne, September 1996, Nucl. Instr. and Meth. A 389 (1997) 81]. The following learning methods are evaluated: the steepest descent algorithm, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, and variants of conjugate gradients. The ROOT code-generation feature for standalone C++ classifiers is utilized. We compare the b tagging performance of MLPs with another ROOT-based feed-forward NN tool, NeuNet [J.P. Ernenwein, NeuNet software for ROOT], which uses a common back-propagation learning method. In addition, we demonstrate the use of the self-organizing map program package (SOM_PAK) and the learning vector quantization program package (LVQ_PAK) [T. Kohonen, et al., SOM_PAK: the self-organizing map program package, Technical Report A31; T. Kohonen, et al., LVQ_PAK: the learning vector quantization program package, Technical Report A30, Laboratory of Computer and Information Science, Helsinki University of Technology, FIN-02150 Espoo, Finland, 1996] in the b tagging problem.

  12. Groundwater prospective mapping: remote sensing and a GIS-based index model approach

    NASA Astrophysics Data System (ADS)

    Shetty, Amba; Nandagiri, Lakshman; Ramachandra, Padami

    2009-01-01

    The present study concerns the development and testing of an integrated remote sensing and GIS-based methodology for identifying groundwater potential areas in a humid tropical river basin. Indian Remote Sensing Satellite (IRS-1C LISS-III) data, along with other collateral data such as existing maps and field observations, were utilized to extract information on the hydro-geomorphic features of the terrain. The study involves two components: (a) demarcation of groundwater potential zones; (b) validation of sites with yield data. In order to demarcate potential groundwater zones, six pertinent thematic layers were integrated and assigned appropriate rankings. The layers considered were: geology, geomorphology, drainage density, slope, rainfall with an infiltration factor, and land cover. The layer parameters were also rated according to their importance relative to other classes in the same theme. All the layers were superimposed and analyzed in the ArcGIS environment. A linear additive model based on the DRASTIC model concept was used to compute a groundwater potential index (GPI). The resulting map comprises six categories of groundwater yield. To carry out more focused investigations of the potential zones, lineament maps were superimposed over it. The validity of the different potential zones identified using the GIS-based model was compared with available borewell yield data and found to be in good agreement. The map generated can be used in future as a preliminary screening tool for selecting well sites and as a basic tool in land-use planning for groundwater protection.
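The linear additive GPI model described above amounts to a weighted overlay: each thematic layer carries a weight, each class within a layer a rating, and the index at a cell is the weighted sum of the ratings. A minimal sketch follows; the weights, ratings, and category breaks are invented for illustration and are not the study's values.

```python
# Illustrative layer weights (importance of each theme; not the paper's).
LAYER_WEIGHTS = {
    "geology": 5, "geomorphology": 5, "drainage_density": 3,
    "slope": 3, "rainfall_infiltration": 4, "land_cover": 2,
}

def gpi(cell_ratings, weights=LAYER_WEIGHTS):
    """Groundwater potential index at one cell.
    cell_ratings: mapping layer -> class rating (e.g. 1-10) at that cell."""
    return sum(weights[layer] * cell_ratings[layer] for layer in weights)

def categorize(index, breaks=(40, 80, 120, 160)):
    """Bin the index into yield categories (thresholds illustrative)."""
    labels = ["very low", "low", "moderate", "high", "very high"]
    for b, label in zip(breaks, labels):
        if index < b:
            return label
    return labels[-1]
```

In a GIS this sum runs per raster cell over co-registered layers; the sketch shows only the per-cell arithmetic.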

  13. Remote sensing approach to map riparian vegetation of the Colorado River Ecosystem, Grand Canyon area, Arizona

    NASA Astrophysics Data System (ADS)

    Nguyen, U.; Glenn, E.; Nagler, P. L.; Sankey, J. B.

    2015-12-01

    Riparian zones in the southwestern U.S. are usually a mosaic of vegetation types at varying states of succession in response to past floods or droughts. Human impacts also affect riparian vegetation patterns. Human-induced changes include the introduction of exotic species, diversion of water for human use, channelization of the river to protect property, and other land use changes that can lead to deterioration of the riparian ecosystem. This study explored the use of remote sensing to map an iconic stretch of the Colorado River in Grand Canyon National Park, Arizona. The pre-dam riparian zone in the Grand Canyon was shaped by annual floods from spring run-off from the watersheds of the Green, Colorado and San Juan Rivers. A pixel-based vegetation map of the riparian zone in the Grand Canyon, Arizona, was produced from high-resolution aerial imagery. The map was calibrated and validated with ground survey data. A seven-step image processing and classification procedure was developed based on a suite of vegetation indices and classification subroutines available in the ENVI Image Processing and Analysis software. The result is a quantitative species-level vegetation map that could be more accurate than the qualitative, polygon-based maps presently used on the Lower Colorado River. The dominant woody species in the Grand Canyon are now saltcedar, arrowweed and mesquite, reflecting stress-tolerant forms adapted to the altered flow regimes associated with river regulation.

  14. Impact of a concept map teaching approach on nursing students' critical thinking skills.

    PubMed

    Kaddoura, Mahmoud; Van-Dyke, Olga; Yang, Qing

    2016-09-01

    Nurses confront complex problems and decisions that require critical thinking in order to identify patient needs and implement best practices. An active strategy for teaching students the skills to think critically is the concept map. This study explores the development of critical thinking among nursing students in a required pathophysiology and pharmacology course during the first year of a Bachelor of Science in Nursing in response to concept mapping as an interventional strategy, using the Health Education Systems, Incorporated critical thinking test. A two-group experimental study with a pretest and posttest design was used. Participants were randomly divided into a control group (n = 42) taught by traditional didactic lecturing alone, and an intervention group (n = 41), taught by traditional didactic lecturing with concept mapping. Students in the concept mapping group performed much better on the Health Education Systems, Incorporated than students in the control group. It is recommended that deans, program directors, and nursing faculties evaluate their curricula to integrate concept map teaching strategies in courses in order to develop critical thinking abilities in their students. PMID:26891960

  15. Segmentation and automated measurement of chronic wound images: probability map approach

    NASA Astrophysics Data System (ADS)

    Ahmad Fauzi, Mohammad Faizal; Khansa, Ibrahim; Catignani, Karen; Gordillo, Gayle; Sen, Chandan K.; Gurcan, Metin N.

    2014-03-01

    An estimated 6.5 million patients in the United States are affected by chronic wounds, with more than 25 billion US dollars and countless hours spent annually on all aspects of chronic wound care. There is a need to develop software tools that analyze wound images, characterize wound tissue composition, measure wound size, and monitor changes over time. This process, when done manually, is time-consuming and subject to intra- and inter-reader variability. In this paper, we propose a method that can characterize chronic wounds containing granulation, slough and eschar tissues. First, we generate a Red-Yellow-Black-White (RYKW) probability map, which then guides the region growing segmentation process. The red, yellow and black probability maps are designed to handle the granulation, slough and eschar tissues, respectively, found in wound tissues, while the white probability map is designed to detect the white label card used for measurement calibration. The innovative aspects of this work include: (1) definition of a wound-characteristics-specific probability map for segmentation; (2) computationally efficient region growing on the 4D map; (3) auto-calibration of measurements from the content of the image. The method was applied to 30 wound images provided by the Ohio State University Wexner Medical Center, with the ground truth independently generated by the consensus of two clinicians. While the inter-reader agreement between the readers is 85.5%, the computer achieves an accuracy of 80%.
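The probability-map step can be illustrated as a per-pixel soft assignment to the four RYKW classes. The reference colors and the inverse-distance weighting below are stand-in assumptions for illustration; the paper's actual probability model is not reproduced here.

```python
# Assumed reference RGB colors for each RYKW class (illustrative only).
REFS = {
    "granulation": (200, 60, 60),    # red tissue
    "slough":      (210, 190, 80),   # yellow tissue
    "eschar":      (40, 30, 30),     # near-black tissue
    "card":        (245, 245, 245),  # white calibration label card
}

def probability_map(pixel, refs=REFS, eps=1e-6):
    """Return {class: probability} for one RGB pixel; values sum to 1.
    Uses inverse squared color distance as an illustrative soft weighting."""
    inv = {}
    for name, ref in refs.items():
        d2 = sum((p - c) ** 2 for p, c in zip(pixel, ref))
        inv[name] = 1.0 / (d2 + eps)   # closer to reference -> higher weight
    total = sum(inv.values())
    return {name: w / total for name, w in inv.items()}
```

In the paper this per-pixel map then seeds and guides region growing over the whole image; the sketch shows only the per-pixel computation.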

  16. A new integrated statistical approach to the diagnostic use of two-dimensional maps.

    PubMed

    Marengo, Emilio; Robotti, Elisa; Gianotti, Valentina; Righetti, Pier Giorgio; Cecconi, Daniela; Domenici, Enrico

    2003-01-01

    Two-dimensional (2-D) electrophoresis is a very useful technique for the analysis of proteins in biological tissues. The complexity of the 2-D maps obtained causes many difficulties in the comparison of different samples. A new method is proposed for comparing different 2-D maps, based on five steps: (i) digitalisation of the image; (ii) transformation of the digitalised map into a fuzzy entity, in order to account for the variability of the 2-D electrophoretic separation; (iii) calculation of a similarity index for each pair of maps; (iv) analysis by multidimensional scaling of the resulting similarity matrix; (v) analysis by classification or cluster analysis techniques of the resulting map co-ordinates. The method was first tested on simulated samples in order to evaluate its sensitivity to small changes in spot position and size. The optimal setting of the method parameters was also investigated. Finally, the method was successfully applied to a series of real samples corresponding to two-dimensional electrophoretic analysis of sera from normal and nicotine-treated rats. Multidimensional scaling allowed the separation of the two classes of samples without any misclassification. PMID:12652595
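Steps (ii) and (iii) can be sketched as follows: each spot is "fuzzified" into a Gaussian so that small run-to-run positional shifts still overlap, and a similarity index in [0, 1] is computed from the overlap of the two fuzzy surfaces. The grid size, sigma, and the min/max overlap index below are illustrative choices, not the paper's exact formulation.

```python
import math

def fuzzy_surface(spots, size=50, sigma=2.0):
    """Spread each spot (x, y) into a 2-D Gaussian on a size x size grid,
    turning a discrete spot list into a fuzzy intensity surface."""
    grid = [[0.0] * size for _ in range(size)]
    for (sx, sy) in spots:
        for y in range(size):
            for x in range(size):
                grid[y][x] += math.exp(-((x - sx) ** 2 + (y - sy) ** 2)
                                       / (2 * sigma ** 2))
    return grid

def similarity(map_a, map_b):
    """Normalized overlap of two fuzzy maps: 1 = identical, ~0 = disjoint.
    (An illustrative min/max index, akin to a fuzzy Jaccard coefficient.)"""
    num = sum(min(a, b) for ra, rb in zip(map_a, map_b) for a, b in zip(ra, rb))
    den = sum(max(a, b) for ra, rb in zip(map_a, map_b) for a, b in zip(ra, rb))
    return num / den
```

Computing this index for every pair of maps yields the similarity matrix that step (iv) feeds into multidimensional scaling.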

  17. Impact of a concept map teaching approach on nursing students' critical thinking skills.

    PubMed

    Kaddoura, Mahmoud; Van-Dyke, Olga; Yang, Qing

    2016-09-01

    Nurses confront complex problems and decisions that require critical thinking in order to identify patient needs and implement best practices. An active strategy for teaching students the skills to think critically is the concept map. This study explores the development of critical thinking among nursing students in a required pathophysiology and pharmacology course during the first year of a Bachelor of Science in Nursing in response to concept mapping as an interventional strategy, using the Health Education Systems, Incorporated critical thinking test. A two-group experimental study with a pretest and posttest design was used. Participants were randomly divided into a control group (n = 42) taught by traditional didactic lecturing alone, and an intervention group (n = 41), taught by traditional didactic lecturing with concept mapping. Students in the concept mapping group performed much better on the Health Education Systems, Incorporated than students in the control group. It is recommended that deans, program directors, and nursing faculties evaluate their curricula to integrate concept map teaching strategies in courses in order to develop critical thinking abilities in their students.

  18. Microzonation Mapping Of The Yanbu Industrial City, Western Saudi Arabia: A Multicriteria Decision Analysis Approach

    NASA Astrophysics Data System (ADS)

    Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.

    2016-04-01

    Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to lack of planning. To produce a site-specific microzonation map of the rapidly growing Yanbu industrial city, the spatial distributions of different hazard entities are assessed using the Analytic Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index and peak ground acceleration. Weight and rank values are determined during the AHP and assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was then derived on a GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area are categorized as high-hazard zones compared with the surrounding areas. The produced microzonation map is envisaged as a first-level assessment of the site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in future land-use planning and environmental hazard management.
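    The AHP step above derives layer weights as the principal eigenvector of a pairwise comparison matrix, then combines ranked layers as a weighted overlay. A minimal sketch with an assumed 3-layer judgement matrix (Saaty 1-9 scale; the matrix values and class ranks are illustrative, not the study's):

```python
def ahp_weights(pairwise, iters=100):
    # Principal-eigenvector weights via power iteration on the
    # positive reciprocal comparison matrix.
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Assumed judgements: amplification vs frequency vs PGA.
pc = [[1,   3,   2],
      [1/3, 1,   1/2],
      [1/2, 2,   1]]
w = ahp_weights(pc)

def hazard_index(ranks, weights):
    # Weighted linear combination of per-layer class ranks (1-5).
    return sum(r * wt for r, wt in zip(ranks, weights))

cell = hazard_index([4, 2, 5], w)   # one grid cell's ranked layer values
```

    Thresholding the resulting index per grid cell yields the five hazard categories of the final map.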

  19. Proposal for Microwave Boson Sampling

    NASA Astrophysics Data System (ADS)

    Peropadre, Borja; Guerreschi, Gian Giacomo; Huh, Joonsuk; Aspuru-Guzik, Alán

    2016-09-01

    Boson sampling, the task of sampling the probability distribution of photons at the output of a photonic network, is believed to be hard for any classical device. Unlike other models of quantum computation that require thousands of qubits to outperform classical computers, boson sampling requires only a handful of single photons. However, a scalable implementation of boson sampling is missing. Here, we show how superconducting circuits provide such platform. Our proposal differs radically from traditional quantum-optical implementations: rather than injecting photons in waveguides, making them pass through optical elements like phase shifters and beam splitters, and finally detecting their output mode, we prepare the required multiphoton input state in a superconducting resonator array, control its dynamics via tunable and dispersive interactions, and measure it with nondemolition techniques.
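    The classical hardness referred to above stems from the fact that each output probability of a boson sampler is proportional to the squared modulus of a matrix permanent taken from a submatrix of the network's unitary, and permanents are #P-hard in general. A sketch of Ryser's formula (the standard exact O(2^n·n^2) evaluation, shown here for real matrices) illustrates the exponential cost:

```python
def permanent(m):
    # Ryser's formula: sum over column subsets S of
    # (-1)^|S| * prod_i (sum_{j in S} m[i][j]), times (-1)^n.
    n = len(m)
    total = 0.0
    for subset in range(1, 1 << n):
        size = bin(subset).count("1")
        prod = 1.0
        for i in range(n):
            prod *= sum(m[i][j] for j in range(n) if subset >> j & 1)
        total += (-1) ** size * prod
    return (-1) ** n * total

# Output probabilities of a boson sampler scale as |perm(U_sub)|^2,
# where U_sub is an n x n submatrix of the interferometer unitary.
p = permanent([[1, 2], [3, 4]])   # 1*4 + 2*3
```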

  20. The Cancer Experience Map: An Approach to Including the Patient Voice in Supportive Care Solutions

    PubMed Central

    2015-01-01

    The perspective of the patient, also called the “patient voice”, is an essential element in materials created for cancer supportive care. Identifying that voice, however, can be a challenge for researchers and developers. A multidisciplinary team at a health information company tasked with addressing this issue created a representational model they call the “cancer experience map”. This map, designed as a tool for content developers, offers a window into the complex perspectives inside the cancer experience. Informed by actual patient quotes, the map shows common overall themes for cancer patients, concerns at key treatment points, strategies for patient engagement, and targeted behavioral goals. In this article, the team members share the process by which they created the map as well as its first use as a resource for cancer support videos. The article also addresses the broader policy implications of including the patient voice in supportive cancer content, particularly with regard to mHealth apps. PMID:26022846

  1. Complex approach to long-term multi-agent mapping in low dynamic environments

    NASA Astrophysics Data System (ADS)

    Shvets, Evgeny A.; Nikolaev, Dmitry P.

    2015-12-01

    In the paper we consider the problem of multi-agent continuous mapping of a changing, low dynamic environment. The mapping problem is a well-studied one; however, the use of multiple agents and operation in a non-static environment complicate it and present a handful of challenges (e.g. double-counting, robust data association, memory and bandwidth limits). All these problems are interrelated but are very rarely considered together, despite the fact that each has drawn the attention of researchers. In this paper we devise an architecture that solves the considered problems in an internally consistent manner.

  2. A multi-method approach for benthic habitat mapping of shallow coastal areas with high-resolution multibeam data

    NASA Astrophysics Data System (ADS)

    Micallef, Aaron; Le Bas, Timothy P.; Huvenne, Veerle A. I.; Blondel, Philippe; Hühnerbach, Veit; Deidun, Alan

    2012-05-01

    The coastal waters of the Maltese Islands, central Mediterranean Sea, sustain a diversity of marine habitats and support a wide range of human activities. The islands' shallow waters are characterised by a paucity of hydrographic and marine geo-environmental data, which is problematic in view of the requirements of the Maltese Islands to assess the state of their coastal waters by 2012 as part of the EU Marine Strategy Directive. Multibeam echosounder (MBES) systems are today recognised as one of the most effective tools to map the seafloor, although the quantitative characterisation of MBES data for seafloor and habitat mapping is still an underdeveloped field. The purpose of this study is to outline a semi-automated, Geographic Information System-based methodology to map the distribution of habitats in shallow coastal waters using high-resolution MBES data. What distinguishes our methodology from those proposed in previous studies is the combination of a suite of geomorphometric and textural analytical techniques to map specific types of seafloor morphologies and compositions; the selection of the techniques is based on identifying which geophysical parameter would be influenced by the seabed type under consideration. We tested our approach in a 28 km2 area of Maltese coastal waters. Three data sets were collected from this study area: (i) MBES bathymetry and backscatter data; (ii) Remotely Operated Vehicle imagery and (iii) photographs and sediment samples from dive surveys. The seabed was classified into five elementary morphological zones and features - flat and sloping zones, crests, depressions and breaks of slope - using morphometric derivatives, the Bathymetric Position Index and geomorphometric mapping. Segmentation of the study area into seagrass-covered and unvegetated seafloor was based on roughness estimation. Further subdivision of these classes into the four predominant types of composition - medium sand, maërl associated with sand and gravel
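    The Bathymetric Position Index used above to separate crests, depressions and flat or sloping zones compares each cell's depth with the mean of an annulus neighbourhood: positive values flag crests, negative values depressions, near-zero values flats or slopes. A naive pure-Python sketch (the annulus radii and the toy grid are illustrative, not the study's parameters):

```python
import math

def bpi(grid, r_in, r_out):
    # BPI = cell value minus the mean of cells whose distance from
    # the cell lies within [r_in, r_out].
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            vals = [grid[j][i]
                    for j in range(rows) for i in range(cols)
                    if r_in <= math.hypot(i - x, j - y) <= r_out]
            if vals:
                out[y][x] = grid[y][x] - sum(vals) / len(vals)
    return out

grid = [[0.0] * 5 for _ in range(5)]
grid[2][2] = 1.0                     # a single crest in a flat plain
b = bpi(grid, r_in=1.0, r_out=2.0)   # b[2][2] > 0 flags the crest
```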

  3. Exotic Gauge Bosons in the 331 Model

    SciTech Connect

    Romero, D.; Ravinez, O.; Diaz, H.; Reyes, J.

    2009-04-30

    We analyze the bosonic sector of the 331 model, which contains exotic leptons, quarks and bosons (E,J,U,V), in order to satisfy the weak gauge SU(3){sub L} invariance. We develop the Feynman rules for the entire kinetic bosonic sector, which will allow us to compute some of the Z(0)' decay modes.

  4. A physical approach of vulnerability mapping in karst area based on a new interpretation of tracer breakthrough curves

    NASA Astrophysics Data System (ADS)

    Bailly-Comte, V.; Pistre, S.

    2011-12-01

    Strategies for groundwater protection mostly use vulnerability maps to contamination; therefore, many methods have been developed since the 1990s, with particular attention to operational techniques. These easy-to-use methods are based on the superposition of relative rating systems applied to critical hydrogeological factors; their major drawback is the subjectivity in determining the rating scales and weighting coefficients. Thus, in addition to vulnerability mapping, empirical results from tracer tests are often needed to better assess groundwater vulnerability to accidental contamination in complex hydrosystems such as karst aquifers. This means that many data on tracer breakthrough curves (BTCs) in karst areas are now available to hydrologists. In this context, we propose a physical approach to spatially distributed simulation of tracer BTCs based on macrodispersive transport in 1D. A new interpretation of tracer tests performed in various media is shown as a validation of our theoretical development. The vulnerability map is then given by the properties of the simulated tracer BTC (modal time, mean residence time, duration over a given concentration threshold, etc.). In this way, our method expresses vulnerability with units, which makes comparison from one system to another possible. In addition, previous or new tracer tests can be used to validate the map under the same hydrological conditions. Although this methodology is not limited to karst hydrosystems, it seems particularly suitable for these complex environments, for which understanding the origin of accidental contamination is crucial.
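    A minimal sketch of the 1D macrodispersive transport idea: for an instantaneous injection at x = 0, the breakthrough curve at distance x follows the classical advection-dispersion solution C(x,t) = M/√(4πDt) · exp(−(x − vt)²/(4Dt)), from which map attributes such as modal time and duration above a concentration threshold can be extracted. The velocity, dispersion coefficient and threshold below are illustrative assumptions, not values from the paper.

```python
import math

def btc(x, t, v, D, mass=1.0):
    # Concentration at distance x and time t for an instantaneous
    # injection, 1-D advection-dispersion (macrodispersive) transport.
    if t <= 0:
        return 0.0
    return (mass / math.sqrt(4 * math.pi * D * t)
            * math.exp(-(x - v * t) ** 2 / (4 * D * t)))

# Simulated curve at a spring 100 m away (assumed v in m/h, D in m^2/h).
x, v, D = 100.0, 5.0, 20.0
times = [i * 0.1 for i in range(1, 600)]
curve = [btc(x, t, v, D) for t in times]

modal_time = times[curve.index(max(curve))]            # vulnerability: arrival
over_threshold = sum(0.1 for c in curve if c > 0.005)  # hours above threshold
```

    Each grid cell of the vulnerability map would carry such attributes, in physical units, rather than a dimensionless rating.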

  5. An energy balance approach for mapping crop waterstress and yield impacts over the Czech Republic

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There is a growing demand for timely, spatially distributed information regarding crop condition and water use to inform agricultural decision making and yield forecasting efforts. Remote sensing of land-surface temperature has proven valuable for mapping evapotranspiration (ET) and crop stress from...

  6. An Innovative Approach to Scheme Learning Map Considering Tradeoff Multiple Objectives

    ERIC Educational Resources Information Center

    Lin, Yu-Shih; Chang, Yi-Chun; Chu, Chih-Ping

    2016-01-01

    An important issue in personalized learning is to provide learners with customized learning according to their learning characteristics. This paper focused attention on scheming learning map as follows. The learning goal can be achieved via different pathways based on alternative materials, which have the relationships of prerequisite, dependence,…

  7. Using Concept Maps to Assess Change in Teachers' Understandings of Algebra: A Respectful Approach

    ERIC Educational Resources Information Center

    Hough, Sarah; O'Rode, Nancy; Terman, Nancy; Weissglass, Julian

    2007-01-01

    The purpose of this study was to explore teachers' growth in understanding of algebra using concept maps. The study was set in the context of a five-year National Science Foundation funded teacher retention and renewal professional development project. In this project both beginning and experienced teachers are supported as they increase their…

  8. Putting vulnerability to climate change on the map: a review of approaches, benefits, and risks

    SciTech Connect

    Preston, Benjamin L

    2011-01-01

    There is growing demand among stakeholders across public and private institutions for spatially-explicit information regarding vulnerability to climate change at the local scale. However, the challenges associated with mapping the geography of climate change vulnerability are non-trivial, both conceptually and technically, suggesting the need for more critical evaluation of this practice. Here, we review climate change vulnerability mapping in the context of four key questions that are fundamental to assessment design. First, what are the goals of the assessment? A review of published assessments yields a range of objective statements that emphasize problem orientation or decision-making about adaptation actions. Second, how is the assessment of vulnerability framed? Assessments vary with respect to what values are assessed (vulnerability of what) and the underlying determinants of vulnerability that are considered (vulnerability to what). The selected frame ultimately influences perceptions of the primary driving forces of vulnerability as well as preferences regarding management alternatives. Third, what are the technical methods by which an assessment is conducted? The integration of vulnerability determinants into a common map remains an emergent and subjective practice associated with a number of methodological challenges. Fourth, who participates in the assessment and how will it be used to facilitate change? Assessments are often conducted under the auspices of benefiting stakeholders, yet many lack direct engagement with stakeholders. Each of these questions is reviewed in turn by drawing on an illustrative set of 45 vulnerability mapping studies appearing in the literature. A number of pathways for placing vulnerability

  9. Mapping a Mutation in "Caenorhabditis elegans" Using a Polymerase Chain Reaction-Based Approach

    ERIC Educational Resources Information Center

    Myers, Edith M.

    2014-01-01

    Many single nucleotide polymorphisms (SNPs) have been identified within the "Caenorhabditis elegans" genome. SNPs present in the genomes of two isogenic "C. elegans" strains have been routinely used as a tool in forward genetics to map a mutation to a particular chromosome. This article describes a laboratory exercise in which…

  10. A Robust Approach for Mapping Group Marks to Individual Marks Using Peer Assessment

    ERIC Educational Resources Information Center

    Spatar, Ciprian; Penna, Nigel; Mills, Henny; Kutija, Vedrana; Cooke, Martin

    2015-01-01

    Group work can form a substantial component of degree programme assessments. To satisfy institutional and student expectations, students must often be assigned individual marks for their contributions to the group project, typically by mapping a single holistic group mark to individual marks using peer assessment scores. Since the early 1990s,…

  11. Effects of Concept Mapping Instruction Approach on Students' Achievement in Basic Science

    ERIC Educational Resources Information Center

    Ogonnaya, Ukpai Patricia; Okafor, Gabriel; Abonyi, Okechukwu S.; Ugama, J. O.

    2016-01-01

    The study investigated the effects of concept mapping on students' achievement in basic science. The study was carried out in Ebonyi State of Nigeria. The study employed a quasi-experimental design. Specifically the pretest posttest non-equivalent control group research design was used. The sample was 122 students selected from two secondary…

  12. Evaluation of data analytic approaches to generating cross-domain mappings of controlled science vocabularies

    NASA Astrophysics Data System (ADS)

    Zednik, S.

    2015-12-01

    Recent data publication practices have made increasing amounts of diverse datasets available online for the general research community to explore and integrate. Even with the abundance of data online, relevant data discovery and successful integration are still highly dependent upon the data being published with well-formed and understandable metadata. Tagging a dataset with well-known or controlled community terms is a common mechanism to indicate the intended purpose, subject matter, or other relevant facts of a dataset; however, controlled domain terminology can be difficult for cross-domain researchers to interpret and leverage. It is also a challenge for integration portals to successfully provide cross-domain search capabilities over data holdings described using many different controlled vocabularies. Mappings between controlled vocabularies can be challenging because communities frequently develop specialized terminologies and have highly specific and contextual usages of common words. Despite this specificity it is highly desirable to produce cross-domain mappings to support data integration. In this contribution we evaluate the applicability of several data analytic techniques for the purpose of generating mappings between hierarchies of controlled science terms. We hope our efforts initiate more discussion on the topic and encourage future mapping efforts.
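    One simple data-analytic baseline for such cross-vocabulary mappings is lexical overlap between term labels. The sketch below proposes a candidate mapping for every source term whose best Jaccard token match clears a threshold; the term lists (in the style of Earth-science keyword vocabularies) and the threshold are illustrative assumptions.

```python
def jaccard(a, b):
    # Token-set Jaccard similarity between two term labels.
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def best_mappings(src_terms, dst_terms, threshold=0.3):
    # Map each source term to its best lexical match if the score
    # clears the threshold; otherwise leave it unmapped for review.
    out = {}
    for s in src_terms:
        best = max(dst_terms, key=lambda d: jaccard(s, d))
        if jaccard(s, best) >= threshold:
            out[s] = best
    return out

src = ["sea surface temperature", "soil moisture", "aerosol optical depth"]
dst = ["surface temperature of sea water", "moisture content of soil layer"]
m = best_mappings(src, dst)
```

    Real evaluations would compare several such scores (string, structural, co-occurrence based) against human-curated mappings.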

  13. Mapping Patterns of Perceptions: A Community-Based Approach to Cultural Competence Assessment

    ERIC Educational Resources Information Center

    Davis, Tamara S.

    2007-01-01

    Unclear definitions and limited system-level assessment measures inhibit cultural responsiveness in children's mental health. This study explores an alternative method to conceptualize and assess cultural competence in four children's mental health systems of care communities from family and professional perspectives. Concept Mapping was used to…

  14. Mapping the Druggable Allosteric Space of G-Protein Coupled Receptors: a Fragment-Based Molecular Dynamics Approach

    PubMed Central

    Ivetac, Anthony; Andrew McCammon, J

    2010-01-01

    To address the problem of specificity in G-protein coupled receptor (GPCR) drug discovery, there has been tremendous recent interest in allosteric drugs that bind at sites topographically distinct from the orthosteric site. Unfortunately, structure-based drug design of allosteric GPCR ligands has been frustrated by the paucity of structural data for allosteric binding sites, making a strong case for predictive computational methods. In this work, we map the surfaces of the β1 (β1AR) and β2 (β2AR) adrenergic receptor structures to detect a series of five potentially druggable allosteric sites. We employ the FTMAP algorithm to identify ‘hot spots’ with affinity for a variety of organic probe molecules corresponding to drug fragments. Our work is distinguished by an ensemble-based approach, whereby we map diverse receptor conformations taken from molecular dynamics (MD) simulations totaling approximately 0.5 μs. Our results reveal distinct pockets formed at both solvent-exposed and lipid-exposed cavities, which we interpret in light of experimental data and which may constitute novel targets for GPCR drug discovery. This mapping data can now serve to drive a combination of fragment-based and virtual screening approaches for the discovery of small molecules that bind at these sites and which may offer highly selective therapies. PMID:20626410

  15. A novel lidar-driven two-level approach for real-time unmanned ground vehicle navigation and map building

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Cui, Bo; Zhang, Xingzhong

    2013-12-01

    In this paper, a two-level LIDAR-driven hybrid approach is proposed for real-time unmanned ground vehicle navigation and map building. The top level is a newly designed enhanced Voronoi diagram (EVD) method that plans a global trajectory for an unmanned vehicle. The bottom level employs the Vector Field Histogram (VFH) algorithm, based on LIDAR sensor information, to locally guide the vehicle through a complicated workspace, in which it autonomously traverses from one node to another within the planned EVD with obstacle avoidance. To find the least-cost path within the EVD, novel distance- and angle-based search heuristics are developed, in which the cost of an edge is the risk of traversing it. An EVD is first constructed from the environment and used to generate the initial global trajectory with obstacle avoidance; the VFH algorithm then guides the vehicle along the path locally. The effectiveness and efficiency of the approach for real-time navigation and map building have been validated by simulation studies and by experiments on an actual unmanned vehicle, where the vehicle followed a very stable path while navigating through various obstacles.
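    The bottom-level VFH idea can be sketched compactly: bin one LIDAR sweep into a polar obstacle-density histogram, weighting nearer returns more heavily, then steer toward the emptiest sector. The sector count, maximum range and weighting below are illustrative assumptions, not the paper's tuning.

```python
def vfh_steer(ranges, n_sectors=12, max_range=10.0):
    # ranges[i] is the distance measured at beam i of a 360-degree
    # sweep. Build a polar obstacle-density histogram and return the
    # centre angle (degrees) of the least-occupied sector.
    n_beams = len(ranges)
    hist = [0.0] * n_sectors
    for i, r in enumerate(ranges):
        sector = i * n_sectors // n_beams
        # Closer obstacles contribute larger "certainty" weights.
        hist[sector] += max(0.0, max_range - min(r, max_range))
    best = min(range(n_sectors), key=lambda s: hist[s])
    return (best + 0.5) * (360.0 / n_sectors), hist

# Cluttered sweep with a clear gap around 195 degrees.
ranges = [2.0] * 180 + [9.5] * 30 + [2.0] * 150
heading, hist = vfh_steer(ranges)
```

    In the two-level scheme, the chosen heading would be biased toward the next EVD node of the global trajectory.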

  16. A 'visual-centred' mapping approach for improving access to Web 2.0 for people with visual impairments.

    PubMed

    Jay, Caroline; Brown, Andrew; Harper, Simon

    2011-01-01

    On simple Web pages, the text to speech translation provided by a screen reader works relatively well. This is not the case for more sophisticated 'Web 2.0' pages, in which many interactive visual features, such as tickers, tabs, auto-suggest lists, calendars and slideshows currently remain inaccessible. Determining how to present these in audio is challenging in general, but may be particularly so for certain groups, such as people with congenital or early-onset blindness, as they are not necessarily familiar with the visual interaction metaphors that are involved. This article describes an evaluation of an audio Web browser designed using a novel approach, whereby visual content is translated to audio using algorithms derived from observing how sighted users interact with it. Both quantitative and qualitative measures showed that all participants, irrespective of the onset of their visual impairment, preferred the visual interaction-based audio mappings. Participants liked the fact that the mappings made the dynamic content truly accessible, rather than merely available to those who could find it, as is presently the case. The results indicate that this 'visual-centred' mapping approach may prove to be a suitable technique for translating complex visual content to audio, even for users with early-onset visual disabilities. PMID:20575753

  17. Near Real-time Visualization of the Coastal Ocean: A Google Maps Approach

    NASA Astrophysics Data System (ADS)

    Terrill, E.; Reuter, P.; Hazard, L.; Otero, M.; Cook, T.; Bowen, J.

    2009-12-01

    The Coastal Observing R&D Center (CORDC) at Scripps Institution of Oceanography has developed and implemented real-time data management and display tools for use in the Google Maps environment. A primary use of these tools is for displaying data measured, aggregated, and distributed by a regional observing system. CORDC developed and continues to maintain these tools that are now in use by a broad suite of end users, including local, state and federal agencies, resource managers, industry, policy makers, educators, scientists and the general public for the Southern California Coastal Ocean Observing System (SCCOOS). These data feeds encompass the ongoing monitoring of a broad suite of ocean observing data including, but not limited to: surface currents, satellite imagery, wave conditions and forecasts, meteorological conditions and forecasts, water quality, bathymetry, ocean temperature, salinity, chlorophyll, and density in the form of data products and raw data. By leveraging Google Maps, this effort has achieved seamless integration of disparate datasets into a unifying, low-latency interface for on-line visualization and interaction. The resulting interfaces have brought national attention to the public display of data that SCCOOS serves, notably for ease of use and navigation. While the Google Maps API provides basic capabilities for spatially zooming and panning, developers are able to extend the API to include customized temporal browsing of spatial maps, info displays, legends, dynamic color scaling and interactive data queries resulting in time series or other point/slice plots. Visualizations of products with more mature Google Maps interfaces will be presented, including statewide ocean surface currents, meteorological observations, ship tracking and output from operational ocean current models, which pose the additional challenges of 4D data sets. Current developments involving product layering and API extensions will also be presented.

  18. A Karnaugh map based approach towards systemic reviews and meta-analysis.

    PubMed

    Hassan, Abdul Wahab; Hassan, Ahmad Kamal

    2016-01-01

    Meta-analyses and systemic reviews have long helped us draw conclusions from numerous parallel or conflicting studies. Existing studies are presented in tabulated forms which contain appropriate information for specific cases yet are difficult to visualize. In meta-analysis of data, this can lead to absorption and subsumption errors, with the undesirable potential for subsequent misunderstandings in social and operational methodologies. The purpose of this study is to investigate an alternative forum for meta-data presentation that relies on humans' strong pictorial perception capability. Analysis of big data is assumed to be a complex and daunting task often reserved for the computational power of machines, yet there exist mapping tools which can analyze such data by hand. Data analysis on such a scale can benefit from the use of statistical tools like Karnaugh maps, where all studies can be put together in a graph-based mapping. Such a formulation can lead to more control in observing patterns in the research community and analyzing further for uncertainty and reliability metrics. We present a methodological process of converting a well-established study in health care to its equivalent binary representation, followed by furnishing the values onto a Karnaugh map. The data used for the studies presented herein is from Burns et al (J Publ Health 34(1):138-148, 2011), consisting of retrospectively collected data sets from various studies on clinical coding data accuracy. Using a customized filtration process, a total of 25 studies were selected for review with no, partial, or complete knowledge of six independent variables, thus forming 64 independent cells on a Karnaugh map. The study concluded that this pictorial graphing, as expected, helped simplify the overview of meta-analyses and systemic reviews. PMID:27064957
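    The 64-cell layout from six binary variables can be sketched with Gray-code indexing, the property that makes adjacent Karnaugh-map cells differ in exactly one variable. The split of the six variables into rows and columns below is an illustrative choice, not the paper's exact layout.

```python
def gray(n):
    # Binary-reflected Gray code of n.
    return n ^ (n >> 1)

def kmap_cell(bits):
    # Place a study described by six binary variables on an 8x8
    # Karnaugh map: rows are Gray-coded from the first three
    # variables, columns from the last three, so horizontally or
    # vertically adjacent cells differ in exactly one variable.
    assert len(bits) == 6
    row_val = int("".join(map(str, bits[:3])), 2)
    col_val = int("".join(map(str, bits[3:])), 2)
    grays = [gray(i) for i in range(8)]   # [0, 1, 3, 2, 6, 7, 5, 4]
    return grays.index(row_val), grays.index(col_val)

r, c = kmap_cell([1, 0, 1, 0, 0, 1])
```

    Plotting each of the 25 reviewed studies into its cell then exposes clusters and gaps in the evidence at a glance.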

  19. Geospatial Predictive Modelling for Climate Mapping of Selected Severe Weather Phenomena Over Poland: A Methodological Approach

    NASA Astrophysics Data System (ADS)

    Walawender, Ewelina; Walawender, Jakub P.; Ustrnul, Zbigniew

    2016-02-01

    The main purpose of the study is to introduce methods for mapping the spatial distribution of the occurrence of selected atmospheric phenomena (thunderstorms, fog, glaze and rime) over Poland from 1966 to 2010 (45 years). Limited in situ observations, as well as the discontinuous and location-dependent nature of these phenomena, make traditional interpolation inappropriate. Spatially continuous maps were created with the use of geospatial predictive modelling techniques. For each given phenomenon, an algorithm identifying its favourable meteorological and environmental conditions was created on the basis of observations recorded at 61 weather stations in Poland. Annual frequency maps presenting the probability of a day with a thunderstorm, fog, glaze or rime were created with the use of a modelled, gridded dataset by implementing the predefined algorithms. Relevant explanatory variables were derived from NCEP/NCAR reanalysis and downscaled with the use of a Regional Climate Model. The resulting maps of favourable meteorological conditions were found to be valuable and representative on the country scale, though with varying correlation (r) strengths against in situ data (from r = 0.84 for thunderstorms to r = 0.15 for fog). The weak correlation between gridded estimates of fog occurrence and observational data indicated the very local nature of this phenomenon. For this reason, additional environmental predictors of fog occurrence were also examined. Topographic parameters derived from the SRTM elevation model and reclassified CORINE Land Cover data were used as external explanatory variables for the multiple linear regression kriging used to obtain the final map. The regression model explained 89 % of the variability in annual fog frequency in the study area. Regression residuals were interpolated via simple kriging.

  20. Gaussian Multiple Instance Learning Approach for Mapping the Slums of the World Using Very High Resolution Imagery

    SciTech Connect

    Vatsavai, Raju

    2013-01-01

    In this paper, we present a computationally efficient algorithm based on multiple instance learning for mapping informal settlements (slums) using very high-resolution remote sensing imagery. From a remote sensing perspective, informal settlements share unique spatial characteristics that distinguish them from other urban structures like industrial, commercial, and formal residential settlements. However, regular pattern recognition and machine learning methods, which are predominantly single-instance or per-pixel classifiers, often fail to accurately map the informal settlements as they do not capture the complex spatial patterns. To overcome these limitations we employed a multiple instance based machine learning approach, where groups of contiguous pixels (image patches) are modeled as generated by a Gaussian distribution. We have conducted several experiments on very high-resolution satellite imagery, representing four unique geographic regions across the world. Our method showed consistent improvement in accurately identifying informal settlements.
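    The core multiple-instance idea above, treating a whole patch rather than single pixels as the classification unit, can be sketched with one-band toy data: fit a Gaussian per class from training patches, then label a new patch by which class gives it the higher log-likelihood. The single-band diagonal Gaussians and the reflectance values below are illustrative simplifications of the paper's model.

```python
import math

def fit_gaussian(patches):
    # Pool all pixels of the training patches and fit mean/variance.
    pixels = [px for patch in patches for px in patch]
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return mean, max(var, 1e-6)

def patch_loglik(patch, model):
    # Log-likelihood of the whole patch under the class Gaussian.
    mean, var = model
    return sum(-0.5 * math.log(2 * math.pi * var)
               - (p - mean) ** 2 / (2 * var) for p in patch)

def classify(patch, informal, formal):
    return ("informal" if patch_loglik(patch, informal)
            > patch_loglik(patch, formal) else "formal")

informal = fit_gaussian([[0.2, 0.25, 0.3, 0.22], [0.28, 0.21, 0.24, 0.26]])
formal = fit_gaussian([[0.7, 0.75, 0.68, 0.72], [0.66, 0.74, 0.7, 0.69]])
label = classify([0.24, 0.27, 0.23, 0.25], informal, formal)
```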

  1. A Monte Carlo simulation based two-stage adaptive resonance theory mapping approach for offshore oil spill vulnerability index classification.

    PubMed

    Li, Pu; Chen, Bing; Li, Zelin; Zheng, Xiao; Wu, Hongjing; Jing, Liang; Lee, Kenneth

    2014-09-15

    In this paper, a Monte Carlo simulation based two-stage adaptive resonance theory mapping (MC-TSAM) model was developed to classify a given site into distinguished zones representing different levels of offshore Oil Spill Vulnerability Index (OSVI). It consisted of an adaptive resonance theory (ART) module, an ART Mapping module, and a centroid determination module. Monte Carlo simulation was integrated with the TSAM approach to address uncertainties that widely exist in site conditions. The applicability of the proposed model was validated by classifying a large coastal area, which was surrounded by potential oil spill sources, based on 12 features. Statistical analysis of the results indicated that the classification process was affected by multiple features instead of one single feature. The classification results also provided the least or desired number of zones which can sufficiently represent the levels of offshore OSVI in an area under uncertainty and complexity, saving time and budget in spill monitoring and response. PMID:25044043
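    The ART module at the heart of the model above admits or creates categories with a vigilance test. The following is a leader-follower sketch in the spirit of fuzzy ART's vigilance/fast-learning rules, not the MC-TSAM implementation; the sample vectors and vigilance value are illustrative.

```python
def art_cluster(samples, vigilance=0.8):
    # Each sample joins the first prototype whose match score clears
    # the vigilance threshold; otherwise it founds a new category.
    protos, labels = [], []
    for s in samples:
        placed = False
        for k, p in enumerate(protos):
            match = sum(min(a, b) for a, b in zip(s, p)) / sum(s)
            if match >= vigilance:
                # Fast learning: shrink the prototype toward the sample.
                protos[k] = [min(a, b) for a, b in zip(s, p)]
                labels.append(k)
                placed = True
                break
        if not placed:
            protos.append(list(s))
            labels.append(len(protos) - 1)
    return labels, protos

samples = [[0.9, 0.1], [0.85, 0.15], [0.1, 0.9], [0.12, 0.88]]
labels, protos = art_cluster(samples)
```

    In the two-stage scheme, Monte Carlo perturbations of the site features would be pushed through such clustering repeatedly to express classification uncertainty.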

  2. Dynamics of open bosonic quantum systems in coherent state representation

    SciTech Connect

    Dalvit, D. A. R.; Berman, G. P.; Vishik, M.

    2006-01-15

    We consider the problem of decoherence and relaxation of open bosonic quantum systems from a perspective alternative to the standard master equation or quantum trajectories approaches. Our method is based on the dynamics of expectation values of observables evaluated in a coherent state representation. We examine a model of a quantum nonlinear oscillator with a density-density interaction with a collection of environmental oscillators at finite temperature. We derive the exact solution for dynamics of observables and demonstrate a consistent perturbation approach.

  3. Molecular Property eXplorer: a novel approach to visualizing SAR using tree-maps and heatmaps.

    PubMed

    Kibbey, Christopher; Calvet, Alain

    2005-01-01

    The tremendous increase in chemical structure and biological activity data brought about through combinatorial chemistry and high-throughput screening technologies has created the need for sophisticated graphical tools for visualizing and exploring structure-activity data. Visualization plays an important role in exploring and understanding relationships within such multidimensional data sets. Many chemoinformatics software applications apply standard clustering techniques to organize structure-activity data, but they differ significantly in their approaches to visualizing clustered data. Molecular Property eXplorer (MPX) is unique in its presentation of clustered data in the form of heatmaps and tree-maps. MPX employs agglomerative hierarchical clustering to organize data on the basis of the similarity between 2D chemical structures or similarity across a predefined profile of biological assay values. Visualization of hierarchical clusters as tree-maps and heatmaps provides simultaneous representation of cluster members along with their associated assay values. Tree-maps convey both the spatial relationship among cluster members and the value of a single property (activity) associated with each member. Heatmaps provide visualization of the cluster members across an activity profile. Unlike a tree-map, however, a heatmap does not convey the spatial relationship between cluster members. MPX seamlessly integrates tree-maps and heatmaps to represent multidimensional structure-activity data in a visually intuitive manner. In addition, MPX provides tools for clustering data on the basis of chemical structure or activity profile, displaying 2D chemical structures, and querying the data over a specified activity range or set of chemical structure criteria (e.g., Tanimoto similarity, substructure match, and "R-group" analysis).
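    The agglomerative hierarchical clustering that MPX applies can be sketched in its simplest single-linkage form: start with every item as its own cluster and repeatedly merge the two clusters whose closest members are nearest. The 1-D values below stand in for structure- or activity-similarity distances and are purely illustrative.

```python
def single_linkage(points, n_clusters):
    # Naive agglomerative clustering on 1-D values: merge the pair of
    # clusters with the smallest closest-member (single-link) distance
    # until n_clusters remain.
    clusters = [[p] for p in points]

    def dist(a, b):
        return min(abs(x - y) for x in a for y in b)

    while len(clusters) > n_clusters:
        i, j = min(
            ((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
            key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] += clusters.pop(j)
    return clusters

groups = single_linkage([0.1, 0.15, 0.9, 0.95, 0.5], 3)
```

    The full merge history (not just one cut) is what a tree-map or heatmap dendrogram would render.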

  4. A spatial, statistical approach to map the risk of milk contamination by β-hexachlorocyclohexane in dairy farms.

    PubMed

    Battisti, Sabrina; Caminiti, Antonino; Ciotoli, Giancarlo; Panetta, Valentina; Rombolà, Pasquale; Sala, Marcello; Ubaldi, Alessandro; Scaramozzino, Paola

    2013-11-01

    In May 2005, beta-hexachlorocyclohexane (β-HCH) was found in a sample of bovine bulk milk from a farm in the Sacco River valley (Latium region, central Italy). The primary source of contamination was suspected to be industrial discharge into the environment, with the Sacco River as the main means of dispersion. Since then, a surveillance programme on bulk milk from the local farms was carried out by the veterinary services. To estimate the spatial probability of β-HCH contamination of milk produced in the Sacco River valley and to draw probability maps of contamination, β-HCH values in milk were modelled by indicator kriging (IK), a geo-statistical estimator, and by traditional logistic regression (LR) combined with a geographical information systems approach. The former technique produces a spatial view of probabilities above a specific threshold at non-sampled locations on the basis of observed values in the area, while LR gives the probabilities at specific locations on the basis of certain environmental predictors, namely the distance from the river, the distance from the pollution site, the elevation above the river level and the intrinsic vulnerability of hydro-geological formations. Based on the β-HCH data from 2005 in the Sacco River valley, the two techniques resulted in similar maps of high risk of milk contamination. However, unlike the IK method, the LR model was capable of estimating coefficients that could be used in case of future pollution episodes. The approach presented produces probability maps and defines high-risk areas in the early stages of an emergency, before sampling operations have been carried out.
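    The logistic-regression half of the approach can be sketched as follows. The predictors mirror those named above, but the data and coefficients are synthetic illustrations, not the study's fitted values.

```python
# Sketch: probability of contamination from environmental predictors via
# logistic regression. All numbers here are synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
# Hypothetical per-farm predictors: distance to river (km),
# distance to pollution source (km), elevation above river (m)
X = np.column_stack([rng.uniform(0, 5, n),
                     rng.uniform(0, 20, n),
                     rng.uniform(0, 50, n)])
# Synthetic labels: contamination more likely close to river and source
logit = 1.5 - 1.0 * X[:, 0] - 0.1 * X[:, 1] - 0.02 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, y)
# Evaluating predict_proba on a grid of locations yields a probability map
p = model.predict_proba(X)[:, 1]
print(p.min(), p.max())
```

    Unlike kriging, the fitted coefficients transfer to new locations, which is exactly the advantage the abstract notes for future pollution episodes.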

  5. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach.

    PubMed

    Lawrence, Emma; Hayes, Keith R; Lucieer, Vanessa L; Nichol, Scott L; Dambacher, Jeffrey M; Hill, Nicole A; Barrett, Neville; Kool, Johnathan; Siwabessy, Justy

    2015-01-01

    The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf as multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time consuming in shallow depths, so full coverage mapping of the CMR shelf assets is unrealistic in the medium-term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats that are potentially representative of the wider area, versus; (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that both survey approaches offer in combination with the three MBES analysis methods. 
The GRTS approach enables design-based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve's IUCN zone IV, and in

  6. Mapping Habitats and Developing Baselines in Offshore Marine Reserves with Little Prior Knowledge: A Critical Evaluation of a New Approach

    PubMed Central

    Lawrence, Emma; Hayes, Keith R.; Lucieer, Vanessa L.; Nichol, Scott L.; Dambacher, Jeffrey M.; Hill, Nicole A.; Barrett, Neville; Kool, Johnathan; Siwabessy, Justy

    2015-01-01

    The recently declared Australian Commonwealth Marine Reserve (CMR) Network covers a total of 3.1 million km2 of continental shelf, slope, and abyssal habitat. Managing and conserving the biodiversity values within this network requires knowledge of the physical and biological assets that lie within its boundaries. Unfortunately very little is known about the habitats and biological assemblages of the continental shelf within the network, where diversity is richest and anthropogenic pressures are greatest. Effective management of the CMR estate into the future requires this knowledge gap to be filled efficiently and quantitatively. The challenge is particularly great for the shelf as multibeam echosounder (MBES) mapping, a key tool for identifying and quantifying habitat distribution, is time consuming in shallow depths, so full coverage mapping of the CMR shelf assets is unrealistic in the medium-term. Here we report on the results of a study undertaken in the Flinders Commonwealth Marine Reserve (southeast Australia) designed to test the benefits of two approaches to characterising shelf habitats: (i) MBES mapping of a continuous (~30 km2) area selected on the basis of its potential to include a range of seabed habitats that are potentially representative of the wider area, versus; (ii) a novel approach that uses targeted mapping of a greater number of smaller, but spatially balanced, locations using a Generalized Random Tessellation Stratified sample design. We present the first quantitative estimates of habitat type and sessile biological communities on the shelf of the Flinders reserve, the former based on three MBES analysis techniques. We contrast the quality of information that both survey approaches offer in combination with the three MBES analysis methods. The GRTS approach enables design based estimates of habitat types and sessile communities and also identifies potential biodiversity hotspots in the northwest corner of the reserve’s IUCN zone IV, and

  8. A symbiotic approach to SETI observations: use of maps from the Westerbork Synthesis Radio Telescope

    NASA Technical Reports Server (NTRS)

    Tarter, J. C.; Israel, F. P.

    1982-01-01

    High spatial resolution continuum radio maps produced by the Westerbork Synthesis Radio Telescope (WSRT) of The Netherlands at frequencies near the 21 cm HI line have been examined for anomalous sources of emission coincident with the locations of nearby bright stars. From a total of 542 stellar positions investigated, no candidates for radio stars or ETI signals were discovered to formal limits on the minimum detectable signal ranging from 7.7 x 10^-22 W/m^2 to 6.4 x 10^-24 W/m^2. This preliminary study has verified that data collected by radio astronomers at large synthesis arrays can profitably be analysed for SETI signals (in a non-interfering manner) provided only that the data are available in the form of a more or less standard two-dimensional map format.

  9. A symbiotic approach to SETI observations: use of maps from the Westerbork Synthesis Radio Telescope.

    PubMed

    Tarter, J C; Israel, F P

    1982-01-01

    High spatial resolution continuum radio maps produced by the Westerbork Synthesis Radio Telescope (WSRT) of The Netherlands at frequencies near the 21 cm HI line have been examined for anomalous sources of emission coincident with the locations of nearby bright stars. From a total of 542 stellar positions investigated, no candidates for radio stars or ETI signals were discovered to formal limits on the minimum detectable signal ranging from 7.7 x 10^-22 W/m^2 to 6.4 x 10^-24 W/m^2. This preliminary study has verified that data collected by radio astronomers at large synthesis arrays can profitably be analysed for SETI signals (in a non-interfering manner) provided only that the data are available in the form of a more or less standard two-dimensional map format.

  10. Mapping a mutation in Caenorhabditis elegans using a polymerase chain reaction-based approach.

    PubMed

    Myers, Edith M

    2014-01-01

    Many single nucleotide polymorphisms (SNPs) have been identified within the Caenorhabditis elegans genome. SNPs present in the genomes of two isogenic C. elegans strains have been routinely used as a tool in forward genetics to map a mutation to a particular chromosome. This article describes a laboratory exercise in which undergraduate students use molecular biological techniques to map a mutation to a chromosome using a set of SNPs. Through this multi-week exercise, students perform genetic crosses, DNA extraction, polymerase chain reaction, restriction enzyme digests, agarose gel electrophoresis, and analysis of restriction fragment length polymorphisms. Students then analyze their results to deduce the chromosomal location of the mutation. Students also use bioinformatics websites to develop hypotheses that link the genotype to the phenotype. PMID:24615818

  11. Weather maps classification over Greek domain based on isobaric line patterns. A pattern recognition approach

    NASA Astrophysics Data System (ADS)

    Zagouras, Athanassios; Argiriou, Athanassios A.; Economou, George; Fotopoulos, Spiros; Flocas, Helena A.

    2013-11-01

    The paper presents a semi-supervised weather classification method based on 850-hPa isobaric level maps. A preprocessing step is employed, where isolines of geopotential height are extracted from weather map images via an image processing procedure. A feature extraction stage follows, where two techniques are applied. The first technique implements phase space reconstruction and yields multidimensional delay distributions. The second technique is based on chain code representation of signals, from which histogram features are derived. Similarity measures are used to compare multidimensional data, and the k-means algorithm is applied in the final stage. The method is applied over the area of Greece, and the resulting catalogues are compared to a subjective classification for this area. Numerical experiments with datasets derived from the European Meteorological Bulletin archives show agreement of up to 91% with the subjective weather patterns.
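    The chain-code histogram feature mentioned above can be sketched generically. The 8-direction Freeman encoding and the toy contour below are illustrative assumptions, not the authors' exact representation.

```python
# Sketch: encode an isoline as Freeman chain codes between successive
# points, then histogram the codes into a rotation-sensitive feature vector.
import numpy as np

# 8-direction Freeman codes for unit steps: (dx, dy) -> code
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code_histogram(points):
    codes = []
    for (x0, y0), (x1, y1) in zip(points[:-1], points[1:]):
        step = (int(np.sign(x1 - x0)), int(np.sign(y1 - y0)))
        codes.append(DIRS[step])
    hist = np.bincount(codes, minlength=8)
    return hist / hist.sum()          # normalised histogram feature

# Toy isoline: a small closed square contour
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2),
          (1, 2), (0, 2), (0, 1), (0, 0)]
print(chain_code_histogram(square))
```

    Histograms like this, computed per isoline image, are the kind of fixed-length vectors a k-means stage can then cluster.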

  12. Mass Movement Susceptibility in the Western San Juan Mountains, Colorado: A Preliminary 3-D Mapping Approach

    NASA Astrophysics Data System (ADS)

    Kelkar, K. A.; Giardino, J. R.

    2015-12-01

    Mass movement is a major process that impacts the lives of humans and their infrastructure. Human activity in steep, mountainous regions is especially at risk from this potential hazard. Thus, the identification and quantification of risk by mapping and determining mass movement susceptibility are fundamental to protecting lives and resources and to ensuring proper land-use regulation and planning. Specific mass-movement processes including debris flows, rock falls, snow avalanches and landslides continuously modify the landscape of the San Juan Mountains. Historically, large-magnitude slope failures have repeatedly occurred in the region. Common triggers include intense, long-duration precipitation, freeze-thaw processes, human activity and various volcanic lithologies overlying weaker sedimentary formations. Predicting mass movement is challenging because of its episodic and spatially discontinuous occurrence. Landslides in mountain terrain are characteristically widespread, highly mobile and active over long durations. We developed a 3-D model for landslide susceptibility using Geographic Information Systems Technology (GIST). The study area encompasses eight USGS quadrangles: Ridgway, Dallas, Mount Sneffels, Ouray, Telluride, Ironton, Ophir and Silverton. Fieldwork consisted of field reconnaissance mapping at 1:5,000 focusing on surficial geomorphology. Field mapping was used to identify potential locations, which then received additional onsite investigation and photographic documentation of features indicative of slope failure. A GIS module was created to map risk using seven terrain spatial databases: geology, surficial geomorphology (digitized), slope aspect, slope angle, vegetation, soils and distance to infrastructure. The GIS database will help determine risk zonation for the study area. Correlations between terrain parameters leading to slope failure were determined through the GIS module. 
This 3-D model will provide a spatial perspective of the landscape to

  13. Mapping the nursing process: a new approach for understanding the work of nursing.

    PubMed

    Potter, Patricia; Boxerman, Stuart; Wolf, Laurie; Marshall, Jessica; Grayson, Deborah; Sledge, Jennifer; Evanoff, Bradley

    2004-02-01

    The work of nursing is nonlinear and involves complex reasoning and clinical decision making. The use of human factors engineering (HFE) as a sole means for analyzing the work of nursing is problematic. Combining HFE analysis with qualitative observation has created a new methodology for mapping the nursing process. A cognitive pathway offers a new perspective for understanding the work of nursing and analyzing how disruptions to the nursing process may contribute to errors in the acute care environment.

  14. A Machine Learning Approach to Mapping Agricultural Fields Across Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Debats, S. R.; Fuchs, T. J.; Thompson, D. R.; Estes, L. D.; Evans, T. P.; Caylor, K. K.

    2013-12-01

    Food production in sub-Saharan Africa is dominated by smallholder agriculture. Rainfed farming practices and the prevailing dryland conditions render crop yields vulnerable to increasing climatic variability. As a result, smallholder farmers are among the poorest and most food insecure groups among the region's population. Quantifying the distribution of smallholder agriculture across sub-Saharan Africa would greatly assist efforts to boost food security. Existing agricultural land cover data sets are limited to estimating the percentage of cropland within a coarse grid cell. The goal of this research is to develop a statistical machine learning algorithm to map individual agricultural fields, mirroring the accuracy of hand-digitization. For the algorithm, a random forest pixel-wise classifier learns by example from training data to distinguish between fields and non-fields. The algorithm then applies this training to classify previously unseen data. These classifications can then be smoothed into coherent regions corresponding to agricultural fields. Our training data set consists of hand-digitized boundaries of agricultural fields in South Africa, commissioned by its government in 2008. Working with 1 km x 1 km scenes across South Africa, the hand-digitized field boundaries are matched with satellite images extracted from Google Maps. To highlight different information contained within the images, several image processing filters are applied. The inclusion of Landsat images for additional training information is also explored. After training and testing the algorithm in South Africa, we aim to expand our mapping efforts across sub-Saharan Africa. Through Princeton's Mapping Africa project, crowdsourcing will produce additional training data sets of hand-digitized field boundaries in new areas of interest. This algorithm and the resulting data sets will provide previously unavailable information at an unprecedented level of detail on the largest and most
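    A minimal sketch of such a pixel-wise random-forest classifier, with synthetic per-pixel features standing in for the satellite imagery, filter responses and hand-digitized labels described above:

```python
# Sketch: random forest that labels each pixel as field / non-field.
# Features and labels are synthetic stand-ins for real training data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n_pixels, n_features = 1000, 6           # e.g., RGB bands + texture filters
X = rng.normal(size=(n_pixels, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # toy field/non-field rule

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
pred = clf.predict(X)                    # per-pixel classification map
print((pred == y).mean())                # accuracy on the training pixels
```

    In practice the per-pixel predictions would then be smoothed into coherent regions corresponding to individual fields, as the abstract describes.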

  15. Classical mapping for Hubbard operators: application to the double-Anderson model.

    PubMed

    Li, Bin; Miller, William H; Levy, Tal J; Rabani, Eran

    2014-05-28

    A classical Cartesian mapping for Hubbard operators is developed to describe the nonequilibrium transport of an open quantum system with many electrons. The mapping of the Hubbard operators representing the many-body Hamiltonian is derived by using analogies from classical mappings of boson creation and annihilation operators vis-à-vis a coherent state representation. The approach provides qualitative results for a double quantum dot array (double Anderson impurity model) coupled to fermionic leads for a range of bias voltages, Coulomb couplings, and hopping terms. While the width and height of the conduction peaks show deviations from the master equation approach considered to be accurate in the limit of weak system-leads couplings and high temperatures, the Hubbard mapping captures all transport channels involving transition between many electron states, some of which are not captured by approximate nonequilibrium Green function closures.

  16. Assessment of Ice Shape Roughness Using a Self-Organizing Map Approach

    NASA Technical Reports Server (NTRS)

    Mcclain, Stephen T.; Kreeger, Richard E.

    2013-01-01

    Self-organizing maps are neural-network techniques for representing noisy, multidimensional data aligned along a lower-dimensional and nonlinear manifold. For a large set of noisy data, each element of a finite set of codebook vectors is iteratively moved in the direction of the data closest to the winner codebook vector. Through successive iterations, the codebook vectors begin to align with the trends of the higher-dimensional data. Prior investigations of ice shapes have focused on using self-organizing maps to characterize mean ice forms. The Icing Research Branch has recently acquired a high-resolution three-dimensional scanner system capable of resolving ice shape surface roughness. A method is presented for the evaluation of surface roughness variations using high-resolution surface scans based on a self-organizing map representation of the mean ice shape. The new method is demonstrated for 1) an 18-in. NACA 23012 airfoil at 2° AOA just after the initial ice coverage of the leading 5% of the suction surface of the airfoil, 2) a 21-in. NACA 0012 at 0° AOA following coverage of the leading 10% of the airfoil surface, and 3) a cold-soaked 21-in. NACA 0012 airfoil without ice. The SOM method resulted in descriptions of the statistical coverage limits and a quantitative representation of early stages of ice roughness formation on the airfoils. Limitations of the SOM method are explored, and the uncertainty limits of the method are investigated using the non-iced NACA 0012 airfoil measurements.
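    The winner-take-most codebook update described above can be sketched as follows. For brevity this toy version updates only the winning vector and omits the neighborhood function a full SOM would also apply.

```python
# Minimal sketch of the SOM-style update: each iteration pulls the winning
# codebook vector toward a randomly drawn data sample.
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(size=(500, 3))        # noisy 3-D measurements (toy)
codebook = rng.normal(size=(10, 3))     # 10 codebook vectors

lr = 0.5
for step in range(2000):
    x = data[rng.integers(len(data))]
    winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
    codebook[winner] += lr * (x - codebook[winner])  # move winner toward x
    lr *= 0.999                                      # decay learning rate

# The codebook now tracks the distribution of the data
print(np.round(codebook.mean(axis=0), 2))
```

    With a neighborhood function added, neighboring codebook vectors move together, which is what aligns the map with the data's underlying manifold.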

  17. How can yeast cells decide between three activated MAP kinase pathways? A model approach.

    PubMed

    Rensing, Ludger; Ruoff, Peter

    2009-04-21

    In yeast (Saccharomyces cerevisiae), the regulation of three MAP kinase pathways responding to pheromones (Fus3 pathway), carbon/nitrogen starvation (Kss1 pathway), and high osmolarity/osmotic stress (Hog1 pathway) is the subject of intensive research. We were interested in the question how yeast cells would respond when more than one of the MAP kinase pathways are activated simultaneously. Here, we give a brief overview over the regulatory mechanisms of the yeast MAP kinase pathways and investigate a kinetic model based on presently known molecular interactions and feedbacks within and between the three mitogen-activated protein kinases (MAPK) pathways. When two pathways are activated simultaneously with the osmotic stress response as one of them, the model predicts that the osmotic stress response (Hog1 pathway) is turned on first. The same is true when all three pathways are activated at the same time. When testing simultaneous stimulations by low nitrogen and pheromones through the Kss1 and Fus3 pathways, respectively, the low nitrogen response dominates over the pheromone response. Due to its autocatalytic activation mechanism, the pheromone response (Fus3 pathway) shows typical sigmoid response kinetics and excitability. In the presence of a small but sufficient amount of activated Fus3, a stimulation by pheromones will lead to a rapid self-amplification of the pheromone response. This 'excitability' appears to be a feature of the pheromone pathway that has specific biological significance. PMID:19322936

  18. Compression map, functional groups and fossilization: A chemometric approach (Pennsylvanian neuropteroid foliage, Canada)

    USGS Publications Warehouse

    D'Angelo, J. A.; Zodrow, E.L.; Mastalerz, Maria

    2012-01-01

    Nearly all of the spectrochemical studies involving Carboniferous foliage of seed-ferns are based on a limited number of pinnules, mainly compressions. In contrast, in this paper we illustrate working with a larger pinnate segment, i.e., a 22-cm long neuropteroid specimen, compression-preserved with cuticle: the compression map. The objective is to study preservation variability on a larger scale, where observation of transparency/opacity of constituent pinnules is used as a first approximation for assessing the degree of pinnule coalification/fossilization. Spectrochemical methods by Fourier transform infrared (FTIR) spectrometry furnish semi-quantitative data for principal component analysis. The compression map shows a high degree of preservation variability, which ranges from comparatively more coalified pinnules to less coalified pinnules that resemble fossilized-cuticles, noting that the pinnule midveins are preserved more like fossilized-cuticles. A general overall trend of coalified pinnules towards fossilized-cuticles, i.e., variable chemistry, is inferred from the semi-quantitative FTIR data as higher contents of aromatic compounds occur in the visually more opaque upper location of the compression map. The latter also shows a higher condensation of the aromatic nuclei along with some variation in both ring size and degree of aromatic substitution. From principal component analysis we infer correspondence between transparency/opacity observation and chemical information, which correlate to varying degrees with fossilization/coalification among pinnules. © 2011 Elsevier B.V.
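    The PCA step can be sketched generically. The FTIR-derived ratios below are toy data, not the paper's semi-quantitative measurements.

```python
# Sketch: PCA of semi-quantitative FTIR ratios (e.g., aromaticity indices)
# across pinnules, to relate opacity to chemistry. Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
# 12 pinnules x 4 hypothetical FTIR-derived ratios
X = rng.normal(size=(12, 4))
X[:, 1] = 0.8 * X[:, 0] + 0.1 * rng.normal(size=12)  # correlated bands

# Project onto the first two principal components
scores = PCA(n_components=2).fit_transform(X)
print(scores.shape)
```

    Plotting the scores and coloring points by observed transparency/opacity is the kind of view from which the correspondence described above is inferred.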

  19. Approaches of National 3D Mapping: Research Results and Standardisation in Practice

    NASA Astrophysics Data System (ADS)

    Stoter, J. E.; Streilein, A.; Pla, M.; Baella, B.; Capstick, D.; Home, R.; Roensdorf, C.; Lagrange, J. P.

    2013-09-01

    Over the past ten years technologies for generating, maintaining and using 3D geo-information have matured. For national mapping agencies one of the challenges is how to best extend 2D data into 3D data, making best use of research results and available technologies. Some mapping organisations are making serious progress. The question addressed in this paper is how research results achieved in the past ten years are applied in practice and what research problems remain. In addition, the paper explores the potentials of the OGC 3D standard (i.e. CityGML) for 3D national mapping and what developments are further required to make the standard better fit for this purpose. The main conclusions of the paper are that 3D data is more and more available but still suffers from a low level of usage (mainly visualisation) and standards and formats based on CityGML have been stabilised although software support is still in the early stage. Several recommendations are made to meet these problems, including the definition of European CityGML profiles (as the INSPIRE Building profile) to harmonise 3D needs and standardise 3D implementations at international level.

  20. A Virtual Hub Brokering Approach for Integration of Historical and Modern Maps

    NASA Astrophysics Data System (ADS)

    Bruno, N.; Previtali, M.; Barazzetti, L.; Brumana, R.; Roncella, R.

    2016-06-01

    Geospatial data are today more and more widespread. Many different institutions, such as geographical institutes, public administrations, collaborative communities (e.g., OSM) and web companies, nowadays make a large number of maps available. Besides this cartography, projects for digitizing, georeferencing and web publication of historical maps have spread increasingly in recent years. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required, and problems of interconnection between different data sources and their restricted interoperability limit wide utilization of the available geo-data. This paper describes actions performed to assure interoperability between data, in particular spatial and geographic data, gathered from different providers, with different features and referring to different historical periods. The article summarizes and exemplifies how, starting from projects of historical map digitizing and Historical GIS implementation for Lombardy and for the city of Parma, respectively, interoperability is possible in the framework of the ENERGIC OD project. The European project ENERGIC OD, thanks to a specific component (the virtual hub) based on a brokering framework, copes with the problems listed above and enables interoperability between different data sources.

  1. An Approach to Optimize Size Parameters of Forging by Combining Hot-Processing Map and FEM

    NASA Astrophysics Data System (ADS)

    Hu, H. E.; Wang, X. Y.; Deng, L.

    2014-11-01

    The size parameters of 6061 aluminum alloy rib-web forging were optimized by using a hot-processing map and the finite element method (FEM) based on high-temperature compression data. The results show that the stress level of the alloy can be represented by a Zener-Hollomon parameter in a hyperbolic sine-type equation with a hot deformation activation energy of 343.7 kJ/mol. Dynamic recovery and dynamic recrystallization proceeded concurrently during high-temperature deformation of the alloy. Optimal hot-processing parameters for the alloy, corresponding to the peak efficiency value of 0.42, are 753 K and 0.001 s^-1. The instability domain occurs at deformation temperatures lower than 653 K. FEM is a viable method for validating the hot-processing map in actual manufacturing by analyzing the effect of corner radius, rib width, and web thickness on the workability of rib-web forging of the alloy. Size parameters of die forgings can be optimized conveniently by combining the hot-processing map and FEM.
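    The Zener-Hollomon parameter follows directly from the reported activation energy: Z = strain rate x exp(Q/RT). A minimal sketch evaluated at the optimal window reported above (753 K, 0.001 s^-1):

```python
# Sketch: Zener-Hollomon parameter Z = strain_rate * exp(Q / (R * T))
# using the activation energy reported for the alloy.
import math

Q = 343.7e3          # J/mol, hot deformation activation energy (from text)
R = 8.314            # J/(mol*K), universal gas constant

def zener_hollomon(strain_rate, T):
    """Z for a given strain rate (1/s) and absolute temperature (K)."""
    return strain_rate * math.exp(Q / (R * T))

# Evaluate at the reported optimal hot-processing window
Z_opt = zener_hollomon(1e-3, 753.0)
print(f"{Z_opt:.3e}")
```

    Higher temperature or lower strain rate lowers Z, which is why stress levels in the hyperbolic sine-type equation can be collapsed onto this single parameter.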

  2. The sensitivity of the Higgs boson branching ratios to the W boson width

    NASA Astrophysics Data System (ADS)

    Murray, William

    2016-07-01

    The Higgs boson branching ratio into vector bosons is sensitive to the decay widths of those vector bosons because they are produced with at least one boson significantly off-shell. Γ(H → VV) is approximately proportional to the product of the Higgs boson coupling and the vector boson width. ΓZ is well measured, but ΓW gives an uncertainty on Γ(H → WW) which is not negligible. The ratio of branching ratios, BR(H → WW)/BR(H → ZZ), measured by a combination of ATLAS and CMS at the LHC is used herein to extract a W boson width of ΓW = 1.8 (+0.4, -0.3) GeV by assuming Standard Model couplings of the Higgs boson. This dependence of the branching ratio on ΓW is not discussed in most Higgs boson coupling analyses.
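    The extraction logic reduces to a linear rescaling: since Γ(H → WW) is approximately proportional to ΓW, the measured-to-expected ratio of BR(H → WW)/BR(H → ZZ) rescales the Standard Model value of ΓW. A back-of-the-envelope sketch with a hypothetical measured ratio (not the paper's input):

```python
# Back-of-envelope sketch: with SM Higgs couplings, Gamma(H->WW) scales
# linearly with Gamma_W, so the BR-ratio pull rescales the SM W width.
GAMMA_W_SM = 2.085        # GeV, Standard Model / world-average W width
R_meas_over_R_SM = 0.86   # hypothetical measured/expected BR ratio

gamma_W = GAMMA_W_SM * R_meas_over_R_SM
print(round(gamma_W, 2))  # -> 1.79
```

    The real analysis additionally propagates the asymmetric experimental uncertainties on the branching-ratio combination, which is what produces the quoted (+0.4, -0.3) GeV interval.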

  3. Zoo of Quantum Phases and Excitations of Cold Bosonic Atoms in Optical Lattices

    SciTech Connect

    Alon, Ofir E.; Streltsov, Alexej I.; Cederbaum, Lorenz S.

    2005-07-15

    Quantum phases and phase transitions of weakly to strongly interacting bosonic atoms in deep to shallow optical lattices are described by a single multiorbital mean-field approach in real space. For weakly interacting bosons in one dimension, the critical value of the superfluid to Mott insulator (MI) transition found is in excellent agreement with many-body treatments of the Bose-Hubbard model. For strongly interacting bosons (i) additional MI phases appear, for which two (or more) atoms residing in each site undergo a Tonks-Girardeau-like transition and localize, and (ii) on-site excitation becomes the excitation lowest in energy. Experimental implications are discussed.

  4. A Concept Map Approach to Developing Collaborative Mindtools for Context-Aware Ubiquitous Learning

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Shi, Yen-Ru; Chu, Hui-Chun

    2011-01-01

    Recent advances in mobile and wireless communication technologies have enabled various new learning approaches which situate students in environments that combine real-world and digital-world learning resources; moreover, students are allowed to share knowledge or experiences with others during the learning process. Although such an approach seems…

  5. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    PubMed

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantify the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of our human visual system, play a vital role in eye movement control, whether we know what we are looking at or not.

  6. A Quantitative Visual Mapping and Visualization Approach for Deep Ocean Floor Research

    NASA Astrophysics Data System (ADS)

    Hansteen, T. H.; Kwasnitschka, T.

    2013-12-01

    Geological fieldwork on the sea floor is still impaired by our inability to resolve features on a sub-meter scale resolution in a quantifiable reference frame and over an area large enough to reveal the context of local observations. In order to overcome these issues, we have developed an integrated workflow of visual mapping techniques leading to georeferenced data sets which we examine using state-of-the-art visualization technology to recreate an effective working style of field geology. We demonstrate a microbathymetrical workflow, which is based on photogrammetric reconstruction of ROV imagery referenced to the acoustic vehicle track. The advantage over established acoustical systems lies in the true three-dimensionality of the data as opposed to the perspective projection from above produced by downward-looking mapping methods. A full color texture mosaic derived from the imagery allows studies at resolutions beyond the resolved geometry (usually one order of magnitude below the image resolution), while color gives additional clues, which can only be partly resolved in acoustic backscatter. The creation of a three-dimensional model changes the working style from the temporal domain of a video recording back to the spatial domain of a map. We examine these datasets using a custom-developed immersive virtual visualization environment. The ARENA (Artificial Research Environment for Networked Analysis) features a (lower) hemispherical screen at a diameter of six meters, accommodating up to four scientists at once, thus providing the ability to browse data interactively among a group of researchers. This environment facilitates (1) the development of spatial understanding analogous to on-land outcrop studies, (2) quantitative observations of seafloor morphology and physical parameters of its deposits, and (3) more effective formulation and communication of working hypotheses.

  7. Approaches to vegetation mapping and ecophysiological hypothesis testing using combined information from TIMS, AVIRIS, and AIRSAR

    NASA Technical Reports Server (NTRS)

    Oren, R.; Vane, G.; Zimmermann, R.; Carrere, V.; Realmuto, V.; Zebker, Howard A.; Schoeneberger, P.; Schoeneberger, M.

    1991-01-01

    The Tropical Rainforest Ecology Experiment (TREE) had two primary objectives: (1) to design a method for mapping vegetation in tropical regions using remote sensing and determine whether the result improves on available vegetation maps; and (2) to test a specific hypothesis on plant/water relations. Both objectives were thought achievable with the combined information from the Thermal Infrared Multispectral Scanner (TIMS), Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), and Airborne Synthetic Aperture Radar (AIRSAR). Implicitly, two additional objectives were: (1) to ascertain that the range within each variable potentially measurable with the three instruments is large enough in the site, relative to the sensitivity of the instruments, so that differences between ecological groups may be detectable; and (2) to determine the ability of the three systems to quantify different variables and sensitivities. We found that the ranges in values of foliar nitrogen concentration, water availability, stand structure and species composition, and plant/water relations were large, even within the upland broadleaf vegetation type. The range was larger when other vegetation types were considered. Unfortunately, cloud cover and navigation errors compromised the utility of the TIMS and AVIRIS data. Nevertheless, the AIRSAR data alone appear to have improved on the available vegetation map for the study area. An example from an area converted to a farm is given to demonstrate how the combined information from AIRSAR, TIMS, and AVIRIS can uniquely identify distinct classes of land use. The example alludes to the potential utility of the three instruments for identifying vegetation at an ecological scale finer than vegetation types.

  8. Mapping suitability of rice production systems for mitigation: Strategic approach for prioritizing improved irrigation management across scales

    NASA Astrophysics Data System (ADS)

    Wassmann, Reiner; Sander, Bjoern Ole

    2016-04-01

    After the successful conclusion of the COP21 in Paris, many developing countries are now embracing the task of reducing emissions with much more vigor than before. In many countries of South and South-East Asia, the agriculture sector constitutes a vast share of the national GHG budget, which can mainly be attributed to methane emissions from flooded rice production. Thus, rice growing countries are now looking for tangible and easily accessible information as to how to reduce emissions from rice production in an efficient manner. Given present and future food demand, mitigation options will have to comply with the aim of increasing productivity. At the same time, limited financial resources demand strategic planning of potential mitigation projects based on cost-benefit ratios. At this point, the most promising approach for mitigating methane emissions from rice is an irrigation technique called Alternate Wetting and Drying (AWD). AWD was initially developed to save water and thus also represents an adaptation strategy in its own right, helping farmers cope with less rainfall. Moreover, AWD reduces methane emissions by 30-70%. However, AWD is not universally suitable. It is attractive to farmers who have to pump water and may save fuel under AWD, but offers limited incentives where there is no pressing water scarcity. Thus, planning for AWD adoption at larger scale, e.g. for country-wide programs, should be based on a systematic prioritization of target environments. This presentation encompasses a new methodology for mapping suitability of water-saving in rice production, as a means of planning adaptation and mitigation programs, alongside preliminary results. The latter comprise three new GIS maps on climate-driven suitability of AWD in major rice growing countries (Philippines, Vietnam, Bangladesh). These maps have been derived from high-resolution data of the areal and temporal extent of rice production that are now

  9. Karst groundwater protection: First application of a Pan-European Approach to vulnerability, hazard and risk mapping in the Sierra de Líbar (Southern Spain).

    PubMed

    Andreo, Bartolomé; Goldscheider, Nico; Vadillo, Iñaki; Vías, Jesús María; Neukum, Christoph; Sinreich, Michael; Jiménez, Pablo; Brechenmacher, Julia; Carrasco, Francisco; Hötzl, Heinz; Perles, María Jesús; Zwahlen, François

    2006-03-15

    The European COST action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, validation of vulnerability maps, hazard and risk mapping. This paper presents the first application of all components of this Pan-European Approach to the Sierra de Líbar, a karst hydrogeology system in Andalusia, Spain. The intrinsic vulnerability maps take into account the hydrogeological characteristics of the area but are independent from specific contaminant properties. Two specific vulnerability maps were prepared for faecal coliforms and BTEX. These maps take into account the specific properties of these two groups of contaminants and their interaction with the karst hydrogeological system. The vulnerability assessment was validated by means of tracing tests, hydrological, hydrochemical and isotope methods. The hazard map shows the localization of potential contamination sources resulting from human activities, and evaluates those according to their dangerousness. The risk of groundwater contamination depends on the hazards and the vulnerability of the aquifer system. The risk map for the Sierra de Líbar was thus created by overlaying the hazard and vulnerability maps.
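
    The final overlay step described above, combining a hazard map with a vulnerability map to produce a risk map, can be sketched as a simple raster operation. The rasters, the ordinal 1-4 class scale, and the re-binning thresholds below are hypothetical illustrations, not the actual COST action 620 classification scheme.

```python
import numpy as np

# Hypothetical 3x3 rasters on an ordinal 1-4 scale (1 = low, 4 = very high).
vulnerability = np.array([[1, 2, 4],
                          [2, 3, 4],
                          [1, 1, 2]])
hazard = np.array([[4, 4, 1],
                   [1, 2, 3],
                   [1, 1, 1]])

# One common convention: multiply the ordinal classes cell by cell,
# then re-bin the product back into 1-4 risk classes.
product = vulnerability * hazard
risk = np.digitize(product, bins=[4, 8, 12]) + 1
print(risk)
```

Real implementations weight the inputs rather than multiplying raw classes, but the cell-by-cell overlay structure is the same.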

  10. Application of digital soil mapping in traditional soil survey - an approach used for the production of the national soil map of the United Arab Emirates

    NASA Astrophysics Data System (ADS)

    Abdelfattah, M. A.; Pain, C.

    2012-04-01

    Digital soil maps are an essential part of the soil assessment framework that supports soil-related decision- and policy-making, and it is therefore of crucial importance that they are of known quality. Digital soil mapping is perhaps the next great advancement in soil survey information. Traditional soil survey has always struggled with the collection of data. The amount of soil data and information required to justify the mapping product, how to interpolate data to similar areas, and how to incorporate older data are all challenges that need further exploration. The present study used digital soil mapping to develop a generalized national soil map of the United Arab Emirates from the available recent traditional soil surveys of the Abu Dhabi Emirate (2006-2009) and the Northern Emirates (2010-2012), together with limited data from the Dubai Emirate, an important part of the country. The map was developed by joining, generalizing, and correlating the information contained in the Soil Survey of Abu Dhabi Emirate, the soil map of Dubai (with limited data), and the Soil Survey of the Northern Emirates. Because the soil surveys were completed at different times and with different standards and procedures, the original map lines and soil classifications had to be modified in order to integrate the three original maps and legends into this single national-level map. The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) version 2 was used to guide line placement of the map units. It was especially helpful for the Torripsamments units, which are separated based on local landscape relief characteristics. A generalized soil map of the United Arab Emirates was produced, consisting of fifteen map units; twelve are named for the soil great group that dominates each unit. Three are named "Rock outcrop", "Mountains", or "Miscellaneous units". Statistical details are also presented. Soil great groups are appropriate taxa to use for soil

  11. An integrated remote sensing approach for landslide susceptibly mapping at the volcanic islands of Vulcano and Lipari (Eolian Island, Italy)

    NASA Astrophysics Data System (ADS)

    Scifoni, Silvia; Palenzuela Baena, José A.; Marsella, Maria; Pepe, Susi; Sansosti, Eugenio; Solaro, Giuseppe; Tizzani, Piero

    2015-10-01

    Volcanic islands can be affected by instability phenomena such as landslides and partial collapse events, even in quiescent periods. Starting from data collected by an aerial laser scanning survey (at cm-level accuracy), a GIS-based approach was implemented in order to perform a landslide-susceptibility analysis. The results of this analysis were compared and integrated with data derived from Differential Synthetic Aperture Radar Interferometry (DInSAR) analysis, which is able to identify the most active areas and quantify the ongoing deformation processes. The analysis is focused on the active volcanic edifice of Vulcano Island and on some areas of Lipari Island, both included in the Aeolian Islands in Sicily (Italy). The developed approach represents a step forward for the compilation of hazard maps, furnishing, in an overall context, updated and georeferenced quantitative data describing the morphology and the present behaviour of the slopes in the area of investigation.

  12. A multi-pronged approach for compiling a global map of allosteric regulation in the apoptotic caspases

    PubMed Central

    Dagbay, Kevin; Eron, Scott J.; Serrano, Banyuhay P.; Velázquez-Delgado, Elih M.; Zhao, Yunlong; Lin, Di; Vaidya, Sravanti; Hardy, Jeanne A.

    2014-01-01

    One of the most promising and as yet underutilized means of regulating protein function is exploitation of allosteric sites. All caspases catalyze the same overall reaction, but they perform different biological roles and are differentially regulated. It is our hypothesis that many allosteric sites exist on various caspases and that understanding both the distinct and overlapping mechanisms by which each caspase can be allosterically controlled should ultimately enable caspase-specific inhibition. Here we describe the ongoing work and methods for compiling a comprehensive map of apoptotic caspase allostery. Central to this approach is the use of i) the embedded record of naturally evolved allosteric sites that are sensitive to zinc-mediated inhibition, phosphorylation and other post-translational modifications, ii) structural and mutagenic approaches and iii) novel binding sites identified by both rationally designed and screening-derived small-molecule inhibitors. PMID:24974292

  13. Deriving pathway maps from automated text analysis using a grammar-based approach.

    PubMed

    Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn

    2006-04-01

    We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine a reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases. PMID:16819797

  14. A NEW APPROACH TO CONSTRAIN BLACK HOLE SPINS IN ACTIVE GALAXIES USING OPTICAL REVERBERATION MAPPING

    SciTech Connect

    Wang, Jian-Min; Du, Pu; Li, Yan-Rong; Hu, Chen; Ho, Luis C.; Bai, Jin-Ming

    2014-09-01

    A tight relation between the size of the broad-line region (BLR) and optical luminosity has been established in about 50 active galactic nuclei studied through reverberation mapping of the broad Hβ emission line. The R_BLR-L relation arises from simple photoionization considerations. Using a general relativistic model of an optically thick, geometrically thin accretion disk, we show that the ionizing luminosity jointly depends on black hole mass, accretion rate, and spin. The non-monotonic relation between the ionizing and optical luminosity gives rise to a complicated relation between the BLR size and the optical luminosity. We show that the reverberation lag of Hβ to the varying continuum depends very sensitively on black hole spin. For retrograde spins, the disk is so cold that there is a deficit of ionizing photons in the BLR, resulting in shrinkage of the hydrogen ionization front with increasing optical luminosity, and hence shortened Hβ lags. This effect is especially striking for luminous quasars undergoing retrograde accretion, manifesting as strong deviations from the canonical R_BLR-L relation. This could lead to a method to estimate black hole spins of quasars and to study their cosmic evolution. At the same time, the small scatter of the observed R_BLR-L relation for the current sample of reverberation-mapped active galaxies implies that the majority of these sources have rapidly spinning black holes.

  15. An approximate Bayesian approach for mapping paired-end DNA reads to a reference genome

    PubMed Central

    Shrestha, Anish Man Singh; Frith, Martin C.

    2013-01-01

    Summary: Many high-throughput sequencing experiments produce paired DNA reads. Paired-end DNA reads provide extra positional information that is useful in reliable mapping of short reads to a reference genome, as well as in downstream analyses of structural variations. Given the importance of paired-end alignments, it is surprising that there have been no previous publications focusing on this topic. In this article, we present a new probabilistic framework to predict the alignment of paired-end reads to a reference genome. Using both simulated and real data, we compare the performance of our method with six other read-mapping tools that provide a paired-end option. We show that our method provides a good combination of accuracy, error rate and computation time, especially in more challenging and practical cases, such as when the reference genome is incomplete or unavailable for the sample, or when there are large variations between the reference genome and the source of the reads. An open-source implementation of our method is available as part of Last, a multi-purpose alignment program freely available at http://last.cbrc.jp. Contact: martin@cbrc.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23413433
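
    The probabilistic idea behind paired-end mapping can be illustrated in a few lines: score each candidate placement of the pair by combining an alignment score with a prior on the insert size, then normalize over candidates. The sketch below is a toy version of that idea, not the actual model implemented in Last; the scores, insert sizes, and Gaussian parameters are hypothetical.

```python
import math

def pair_posteriors(candidates, mu=300.0, sigma=30.0):
    """candidates: list of (alignment log-odds score, insert size) for each
    way a read pair could be placed.  Each placement's log-probability adds
    a Gaussian log-prior on the insert size to the alignment score;
    normalizing over all candidates gives posterior probabilities."""
    logps = [s - 0.5 * ((d - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi))
             for s, d in candidates]
    m = max(logps)                                # subtract max for stability
    weights = [math.exp(lp - m) for lp in logps]
    total = sum(weights)
    return [w / total for w in weights]

# Two placements with equal alignment scores: the one whose insert size
# matches the expected distribution dominates the posterior.
post = pair_posteriors([(10.0, 310), (10.0, 900)])
print(post)
```

This is why paired-end information helps in repetitive regions: identical alignment scores are disambiguated by the insert-size prior.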

  16. Charting a course to competency: an approach to mapping public health core competencies to existing trainings.

    PubMed

    Neiworth, Latrissa L; Allan, Susan; D'Ambrosio, Luann; Coplen-Abrahamson, Marlene

    2014-03-01

    Consistent with other professional fields, the goals of public health training have moved from a focus on knowledge transfer to the development of skills or competencies. At least six national competency sets have been developed in the past decade pertaining to public health professionals. State and local public health agencies are increasingly using competency sets as frameworks for staff development and assessment. Mapping competencies to training has potential for enhancing the value of public health training during resource-constrained times by directly linking training content to the desired skills. For existing public health trainings, the challenge is how to identify competencies addressed in those courses in a manner that is not burdensome and that produces valid results. This article describes a process for mapping competencies to the learning objectives, assignments, and assessments of existing trainings. The process presented could be used by any training center or organization that seeks to connect public health workforce competencies to previously developed instruction. Public health practice can be strengthened more effectively if trainings can be selected for the desired practice skills or competencies.

  17. Microscopic formulation of the interacting boson model for rotational nuclei

    SciTech Connect

    Nomura, Kosuke; Shimizu, Noritaka; Otsuka, Takaharu; Guo, Lu

    2011-04-15

    We propose a novel formulation of the interacting boson model (IBM) for rotational nuclei with axially symmetric, strong deformation. The intrinsic structure represented by the potential-energy surface (PES) of a given multinucleon system has a certain similarity to that of the corresponding multiboson system. Based on this feature, one can derive an appropriate boson Hamiltonian, as already reported. This prescription, however, has a major difficulty in the rotational spectra of strongly deformed nuclei: the bosonic moment of inertia is significantly smaller than the corresponding nucleonic one. We show that this difficulty originates in the difference between the rotational response of a nucleon system and that of the corresponding boson system, and could arise even if the PESs of the two systems were identical. We further suggest that the problem can be solved by implementing the L·L term into the IBM Hamiltonian, with the coupling constant derived from the cranking approach of Skyrme mean-field models. The validity of the method is confirmed for rare-earth and actinoid nuclei, as their experimental rotational yrast bands are reproduced nicely.

  18. Trilinear neutral gauge boson couplings in effective theories

    NASA Astrophysics Data System (ADS)

    Larios, F.; Pérez, M. A.; Tavares-Velasco, G.; Toscano, J. J.

    2001-06-01

    We list all the lowest dimension effective operators inducing off-shell trilinear neutral gauge boson couplings ZZγ, Zγγ, and ZZZ within the effective Lagrangian approach, both in the linear and nonlinear realizations of SU(2)L × U(1)Y gauge symmetry. In the linear scenario we find that these couplings can be generated only by dimension-8 operators necessarily including the Higgs boson field, whereas in the nonlinear case they are induced by dimension-6 operators. We consider the impact of these couplings on some precision measurements such as the magnetic and electric dipole moments of fermions, as well as the Z boson rare decay Z → νν̄γ. If the underlying new physics is of a decoupling nature, trilinear neutral gauge boson couplings are not expected to affect any of these observables considerably. On the contrary, it is in the nonlinear scenario that these couplings have the most promising prospects of being detected through high-precision experiments.

  19. Landslide susceptibility mapping of vicinity of Yaka Landslide (Gelendost, Turkey) using conditional probability approach in GIS

    NASA Astrophysics Data System (ADS)

    Ozdemir, Adnan

    2009-06-01

    On 19 February 2007, a landslide occurred on the Alaardıç Slope, located 1.6 km south of the town of Yaka (Gelendost, Turkey). Subsequently, the displaced materials transformed into a mud flow in Eğlence Creek and continued 750 m downstream towards the town of Yaka. The mass still poised for motion in the Yaka Landslide source area and its vicinity, which could be set moving by triggers such as heavy or sustained rainfall and/or snowmelt, poses a danger of loss of life and property to Yaka and its population of 3,000. This study was undertaken to construct a susceptibility map of the vicinity of the Yaka Landslide's source area and to relate it to movement of the landslide mass, with the goal of preventing or mitigating loss of life and property. The landslide susceptibility map was formulated by relating the causative factors of landslides, such as lithology, gradient, slope aspect, elevation, topographical moisture index, and stream power index, to the landslide map determined by terrain analysis, through the implementation of the conditional probability method. It was determined that the surface area of the Goksogut formation, which has attained lithological characteristics of clayey limestone with a broken and separated base and where area landslides occur, possesses an elevation of 1,100-1,300 m, a slope gradient of 15°-35° and a slope aspect between 0°-67.5° and 157°-247°. Loss of life and property may be avoided by the construction of structures to check the debris mass in Eğlence Creek, the cleaning of the canal which passes through Yaka, the broadening of the canal's base area, the elevation of protective walls along the canal, and the establishment of a protective zone at least 10 m wide on each side of the canal to guard against damage from probable future landslides and mud flows.
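
    The conditional probability method used in such studies reduces to a simple frequency ratio: for each class of a causative-factor raster, P(landslide | class) is the fraction of cells in that class that intersect the landslide inventory. The sketch below illustrates this with hypothetical rasters (a slope-gradient class map and a landslide mask); it is not the paper's data.

```python
import numpy as np

def class_probabilities(factor, landslide):
    """For each class c of a causative-factor raster, estimate
    P(landslide | c) = landslide cells in class c / total cells in class c."""
    probs = {}
    for c in np.unique(factor):
        in_class = factor == c
        probs[int(c)] = landslide[in_class].mean()
    return probs

# Hypothetical rasters: slope-gradient classes and a landslide inventory mask.
slope_class = np.array([[1, 1, 2, 2],
                        [1, 2, 2, 3],
                        [3, 3, 2, 3]])
landslide = np.array([[0, 0, 1, 1],
                      [0, 1, 0, 0],
                      [0, 0, 1, 0]])
p = class_probabilities(slope_class, landslide)
# A susceptibility map assigns each cell the probability of its class;
# with several factors, the per-factor maps are combined (e.g. summed).
susceptibility = np.vectorize(p.get)(slope_class)
print(p)
```

With several factors (lithology, aspect, elevation, ...), one such map is computed per factor and the maps are combined into the final susceptibility surface.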

  20. Analyzing the impact of social factors on homelessness: a Fuzzy Cognitive Map approach

    PubMed Central

    2013-01-01

    Background: The forces which affect homelessness are complex and often interactive in nature. Social forces such as addictions, family breakdown, and mental illness are compounded by structural forces such as a lack of available low-cost housing, poor economic conditions, and insufficient mental health services. Together these factors impact levels of homelessness through their dynamic relations. Historic models, which are static in nature, have been only marginally successful in capturing these relationships.

    Methods: Fuzzy logic (FL) and fuzzy cognitive maps (FCMs) are particularly suited to the modeling of complex social problems such as homelessness, due to their inherent ability to model intricate, interactive systems often described in vague conceptual terms, and then to organize them into a specific, concrete form (i.e., the FCM) which can be readily understood by social scientists and others. Using FL we converted information, taken from recently published, peer-reviewed articles, for a select group of factors related to homelessness, and then calculated the strength of influence (weights) for pairs of factors. We then used these weighted relationships in an FCM to test the effects of increasing or decreasing individual factors or groups of factors. Results of these trials were explainable according to current empirical knowledge related to homelessness.

    Results: Prior graphic maps of homelessness have been of limited use due to the dynamic nature of the concepts related to homelessness. The FCM technique captures greater degrees of dynamism and complexity than static models, allowing relevant concepts to be manipulated and their interactions explored. This, in turn, allows for a much more realistic picture of homelessness. Through network analysis of the FCM we determined that Education exerts the greatest force in the model and hence impacts the dynamism and complexity of a social problem such as homelessness.

    Conclusions: The FCM built to model the complex social system of homelessness
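
    The core FCM computation is a simple iteration: each concept's next activation is a squashed weighted sum of the current activations of the concepts that influence it. The minimal sketch below uses three illustrative concepts and made-up weights, not the weights derived in the paper.

```python
import numpy as np

def run_fcm(W, state, steps=50):
    """Iterate a fuzzy cognitive map: each concept's next activation is a
    logistic-squashed weighted sum of the others' current activations.
    W[i, j] is the influence of concept j on concept i."""
    squash = lambda x: 1.0 / (1.0 + np.exp(-x))
    for _ in range(steps):
        state = squash(W @ state)
    return state

# Hypothetical 3-concept map: 0 = Education, 1 = Addiction, 2 = Homelessness.
# e.g. W[2, 0] = -0.8: more education pushes homelessness down.
W = np.array([[ 0.0,  0.0, 0.0],
              [-0.5,  0.0, 0.0],
              [-0.8,  0.7, 0.0]])
initial = np.array([0.9, 0.2, 0.5])
print(run_fcm(W, initial))
```

"Testing the effect of increasing a factor" then amounts to raising one entry of the initial state (or clamping it) and re-running the iteration to a fixed point.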

  1. Spatial entanglement of bosons in optical lattices.

    PubMed

    Cramer, M; Bernard, A; Fabbri, N; Fallani, L; Fort, C; Rosi, S; Caruso, F; Inguscio, M; Plenio, M B

    2013-01-01

    Entanglement is a fundamental resource for quantum information processing, occurring naturally in many-body systems at low temperatures. The presence of entanglement and, in particular, its scaling with the size of system partitions underlies the complexity of quantum many-body states. The quantitative estimation of entanglement in many-body systems represents a major challenge, as it requires either full-state tomography, scaling exponentially in the system size, or the assumption of unverified system characteristics such as its Hamiltonian or temperature. Here we adopt recently developed approaches for the determination of rigorous lower entanglement bounds from readily accessible measurements and apply them in an experiment of ultracold interacting bosons in optical lattices of ~10^5 sites. We then study the behaviour of spatial entanglement between the sites when crossing the superfluid-Mott insulator transition and when varying temperature. This constitutes the first rigorous experimental large-scale entanglement quantification in a scalable quantum simulator.

  2. Biomass Cost Index: mapping biomass-to-biohydrogen feedstock costs by a new approach.

    PubMed

    Diamantopoulou, L K; Karaoglanoglou, L S; Koukios, E G

    2011-02-01

    Making decisions and developing policy in the field of biofuel and bioenergy is complex because of the large number and potential arrangements of feedstocks, technologies and supply chain options. Although the technical optimisation and sustainability of any biomass-to-biofuel production chain is of major importance, the overall chain cost is still considered the key to market deployment. A significant percentage of this cost is attributed to primary generation, transportation/handling and pretreatment of the biomass. Separating the system into smaller, semi-independent sub-systems and dealing with their interfaces provides the pathway to map this complex landscape. The main scope of this work is to present a tool, developed for the comparison of diverse biomass-to-biofuel systems, in order to facilitate cost-wise decision making in this field. PMID:21074419

  3. Lagrangian Mapping Approach to Generate Intermittency and its Application in Plasma Turbulence

    NASA Astrophysics Data System (ADS)

    Subedi, P.; Matthaeus, W. H.; Tessein, J.; Chhiber, R.; Wan, M.

    2014-12-01

    The Minimal Lagrangian Mapping procedure developed in the context of neutral fluid turbulence (Rosales and Meneveau 2006) is a simple method to generate synthetic vector fields. Using a sequence of low-pass filtered fields, fluid particles are displaced at their rms speed for some scale-dependent time interval, and then interpolated back to a regular grid. Fields produced in this way are seen to possess certain properties of real turbulence. We extend the technique to plasmas by taking into account the coupling between the velocity and magnetic fields. We examine several possible applications to plasma systems. One use is as initial conditions for simulations, wherein these synthetic fields may efficiently produce a strongly intermittent cascade. The intermittency properties of the synthetic fields are also compared with those of the solar wind. Finally, studies of cosmic ray transport and modulation in the test particle approximation may benefit from improved realism in synthetic fields produced in this way.

  4. Generating synthetic magnetic field intermittency using a Minimal Multiscale Lagrangian Mapping approach

    SciTech Connect

    Subedi, P.; Chhiber, R.; Tessein, J. A.; Wan, M.; Matthaeus, W. H.

    2014-12-01

    The Minimal Multiscale Lagrangian Mapping procedure developed in the context of neutral fluid turbulence is a simple method for generating synthetic vector fields. Using a sequence of low-pass filtered fields, fluid particles are displaced at their rms speed for some scale-dependent time interval, and then interpolated back to a regular grid. Fields produced in this way are seen to possess certain properties of real turbulence. This paper extends the technique to plasmas by taking into account the coupling between the velocity and magnetic fields. We examine several possible applications to plasma systems. One use is as initial conditions for simulations, wherein these synthetic fields may efficiently produce a strongly intermittent cascade. The intermittency properties of the synthetic fields are also compared with those of the solar wind. Finally, studies of cosmic ray transport and modulation in the test particle approximation may benefit from improved realism in synthetic fields produced in this way.
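
    The filter-displace-interpolate loop described above can be sketched in one dimension. This is only a scalar toy analogue under stated assumptions (the real method operates on coupled 3D velocity and magnetic fields); the grid size, scale sequence, and time factor are arbitrary choices for illustration.

```python
import numpy as np

def mmlm_1d(n=512, scales=(64, 16, 4), dt=0.5, seed=0):
    """Toy 1-D scalar analogue of the Minimal Multiscale Lagrangian
    Mapping: starting from a Gaussian random field, repeatedly (large
    scales first) low-pass filter the field, displace the grid points by
    the filtered field for a scale-dependent time, and interpolate the
    field values back onto the regular grid."""
    rng = np.random.default_rng(seed)
    x = np.arange(n, dtype=float)
    u = rng.standard_normal(n)
    for ell in scales:                        # filter scale in grid units
        u_hat = np.fft.rfft(u)
        u_hat[n // ell + 1:] = 0.0            # keep wavelengths >= ell
        u_low = np.fft.irfft(u_hat, n)
        rms = np.sqrt(np.mean(u_low**2))
        tau = dt * ell / (rms + 1e-12)        # ~ eddy turnover time at this scale
        x_new = (x + u_low * tau) % n         # displace "fluid particles" (periodic)
        u = np.interp(x, x_new, u, period=n)  # back onto the regular grid
    return u

field = mmlm_1d()
print(field.shape)
```

The repeated remapping is what builds up the non-Gaussian, intermittent gradient statistics that a plain Gaussian random field lacks.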

  5. Bayesian Maximum Entropy Approach to Mapping Soil Moisture at the Field Scale

    NASA Astrophysics Data System (ADS)

    Dong, J.; Ochsner, T.; Cosh, M. H.

    2012-12-01

    The study of soil moisture spatial variability at the field scale is important to aid in modeling hydrological processes at the land surface. The Bayesian Maximum Entropy (BME) framework is a more general method than classical geostatistics and has not yet been applied to soil moisture spatial estimation. This research compares the effectiveness of BME versus kriging estimators for spatial prediction of soil moisture at the field scale. Surface soil moisture surveys were conducted in a 227 ha pasture at the Marena, Oklahoma In Situ Sensor Testbed (MOISST) site. Remotely sensed vegetation data will be incorporated into the soil moisture spatial prediction using the BME method. Soil moisture maps based on the BME and traditional kriging frameworks will be cross-validated and compared.
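
    The kriging baseline in such a comparison can be sketched in a few lines. Below is a generic ordinary-kriging example with an exponential covariance; the coordinates, moisture values, and covariance parameters are invented for illustration, and the BME step itself (which fuses hard and soft data via a maximum-entropy prior) is beyond a short sketch.

```python
import numpy as np

def ordinary_kriging(xy, z, targets, sill=1.0, corr_range=200.0):
    """Minimal ordinary kriging with exponential covariance
    C(h) = sill * exp(-h / corr_range); no nugget, no anisotropy.
    The augmented system enforces that the weights sum to one."""
    cov = lambda h: sill * np.exp(-h / corr_range)
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(d)
    A[n, n] = 0.0                 # Lagrange-multiplier row/column
    preds = []
    for t in targets:
        b = np.ones(n + 1)
        b[:n] = cov(np.linalg.norm(xy - t, axis=1))
        w = np.linalg.solve(A, b)
        preds.append(w[:n] @ z)
    return np.array(preds)

# Hypothetical soil-moisture samples (x, y in metres; z in m3/m3).
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
vals = np.array([0.20, 0.25, 0.22, 0.30])
est = ordinary_kriging(pts, vals, np.array([[50.0, 50.0]]))
print(est)
```

At the centre of the square the four weights are equal by symmetry, so the estimate is just the sample mean; off-centre, closer samples receive larger weights.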

  6. Chain mapping approach of Hamiltonian for FMO complex using associated, generalized and exceptional Jacobi polynomials

    NASA Astrophysics Data System (ADS)

    Mahdian, M.; Arjmandi, M. B.; Marahem, F.

    2016-06-01

    The excitation energy transfer (EET) in photosynthetic complexes has been widely investigated in recent years. However, one of the main problems is the simulation of such complexes under realistic conditions. In this paper, using the associated, generalized and exceptional Jacobi polynomials, we first introduce the spectral density of the Fenna-Matthews-Olson (FMO) complex. Afterward, we obtain a map that transforms the Hamiltonian of the FMO complex, as an open quantum system, into a one-dimensional chain of oscillator modes with only nearest-neighbor interactions, in which the system is coupled only to the first mode of the chain. The frequency and coupling strength of each mode can be obtained analytically from the recurrence coefficients of the aforementioned orthogonal polynomials.
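
    While the paper obtains the chain coefficients analytically from Jacobi-polynomial recurrences, the same tridiagonalization can be sketched numerically: Lanczos recursion on the diagonal frequency matrix, started from the weight vector of the discretized spectral density, yields the chain-mode frequencies (diagonal) and nearest-neighbor couplings (off-diagonal). The spectral density and grid below are hypothetical stand-ins, not the FMO one.

```python
import numpy as np

def chain_coefficients(omega, J, n_modes=6):
    """Lanczos tridiagonalization of diag(omega), started from
    sqrt(J * d_omega): diagonal entries = chain-mode frequencies,
    off-diagonals = nearest-neighbor couplings; the norm of the start
    vector is the system-to-chain coupling."""
    dw = omega[1] - omega[0]
    w = np.sqrt(J * dw)
    H = np.diag(omega)
    q = w / np.linalg.norm(w)
    alphas, betas = [], []
    q_prev, beta = np.zeros_like(q), 0.0
    for _ in range(n_modes):
        r = H @ q - beta * q_prev       # three-term Lanczos recursion
        alpha = q @ r
        r -= alpha * q
        alphas.append(alpha)
        beta = np.linalg.norm(r)
        betas.append(beta)
        q_prev, q = q, r / beta
    return np.array(alphas), np.array(betas[:-1]), np.linalg.norm(w)

# Hypothetical Drude-Lorentz-like spectral density on a frequency grid.
omega = np.linspace(0.01, 10.0, 2000)
J = 2.0 * omega / (omega**2 + 1.0)
freqs, couplings, sys_coupling = chain_coefficients(omega, J)
print(freqs)
```

The resulting tridiagonal Hamiltonian couples the system only to the first chain mode, which is exactly the structure the paper derives.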

  7. A new approach to map transcription sites at the ultrastructural level.

    PubMed

    Testillano, P S; Gorab, E; Risueño, M C

    1994-01-01

    We describe a new ultrastructural method for locating transcription on ultra-thin sections. The use of anti-DNA/RNA hybrid antibodies provides specific labeling on precise structures of the nuclear compartments of several cell types. All mammalian and plant material studied (HeLa cells, lymphocytes, onion root meristematic cells) showed the same pattern of labeling: fibrillar structures in the interchromatin region and discrete regions of the dense fibrillar component at the periphery of the fibrillar centers in the nucleolus. The specificity of the immunogold labeling was tested by RNAse H digestion and by pre-blocking the antibody with synthetic DNA/RNA hybrids; in both cases no gold particles were observed. This method has considerable advantages compared with current techniques, constituting a very useful tool to map transcriptionally active loci in a variety of cells. PMID:7505298

  8. Biomass Cost Index: mapping biomass-to-biohydrogen feedstock costs by a new approach.

    PubMed

    Diamantopoulou, L K; Karaoglanoglou, L S; Koukios, E G

    2011-02-01

    Making decisions and developing policy in the field of biofuels and bioenergy is complex because of the large number and potential arrangements of feedstocks, technologies and supply chain options. Although the technical optimisation and sustainability of any biomass-to-biofuel production chain are of major importance, the overall chain cost is still considered the key to market deployment. A significant percentage of this cost is attributed to primary generation, transportation/handling and pretreatment of the biomass. Separating the system into smaller semi-independent sub-systems and dealing with their interfaces provides a pathway to map this complex landscape. The main scope of this work is to present a tool developed for the comparison of diverse biomass-to-biofuel systems, in order to facilitate cost-wise decision making in this field. PMID:21074419

  9. 3D models mapping optimization through an integrated parameterization approach: case studies from Ravenna

    NASA Astrophysics Data System (ADS)

    Cipriani, L.; Fantini, F.; Bertacchi, S.

    2014-06-01

    Image-based modelling tools based on SfM algorithms have gained great popularity since several software houses released applications able to produce 3D textured models easily and automatically. The aim of this paper is to point out the importance of controlling the model parameterization process, considering that the automatic solutions included in these modelling tools can produce poor results in terms of texture utilization. In order to achieve better-quality textured models from image-based modelling applications, this research presents a series of practical strategies aimed at providing a better balance between the geometric resolution of models from passive sensors and their corresponding (u,v) map reference systems. This aspect is essential for a high-quality 3D representation, since "apparent colour" is a fundamental aspect in the field of Cultural Heritage documentation. Complex meshes without native parameterization have to be "flattened" or "unwrapped" into the (u,v) parameter space, with the main objective of being mapped with a single image. This result can be obtained using two different strategies: the first automatic and fast, the second manual and time-consuming. Reverse modelling applications provide automatic solutions that split the model by means of different algorithms, producing a sort of "atlas" of the original model in the parameter space; in many instances this is not adequate and negatively affects the overall quality of representation. By using different solutions in synergy, ranging from semantic-aware modelling techniques to quad-dominant meshes achieved with retopology tools, it is possible to obtain complete control of the parameterization process.

  10. Mapping permeability in low-resolution micro-CT images: A multiscale statistical approach

    NASA Astrophysics Data System (ADS)

    Botha, Pieter W. S. K.; Sheppard, Adrian P.

    2016-06-01

    We investigate the possibility of predicting permeability in low-resolution X-ray microcomputed tomography (µCT). Lower-resolution whole core images give greater sample coverage and are therefore more representative of heterogeneous systems; however, the lower resolution causes connecting pore throats to be represented by intermediate gray scale values and limits information on pore system geometry, rendering such images inadequate for direct permeability simulation. We present an imaging and computation workflow aimed at predicting absolute permeability for sample volumes that are too large to allow direct computation. The workflow involves computing permeability from high-resolution µCT images, along with a series of rock characteristics (notably open pore fraction, pore size, and formation factor) from spatially registered low-resolution images. Multiple linear regression models correlating permeability to rock characteristics provide a means of predicting and mapping permeability variations in larger scale low-resolution images. Results show excellent agreement between permeability predictions made from 16 and 64 µm/voxel images of 25 mm diameter 80 mm tall core samples of heterogeneous sandstone for which 5 µm/voxel resolution is required to compute permeability directly. The statistical model used at the lowest resolution of 64 µm/voxel (similar to typical whole core image resolutions) includes open pore fraction and formation factor as predictor characteristics. Although binarized images at this resolution do not completely capture the pore system, we infer that these characteristics implicitly contain information about the critical fluid flow pathways. Three-dimensional permeability mapping in larger-scale lower resolution images by means of statistical predictions provides input data for subsequent permeability upscaling and the computation of effective permeability at the core scale.
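The multiple-linear-regression step, correlating permeability to rock characteristics computed from the low-resolution images, can be sketched on synthetic data. The coefficients, predictor names and value ranges below are invented for illustration; only the workflow (regress log-permeability on characteristics, then predict) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
phi = rng.uniform(0.05, 0.30, n)       # open pore fraction (hypothetical values)
F = rng.uniform(5.0, 50.0, n)          # formation factor (hypothetical values)
# Synthetic "truth": log-permeability rises with porosity, falls with F
log_k = 2.0 + 8.0 * phi - 0.05 * F + rng.normal(0, 0.01, n)

# Fit the multiple linear regression model log_k ~ 1 + phi + F
X = np.column_stack([np.ones(n), phi, F])
coef, *_ = np.linalg.lstsq(X, log_k, rcond=None)

# Predict (map) permeability for new low-resolution voxels
log_k_pred = X @ coef
```

In the paper's workflow the training targets come from direct simulation on high-resolution sub-volumes, and the fitted model is then applied across the registered whole-core image.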

  11. A robotic approach to mapping post-eruptive volcanic fissure conduits

    NASA Astrophysics Data System (ADS)

    Parcheta, Carolyn E.; Pavlov, Catherine A.; Wiltsie, Nicholas; Carpenter, Kalind C.; Nash, Jeremy; Parness, Aaron; Mitchell, Karl L.

    2016-06-01

    VolcanoBot was developed to map volcanic vents and their underlying conduit systems, which are rarely preserved and generally inaccessible to human exploration. It uses a PrimeSense Carmine 1.09 sensor for mapping and carries an IR temperature sensor, analog distance sensor, and an inertial measurement unit (IMU) inside a protective shell. The first field test succeeded in collecting valuable scientific data but revealed several needed improvements, including more rugged cable connections and mechanical couplers, increased ground clearance, and higher-torque motors for uphill mobility. The second field test significantly improved on all of these aspects but gained electrical ruggedness at the cost of reduced data collection speed. Data collected by the VolcanoBots, while intermittent, yield the first insights into the cm-scale geometry of volcanic fissures at depths of up to 25 m. VolcanoBot was deployed at the 1969 Mauna Ulu fissure system on Kīlauea volcano in Hawai'i. It collected first-of-its-kind data from inside the fissure system. We hypothesized that 1) fissure sinuosity should decrease with depth, 2) irregularity should be persistent with depth, 3) any blockages in the conduit should occur at the narrowest points, and 4) the fissure should narrow with depth until it is too narrow for VolcanoBot to pass or is plugged with solidified lava. Our field campaigns did not span enough lateral or vertical area to test sinuosity. The preliminary data indicate that 1) there were many irregularities along fissures at depth, 2) blockages occurred, but not at obviously narrow locations, and 3) the conduit width remained a consistent 0.4-0.5 m for most of the upper 10 m that we analyzed.

  12. Measurements of trilinear gauge boson couplings

    SciTech Connect

    Abbott, B.

    1997-10-01

    Direct measurements of the trilinear gauge boson couplings by the D0 collaboration at Fermilab are reported. Limits on the anomalous couplings were obtained at a 95% CL from four diboson production processes: W{gamma} production with the W boson decaying to e{nu} or {mu}{nu}, WW production with both of the W bosons decaying to e{nu} or {mu}{nu}, WW/WZ production with one W boson decaying to e{nu} and the other W or Z boson decaying to two jets, and Z{gamma} production with the Z boson decaying to ee, {mu}{mu}, or {nu}{nu}. Limits were also obtained from a combined fit to W{gamma}, WW {yields} dileptons and WW/WZ {yields} e{nu}jj data samples.

  13. An Introduction to Boson-Sampling

    NASA Astrophysics Data System (ADS)

    Gard, Bryan T.; Motes, Keith R.; Olson, Jonathan P.; Rohde, Peter P.; Dowling, Jonathan P.

    2015-06-01

    Boson-sampling is a simplified model for quantum computing that may hold the key to implementing the first ever post-classical quantum computer. Boson-sampling is a non-universal quantum computer that is significantly more straightforward to build than any universal quantum computer proposed so far. We begin this chapter by motivating boson-sampling and discussing the history of linear optics quantum computing. We then summarize the boson-sampling formalism, discuss what a sampling problem is, explain why boson-sampling is easier than linear optics quantum computing, and discuss the Extended Church-Turing thesis. Next, sampling with other classes of quantum optical states is analyzed. Finally, we discuss the feasibility of building a boson-sampling device using existing technology.
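Boson-sampling output probabilities are governed by matrix permanents of submatrices of the interferometer unitary, and the permanent's classical hardness underlies the whole argument. A brute-force evaluator via Ryser's formula, feasible only for small matrices, illustrates the object at the heart of the problem.

```python
import itertools
import numpy as np

def permanent(M):
    """Permanent of a square matrix via Ryser's formula, O(2^n * n^2).
    For boson-sampling, |permanent|^2 of a submatrix of the interferometer
    unitary is proportional to an output probability."""
    n = M.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            # Product over rows of the row-sums restricted to this column subset
            total += (-1) ** r * np.prod(M[:, list(cols)].sum(axis=1))
    return (-1) ** n * total
```

The exponential cost of this evaluation, even with Ryser's speed-up over the naive n! sum, is exactly why an efficient classical simulation of boson-sampling is not believed to exist.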

  14. Diffractive Higgs boson photoproduction in {gamma}p process

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2008-12-01

    We explore an alternative process for diffractive Higgs boson production in peripheral pp collisions arising from double Pomeron exchange in photon-proton interaction. We introduce the impact factor formalism in order to enable the gluon ladder exchange in the photon-proton subprocess, and to permit central Higgs production. The event rate for diffractive Higgs production in central rapidity is estimated to be about 0.6 pb at Tevatron and LHC energies. This result is higher than predictions from other approaches of diffractive Higgs production, showing that the alternative production process leads to an enhanced signal for the detection of the Higgs boson at hadron colliders. Our results are compared with those obtained from a similar approach proposed by the Durham Group. In this way, we may examine future developments in its application to pp and AA collisions.

  15. An automatic approach for rice mapping in temperate region using time series of MODIS imagery: first results for Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Boschetti, M.; Nelson, A.; Manfrom, G.; Brivio, P. A.

    2012-04-01

    Timely and accurate information on crop typology and status is required to support suitable actions to better manage agricultural production and reduce food insecurity. More specifically, regional crop masks and phenological information are important inputs for spatialized crop growth models in yield forecasting systems. Digital cartographic data available at global/regional scale, such as GLC2000, GLOBCOVER or MODIS land cover products (MOD12), are often not adequate for this crop modeling application. For this reason, there is a need to develop and test methods that can provide such information for specific crops using automated classification techniques. In this framework we focused our analysis on the detection of rice cultivation areas, owing to the importance of this crop. Rice is a staple food for half of the world's population (FAO 2004). Over 90% of the world's rice is produced and consumed in Asia, and the region is home to 70% of the world's poor, most of whom depend on rice for their livelihoods and/or food security. Several initiatives are being promoted at the international level to provide maps of rice cultivated areas in South and South East Asia using different approaches available in the literature for rice mapping in tropical regions. We contribute to these efforts by proposing an automatic method to detect rice cultivated areas in temperate regions exploiting MODIS 8-day composites of surface reflectance at 500 m spatial resolution (MOD09A1 product). Temperate rice is cultivated worldwide in more than 20 countries, covering around 16 M ha for a total production of about 65 M tons of paddy per year. The proposed method is based on a common approach in the literature that first identifies flood conditions that can be related to rice agronomic practice and then checks for vegetation growth. The method presents innovative aspects related both to flood detection, exploiting Short Wave Infrared spectral information, and to crop growth monitoring analyzing

  16. LANDSCAPE ECOLOGY APPROACHES FOR DETECTING, MAPPING, AND ASSESSING THE VULNERABILITY OF DEPRESSIONAL WETLANDS

    EPA Science Inventory

    U.S. EPA is using a landscape ecology approach to assess the ecological/hydrologic functions and related human values of depressional wetlands along coastal Texas, considered to be vulnerable to human disturbance. Many of those wetlands may be at high risk because of recent court...

  17. Geospatial Approach to Regional Mapping of Research Library Holdings: Use of Arcinfo at IRANDOC

    ERIC Educational Resources Information Center

    Sedighi, Mehri-e-

    2007-01-01

    Purpose: The purpose of this paper is to provide a report on the application of a Geographic Information System (GIS), ArcInfo, in the cataloguing of geosciences documents held by IRANDOC. Design/methodology/approach: The steps involved in the application are described: gathering the data and required input including the attribute and spatial…

  18. Mapping trees outside forests using high-resolution aerial imagery: a comparison of pixel- and object-based classification approaches.

    PubMed

    Meneguzzo, Dacia M; Liknes, Greg C; Nelson, Mark D

    2013-08-01

    Discrete trees and small groups of trees in nonforest settings are considered an essential resource around the world and are collectively referred to as trees outside forests (ToF). ToF provide important functions across the landscape, such as protecting soil and water resources, providing wildlife habitat, and improving farmstead energy efficiency and aesthetics. Despite the significance of ToF, forest and other natural resource inventory programs and geospatial land cover datasets that are available at a national scale do not include comprehensive information regarding ToF in the United States. Additional ground-based data collection and acquisition of specialized imagery to inventory these resources are expensive alternatives. As a potential solution, we identified two remote sensing-based approaches that use free high-resolution aerial imagery from the National Agriculture Imagery Program (NAIP) to map all tree cover in an agriculturally dominant landscape. We compared the results obtained using an unsupervised per-pixel classifier (independent component analysis-[ICA]) and an object-based image analysis (OBIA) procedure in Steele County, Minnesota, USA. Three types of accuracy assessments were used to evaluate how each method performed in terms of: (1) producing a county-level estimate of total tree-covered area, (2) correctly locating tree cover on the ground, and (3) how tree cover patch metrics computed from the classified outputs compared to those delineated by a human photo interpreter. Both approaches were found to be viable for mapping tree cover over a broad spatial extent and could serve to supplement ground-based inventory data. The ICA approach produced an estimate of total tree cover more similar to the photo-interpreted result, but the output from the OBIA method was more realistic in terms of describing the actual observed spatial pattern of tree cover.

  20. Integer quantum Hall effect for bosons.

    PubMed

    Senthil, T; Levin, Michael

    2013-01-25

    A simple physical realization of an integer quantum Hall state of interacting two dimensional bosons is provided. This is an example of a symmetry-protected topological (SPT) phase which is a generalization of the concept of topological insulators to systems of interacting bosons or fermions. Universal physical properties of the boson integer quantum Hall state are described and shown to correspond with those expected from general classifications of SPT phases.

  1. A Search for Dark Higgs Bosons

    SciTech Connect

    Lees, J.P.

    2012-06-08

    Recent astrophysical and terrestrial experiments have motivated the proposal of a dark sector with GeV-scale gauge boson force carriers and new Higgs bosons. We present a search for a dark Higgs boson using 516 fb{sup -1} of data collected with the BABAR detector. We do not observe a significant signal and we set 90% confidence level upper limits on the product of the Standard Model-dark sector mixing angle and the dark sector coupling constant.

  2. Probing anomalous gauge boson couplings at LEP

    SciTech Connect

    Dawson, S.; Valencia, G.

    1994-12-31

    We bound anomalous gauge boson couplings using LEP data for the Z {yields} {bar f}f partial widths. We use an effective field theory formalism to compute the one-loop corrections resulting from non-standard model three and four gauge boson vertices. We find that measurements at LEP constrain the three gauge boson couplings at a level comparable to that obtainable at LEPII.

  3. The Higgs Boson for the Masses?

    SciTech Connect

    Quigg, Chris

    2012-04-04

    The Higgs boson is the object of one of the greatest campaigns in the history of particle physics and a pop-culture icon. But what is a Higgs boson, and what would we like it to do for us? What will we understand after a discovery that we don't understand before? How would the world be different if nothing did the job of the Higgs boson? We will explore all these questions and more through demonstration, simulation, and audience participation.

  4. Fermionic Subspaces of the Bosonic String

    NASA Astrophysics Data System (ADS)

    Chattaraputi, A.; Englert, F.; Houart, L.; Taormina, A.

    A universal symmetric truncation of the bosonic string Hilbert space yields all known closed fermionic string theories in ten dimensions, their D-branes and their open descendants. We highlight the crucial role played by group theory and two-dimensional conformal field theory in the construction and emphasize the predictive power of the truncation. Such circumstantial evidence points towards the existence of a mechanism which generates space-time fermions out of bosons dynamically within the framework of bosonic string theory.

  6. Fat Jets for a Light Higgs Boson

    SciTech Connect

    Plehn, Tilman; Salam, Gavin P.; Spannowsky, Michael

    2010-03-19

    At the LHC associated top quark and Higgs boson production with a Higgs boson decay to bottom quarks has long been a heavily disputed search channel. Recently, it has been found not to be viable. We show how it can be observed by tagging massive Higgs bosons and top jets. For this purpose we construct boosted top and Higgs taggers for standard-model processes in a complex QCD environment.

  7. An indicators' based approach to Drought and Water Scarcity Risk Mapping in Pinios River Basin, Greece.

    NASA Astrophysics Data System (ADS)

    Kossida, Maggie; Mimikou, Maria

    2013-04-01

    Assessing the vulnerability and the associated risk to water scarcity and drought is a complex multi-factor problem. The underlying exposure to climatic stresses may be similar even in quite different conditions, yet the vulnerability and prevailing risk are a function of the socio-economic state, the current policy and institutional setting, the adaptive capacity of the affected area and population, and the response strategies adopted (Kossida et al., 2012). Although flood risk assessment has been elaborated under the EU Floods Directive, there is currently a lack of analytical frameworks for the definition and assessment of drought and water scarcity related risk at the European level. This can partially be attributed to the inherent complexity of such phenomena, which lie at the crossroads between physical and anthropogenic drivers and pressures, operating on many scales, and with a variety of impacts on many sectors. The quantification of the various components of drought and water scarcity risk is challenging since data present limitations, relevant indicators that can represent or proxy the various components are still not clearly defined, while their relevant weights need to be determined in view of the prevailing regional conditions. The current study in Pinios River Basin, an area highly impacted by drought and water scarcity, proposes a methodology for drought and water scarcity risk assessment using blended indicators. Using the Standardized Precipitation Index (SPI) as a base drought indicator, relevant sub-indicators reflecting the magnitude, severity, duration and recurrence of drought events from 1980 to 2011 have been produced. These sub-indicators have been assigned relevant scores and have been blended into a Drought Vulnerability Index (DVI) using different weights derived from an analytical hierarchy process (AHP). 
The resulting DVI map has then been blended with additional socio-economic indicators of surface and groundwater exploitation, water deficit
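The analytic hierarchy process step, deriving indicator weights from pairwise comparisons via the principal eigenvector, can be sketched as follows. The comparison matrix is a hypothetical example for three drought sub-indicators, not the expert judgments used for Pinios.

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale (hypothetical judgments
# for three sub-indicators: magnitude, severity, duration)
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 3.0],
              [1/5.0, 1/3.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i_max = np.argmax(np.real(eigvals))
w = np.real(eigvecs[:, i_max])
w = w / w.sum()                        # AHP weights (principal eigenvector)

# Consistency check: CI / RI, with Saaty's random index RI = 0.58 for n = 3
n = A.shape[0]
CI = (np.real(eigvals[i_max]) - n) / (n - 1)
CR = CI / 0.58                         # acceptable if CR < 0.1
```

The resulting weights would then multiply the scored sub-indicators before summing them into the composite DVI.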

  8. Higgs Boson Signatures of MSSM Electroweak Baryogenesis

    SciTech Connect

    Menon, Arjun; Morrissey, David

    2010-02-10

    Electroweak baryogenesis in the MSSM can account for the cosmological baryon asymmetry, but only with a very light scalar top and a SM-like Higgs boson. We investigate the effects of this light scalar top on Higgs boson production and decay. Relative to the standard model Higgs boson, we find a large enhancement of the Higgs production rate through gluon fusion and a suppression of the Higgs branching fraction into photon pairs. These modifications in the properties of the Higgs boson are large enough that they can potentially be tested at the Tevatron and the LHC.

  9. Physical activity, physical fitness and academic achievement in adolescents: a self-organizing maps approach.

    PubMed

    Pellicer-Chenoll, Maite; Garcia-Massó, Xavier; Morales, Jose; Serra-Añó, Pilar; Solana-Tramunt, Mònica; González, Luis-Millán; Toca-Herrera, José-Luis

    2015-06-01

    The relationship among physical activity, physical fitness and academic achievement in adolescents has been widely studied; however, controversy concerning this topic persists. The methods used thus far to analyse the relationship between these variables have included mostly traditional lineal analysis according to the available literature. The aim of this study was to perform a visual analysis of this relationship with self-organizing maps and to monitor the subject's evolution during the 4 years of secondary school. Four hundred and forty-four students participated in the study. The physical activity and physical fitness of the participants were measured, and the participants' grade point averages were obtained from the five participant institutions. Four main clusters representing two primary student profiles with few differences between boys and girls were observed. The clustering demonstrated that students with higher energy expenditure and better physical fitness exhibited lower body mass index (BMI) and higher academic performance, whereas those adolescents with lower energy expenditure exhibited worse physical fitness, higher BMI and lower academic performance. With respect to the evolution of the students during the 4 years, ∼25% of the students originally clustered in a negative profile moved to a positive profile, and there was no movement in the opposite direction. PMID:25953972
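A minimal self-organizing map of the kind used for such cluster visualization can be sketched as follows; the grid size, learning schedule and two-cluster data are illustrative assumptions, not the study's configuration.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=20, seed=0):
    """Minimal SOM: for each sample, find the best-matching unit (BMU) and
    pull it and its grid neighbours toward the sample, with a Gaussian
    neighbourhood and learning rate that both shrink over time."""
    rng = np.random.default_rng(seed)
    gy, gx = grid
    W = rng.standard_normal((gy * gx, data.shape[1]))
    coords = np.array([(i, j) for i in range(gy) for j in range(gx)], float)
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)                 # decaying learning rate
        sigma = max(1.0 * (1 - t / epochs), 0.3)    # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))
            h = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1)
                       / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)
    return W.reshape(gy, gx, -1)
```

After training, each subject maps to the unit whose weight vector best matches their profile, and nearby units on the grid hold similar profiles, which is what makes the four-cluster visualization in the study possible.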

  10. A hybrid model for mapping simplified seismic response via a GIS-metamodel approach

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Bonito, L.; Revellino, P.; Guerriero, L.; Guadagno, F. M.

    2014-02-01

    A hybrid model, combining GIS and metamodel (model of a model) procedures, is introduced with the aim of estimating the 1-D spatial seismic site response. Inputs and outputs are provided and processed by means of an appropriate GIS model, named the GIS Cubic Model (GCM), which discretizes the seismic underground half-space in a pseudo-three-dimensional way. The GCM consists of a layered parametric structure aimed at resolving a predictive metamodel by means of pixel-by-pixel vertical computing. The metamodel, a bilinear-polynomial function, reproduces the classic shape of the spectral acceleration response in relation to the main physical parameters that characterize the spectrum itself: (i) the average shear wave velocity of the shallow layer, (ii) the fundamental period and (iii) the period at which the spatial spectral response is required. The metamodel is calibrated on theoretical spectral accelerations for the likely local Vs profiles, which are obtained using the Monte Carlo simulation technique on the basis of the GCM information. Thus, via the GCM structure and the metamodel, the hybrid model provides maps of normalized acceleration response spectra. The hybrid model was applied and tested on the built-up area of the San Giorgio del Sannio village, located in a high-risk seismic zone of Southern Italy.

  11. High-throughput approaches to understanding gene function and mapping network architecture in bacteria.

    PubMed

    Brochado, Ana Rita; Typas, Athanasios

    2013-04-01

    Advances in sequencing technology have provided an unprecedented view of bacterial diversity, along with a daunting number of novel genes. Within this new reality lies the challenge of developing large-scale approaches to assign function to the new genes and place them in pathways. Here, we highlight recent advances on this front, focusing on how high-throughput gene-gene, gene-drug and drug-drug interactions can yield functional and mechanistic inferences in bacteria. PMID:23403119

  12. A tetrahedron-based endmember selection approach for urban impervious surface mapping.

    PubMed

    Wang, Wei; Yao, Xinfeng; Zhai, Junpeng; Ji, Minhe

    2014-01-01

    The pixel purity index (PPI) and two-dimensional (2-D) scatter plots are two popular techniques for endmember extraction in remote sensing spectral mixture analysis, yet both suffer from one major drawback, that is, the selection of a final set of endmembers has to endure a cumbersome process of iterative visual inspection and human intervention, especially when a spectrally-complex urban scene is involved. Within the conceptual framework of a V-H-L-S (vegetation-high albedo-low albedo-soil) model, which is expanded from the classic V-I-S (vegetation-impervious surface-soil) model, a tetrahedron-based endmember selection approach combined with a multi-objective optimization genetic algorithm (MOGA) was designed to identify urban endmembers from multispectral imagery. The tetrahedron defining the enclosing volume of MNF-transformed pixels in a three-dimensional (3-D) space was algorithmically sought, so that the tetrahedral vertices can ideally match the four components of the adopted model. A case study with Landsat Enhanced Thematic Mapper Plus (ETM+) satellite imagery in Shanghai, China was conducted to verify the validity of the method. The method performance was compared with those of the traditional PPI and 2-D scatter plots approaches. The results indicated that the tetrahedron-based endmember selection approach performed better in both accuracy and ease of identification for urban surface endmembers owing to the 3-D visualization analysis and use of the MOGA.

  13. A tetrahedron-based endmember selection approach for urban impervious surface mapping.

    PubMed

    Wang, Wei; Yao, Xinfeng; Zhai, Junpeng; Ji, Minhe

    2014-01-01

    The pixel purity index (PPI) and two-dimensional (2-D) scatter plots are two popular techniques for endmember extraction in remote sensing spectral mixture analysis, yet both suffer from one major drawback, that is, the selection of a final set of endmembers has to endure a cumbersome process of iterative visual inspection and human intervention, especially when a spectrally-complex urban scene is involved. Within the conceptual framework of a V-H-L-S (vegetation-high albedo-low albedo-soil) model, which is expanded from the classic V-I-S (vegetation-impervious surface-soil) model, a tetrahedron-based endmember selection approach combined with a multi-objective optimization genetic algorithm (MOGA) was designed to identify urban endmembers from multispectral imagery. The tetrahedron defining the enclosing volume of MNF-transformed pixels in a three-dimensional (3-D) space was algorithmically sought, so that the tetrahedral vertices can ideally match the four components of the adopted model. A case study with Landsat Enhanced Thematic Mapper Plus (ETM+) satellite imagery in Shanghai, China was conducted to verify the validity of the method. The method performance was compared with those of the traditional PPI and 2-D scatter plots approaches. The results indicated that the tetrahedron-based endmember selection approach performed better in both accuracy and ease of identification for urban surface endmembers owing to the 3-D visualization analysis and use of the MOGA. PMID:24892938

  14. A Tetrahedron-Based Endmember Selection Approach for Urban Impervious Surface Mapping

    PubMed Central

    Wang, Wei; Yao, Xinfeng; Zhai, Junpeng; Ji, Minhe

    2014-01-01

    The pixel purity index (PPI) and two-dimensional (2-D) scatter plots are two popular techniques for endmember extraction in remote sensing spectral mixture analysis, yet both suffer from a major drawback: selecting the final set of endmembers requires a cumbersome process of iterative visual inspection and human intervention, especially when a spectrally complex urban scene is involved. Within the conceptual framework of a V-H-L-S (vegetation-high albedo-low albedo-soil) model, which expands the classic V-I-S (vegetation-impervious surface-soil) model, a tetrahedron-based endmember selection approach combined with a multi-objective optimization genetic algorithm (MOGA) was designed to identify urban endmembers from multispectral imagery. The tetrahedron defining the enclosing volume of minimum noise fraction (MNF)-transformed pixels in a three-dimensional (3-D) space was sought algorithmically, so that the tetrahedral vertices would ideally match the four components of the adopted model. A case study with Landsat Enhanced Thematic Mapper Plus (ETM+) satellite imagery of Shanghai, China was conducted to verify the validity of the method, and its performance was compared with that of the traditional PPI and 2-D scatter plot approaches. The results indicated that the tetrahedron-based endmember selection approach performed better in both accuracy and ease of identification for urban surface endmembers, owing to the 3-D visualization analysis and the use of the MOGA. PMID:24892938

  15. Nodal predictive error model and Bayesian approach for thermal diffusivity and heat source mapping

    NASA Astrophysics Data System (ADS)

    Massard, H.; Fudym, Olivier; Orlande, H. R. B.; Batsale, J. C.

    2010-07-01

    This article aims at solving a two-dimensional inverse heat conduction problem in order to retrieve both the thermal diffusivity and the heat source field in a thin plate. A spatially random heat pulse is applied to the plate and the thermal response is analysed. The inverse approach is based on the minimisation of a nodal predictive error model, which yields a linear estimation problem. As a result of this approach, the sensitivity matrix is directly filled with experimental data, and thus is partially noisy. Bayesian estimators, such as the maximum a posteriori estimator and a Markov chain Monte Carlo approach (Metropolis-Hastings), are implemented and compared with the ordinary least squares solution. Simulated temperature measurements are used in the inverse analysis. The nodal strategy relies on the availability of temperature measurements with fine spatial resolution and high frequency, typical of modern infrared cameras. The effects of both the measurement errors and the model errors on the inverse problem solution are also analysed.
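
    The estimator comparison above can be illustrated on a toy problem. This is a hedged sketch under strong simplifications (a synthetic one-parameter linear model with known noise level and flat prior), showing ordinary least squares next to a random-walk Metropolis-Hastings posterior mean; the paper's sensitivity matrix is instead built from noisy measured temperatures.

    ```python
    # Sketch: OLS vs. a Metropolis-Hastings posterior mean on y = b*x + noise.
    # Synthetic data; the Bayesian route matters when the design matrix itself
    # is noisy, as in the nodal predictive error model.
    import random, math

    random.seed(0)
    true_b = 2.5
    X = [0.1 * i for i in range(50)]
    y = [true_b * x + random.gauss(0, 0.2) for x in X]

    # OLS for a single coefficient: b = sum(x*y) / sum(x*x)
    b_ols = sum(x * v for x, v in zip(X, y)) / sum(x * x for x in X)

    def log_post(b, sd=0.2):
        # Flat prior, Gaussian likelihood with known noise sd
        return -sum((v - b * x) ** 2 for x, v in zip(X, y)) / (2 * sd ** 2)

    # Random-walk Metropolis-Hastings
    b, samples = 0.0, []
    lp = log_post(b)
    for _ in range(5000):
        cand = b + random.gauss(0, 0.05)
        lp_cand = log_post(cand)
        if math.log(random.random()) < lp_cand - lp:
            b, lp = cand, lp_cand
        samples.append(b)
    b_mcmc = sum(samples[1000:]) / len(samples[1000:])
    assert abs(b_ols - true_b) < 0.1 and abs(b_mcmc - b_ols) < 0.05
    ```

    With a flat prior and Gaussian noise the posterior mean coincides with OLS; the Bayesian machinery pays off once priors or errors-in-variables enter, which is the paper's setting.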

  16. A Robust Approach for the Background Subtraction Based on Multi-Layered Self-Organizing Maps.

    PubMed

    Gemignani, Giorgio; Rozza, Alessandro

    2016-11-01

    Motion detection in video streams is a challenging task for several computer vision applications. Indeed, segmenting moving and static elements in the scene increases the efficiency of several challenging tasks, such as human-computer interfaces, robot vision, and intelligent surveillance systems. In this paper, we approach motion detection through a multi-layered artificial neural network, which builds for each background pixel a multi-modal color distribution that evolves over time through self-organization. According to the winner-take-all rule, each layer of the network models an independent state of the background scene in response to external disturbing conditions, such as illumination variations, moving backgrounds, and jittering. As a result, our background subtraction method exhibits high generalization capabilities that, in combination with a post-processing filtering scheme, produce accurate motion segmentation. Moreover, we propose an approach to detect anomalous events (such as camera motion) that require background model re-initialization. We describe our method in full detail and compare it against the most recent background subtraction approaches. Experimental results for video sequences from the 2012 and 2014 CVPR Change Detection data sets demonstrate how our methodology outperforms many state-of-the-art methods in terms of detection rate. PMID:27608458
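
    A toy, single-channel version of the winner-take-all layering can make the mechanism concrete. This sketch is illustrative only (the actual network maintains multi-modal colour distributions per pixel plus post-processing): each pixel keeps several candidate background "layers", the closest layer wins and adapts, and inputs far from every layer are declared foreground.

    ```python
    # Toy winner-take-all background model for one pixel: multiple layers
    # absorb multi-modal backgrounds (e.g. a flickering light), while true
    # foreground stays distant from every layer.
    def classify_and_update(layers, pixel, thresh=30.0, lr=0.05):
        """Return 'background' if some layer matches, else 'foreground'.
        The winning (closest) layer adapts toward the pixel."""
        dists = [abs(l - pixel) for l in layers]
        w = dists.index(min(dists))                # winner-take-all
        if dists[w] < thresh:
            layers[w] += lr * (pixel - layers[w])  # self-organising update
            return "background"
        return "foreground"

    layers = [20.0, 200.0]                  # two background modes (dark / lit)
    assert classify_and_update(layers, 22) == "background"
    assert classify_and_update(layers, 205) == "background"
    assert classify_and_update(layers, 110) == "foreground"  # moving object
    ```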

  17. A Robust Approach for the Background Subtraction Based on Multi-Layered Self-Organizing Maps.

    PubMed

    Gemignani, Giorgio; Rozza, Alessandro

    2016-11-01

    Motion detection in video streams is a challenging task for several computer vision applications. Indeed, segmenting moving and static elements in the scene increases the efficiency of several challenging tasks, such as human-computer interfaces, robot vision, and intelligent surveillance systems. In this paper, we approach motion detection through a multi-layered artificial neural network, which builds for each background pixel a multi-modal color distribution that evolves over time through self-organization. According to the winner-take-all rule, each layer of the network models an independent state of the background scene in response to external disturbing conditions, such as illumination variations, moving backgrounds, and jittering. As a result, our background subtraction method exhibits high generalization capabilities that, in combination with a post-processing filtering scheme, produce accurate motion segmentation. Moreover, we propose an approach to detect anomalous events (such as camera motion) that require background model re-initialization. We describe our method in full detail and compare it against the most recent background subtraction approaches. Experimental results for video sequences from the 2012 and 2014 CVPR Change Detection data sets demonstrate how our methodology outperforms many state-of-the-art methods in terms of detection rate.

  18. Exploring links between juvenile offenders and social disorganization at a large map scale: a Bayesian spatial modeling approach

    NASA Astrophysics Data System (ADS)

    Law, Jane; Quick, Matthew

    2013-01-01

    This paper adopts a Bayesian spatial modeling approach to investigate the distribution of young offender residences in York Region, Southern Ontario, Canada, at the census dissemination area level. Few geographic studies have analyzed offender (as opposed to offense) data at a large map scale (i.e., using a relatively small areal unit of analysis) to minimize aggregation effects. Providing context is social disorganization theory, which hypothesizes that areas with economic deprivation, high population turnover, and high ethnic heterogeneity exhibit social disorganization and are expected to facilitate higher instances of young offenders. Non-spatial and spatial Poisson models indicate that spatial methods are superior to non-spatial models with respect to model fit, and that the index of ethnic heterogeneity, residential mobility (one-year moving rate), and percentage of residents receiving government transfer payments are, respectively, the most significant explanatory variables related to young offender location. These findings provide overwhelming support for social disorganization theory as it applies to offender location in York Region, Ontario. Targeting areas where the prevalence of young offenders can or cannot be explained by social disorganization, identified by decomposing the estimated risk map, is helpful for dealing with juvenile offenders in the region. Results prompt discussion of geographically targeted police services and young offender placement pertaining to risk of recidivism. We discuss possible reasons for differences and similarities between previous findings (which analyzed offense data and/or were conducted at a smaller map scale) and our findings, limitations of our study, and practical outcomes of this research from a law enforcement perspective.
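
    The non-spatial baseline in the comparison above is an ordinary Poisson regression, which can be sketched briefly. Synthetic data and a single covariate are assumed here; the paper's spatial models add area-level random effects on top of this likelihood.

    ```python
    # Sketch: Poisson regression of counts on one standardised covariate,
    # fitted by Newton's method on the log-likelihood. Synthetic data.
    import math, random

    random.seed(1)
    b_true = 0.8
    xs = [random.gauss(0, 1) for _ in range(400)]

    def rpois(lam):
        """Draw a Poisson variate by Knuth's product-of-uniforms method."""
        l, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= l:
                return k
            k += 1

    ys = [rpois(math.exp(b_true * x)) for x in xs]

    b = 0.0
    for _ in range(25):                      # Newton-Raphson iterations
        mu = [math.exp(b * x) for x in xs]
        score = sum(x * (y - m) for x, y, m in zip(xs, ys, mu))
        info = sum(x * x * m for x, m in zip(xs, mu))
        b += score / info
    assert abs(b - b_true) < 0.15
    ```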

  19. A data-driven approach to mapping cortical and subcortical intrinsic functional connectivity along the longitudinal hippocampal axis.

    PubMed

    Blessing, Esther M; Beissner, Florian; Schumann, Andy; Brünner, Franziska; Bär, Karl-Jürgen

    2016-02-01

    The hippocampus (HPC) is functionally heterogeneous along the longitudinal anterior-posterior axis. In rodent models, gene expression maps define at least three discrete longitudinal subregions, which also differ in function and in anatomical connectivity with the rest of the brain. In humans, equivalent HPC subregions are less well defined, resulting in a lack of consensus in neuroimaging approaches that limits translational study. This study determined whether a data-driven analysis, namely independent component analysis (ICA), could reproducibly define human HPC subregions and map their respective intrinsic functional connectivity (iFC) with the rest of the brain. Specifically, we performed ICA of resting-state fMRI activity spatially restricted within the HPC, to determine the configuration and reproducibility of functional HPC components. Using dual regression, we then performed multivariate analysis of iFC between the resulting HPC components and the whole brain, including detailed connectivity with the hypothalamus, a functionally important connection not yet characterized in humans. We found that hippocampal ICA resulted in highly reproducible, longitudinally discrete components, with greater functional heterogeneity in the anterior HPC, consistent with animal models. Anterior hippocampal components shared iFC with the amygdala, nucleus accumbens, medial prefrontal cortex, posterior cingulate cortex, midline thalamus, and periventricular hypothalamus, whereas posterior hippocampal components shared iFC with the anterior cingulate cortex, retrosplenial cortex, and mammillary bodies. We show that spatially masked hippocampal ICA with dual regression reproducibly identifies functional subregions in the human HPC and maps their respective brain intrinsic connectivity. Hum Brain Mapp 37:462-476, 2016. © 2015 Wiley Periodicals, Inc. PMID:26538342
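
    The dual-regression step can be sketched with synthetic arrays (a generic illustration of the technique, not the study's fMRI pipeline): stage 1 regresses each subject's data on the group spatial maps to obtain component time courses; stage 2 regresses the data on those time courses to obtain subject-specific maps.

    ```python
    # Sketch of dual regression on synthetic (time x voxel) data.
    import numpy as np

    rng = np.random.default_rng(0)
    n_t, n_vox = 120, 300
    maps_group = rng.normal(size=(2, n_vox))      # two group-level spatial maps
    tc_true = rng.normal(size=(n_t, 2))           # their true time courses
    data = tc_true @ maps_group + 0.1 * rng.normal(size=(n_t, n_vox))

    # Stage 1: time courses = data regressed on spatial maps (voxels as samples)
    tc_est, *_ = np.linalg.lstsq(maps_group.T, data.T, rcond=None)
    tc_est = tc_est.T                              # shape (n_t, 2)

    # Stage 2: subject maps = data regressed on the estimated time courses
    maps_est, *_ = np.linalg.lstsq(tc_est, data, rcond=None)

    # Recovered maps correlate strongly with the group maps
    r = np.corrcoef(maps_est[0], maps_group[0])[0, 1]
    assert r > 0.95
    ```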

  20. Multi-Sensor Approach to Mapping Snow Cover Using Data From NASA's EOS Aqua and Terra Spacecraft

    NASA Astrophysics Data System (ADS)

    Armstrong, R. L.; Brodzik, M. J.

    2003-12-01

    Snow cover is an important variable for climate and hydrologic models due to its effects on energy and moisture budgets. Over the past several decades, both optical and passive microwave satellite data have been utilized for snow mapping at the regional to global scale. For the period 1978 to 2002, we have shown earlier that both passive microwave and visible data sets indicate a similar pattern of inter-annual variability, although the maximum snow extents derived from the microwave data are, depending on season, less than those provided by the visible satellite data, and the visible data typically show higher monthly variability. Snow mapping using optical data is based on the magnitude of the surface reflectance, while microwave data can be used to identify snow cover because the microwave energy emitted by the underlying soil is scattered by the snow grains, resulting in a sharp decrease in brightness temperature and a characteristic negative spectral gradient. Our previous work has defined the respective advantages and disadvantages of these two types of satellite data for snow cover mapping, and it is clear that a blended product is optimal. We present a multi-sensor approach to snow mapping based on both historical data and data from current NASA EOS sensors. For the period 1978 to 2002 we combine data from the NOAA weekly snow charts with passive microwave data from the SMMR and SSM/I brightness temperature record. For the current and future time period we blend MODIS and AMSR-E data sets. An example of validation at the brightness temperature level is provided through the comparison of AMSR-E with data from the well-calibrated heritage SSM/I sensor over a large homogeneous snow-covered surface (Dome C, Antarctica). Prototype snow cover maps from AMSR-E compare well with maps derived from SSM/I. Our current blended product is being developed in the 25 km EASE-Grid, while the MODIS data being used are in the Climate Modeling Grid (CMG) at approximately 5 km.
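
    A hedged sketch of how such a blend can be expressed as a per-pixel decision rule; the thresholds below are illustrative placeholders, not the product's calibrated values.

    ```python
    # Illustrative blended snow decision: optical NDSI flags snow when
    # reflective contrast is high; passive microwave flags snow via the
    # negative spectral gradient (Tb at 37 GHz below Tb at 19 GHz).
    def ndsi(green, swir):
        return (green - swir) / (green + swir)

    def blended_snow(green, swir, tb19, tb37, cloudy):
        optical = (not cloudy) and ndsi(green, swir) > 0.4
        microwave = (tb19 - tb37) > 5.0   # scattering by snow grains
        # Optical wins when usable (finer resolution); microwave fills cloud gaps
        return optical or (cloudy and microwave)

    assert blended_snow(0.8, 0.1, 250.0, 230.0, cloudy=False)   # clear, snowy
    assert blended_snow(0.0, 0.0, 255.0, 240.0, cloudy=True)    # cloud gap
    assert not blended_snow(0.3, 0.25, 260.0, 259.0, cloudy=False)  # bare
    ```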

  1. Quantum dynamics of relativistic bosons through nonminimal vector square potentials

    NASA Astrophysics Data System (ADS)

    de Oliveira, Luiz P.

    2016-09-01

    The dynamics of relativistic bosons (scalar and vectorial) through nonminimal vector square (well and barrier) potentials is studied in the Duffin-Kemmer-Petiau (DKP) formalism. We show that the problem can be mapped into effective Schrödinger equations for a component of the DKP spinor. An oscillatory transmission coefficient is found, and total reflection is also observed. Additionally, the energy spectrum of bound states is obtained and reveals the Schiff-Snyder-Weinberg effect: under specific conditions, the potential lodges bound states of both particles and antiparticles.
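
    The oscillatory transmission can be reproduced numerically from the textbook square-barrier result for an effective Schrödinger equation of the kind the DKP problem is mapped into (a sketch in units ℏ = 2m = 1, with illustrative parameters):

    ```python
    # Transmission through a square barrier of height V0 and width a, for
    # E > V0: T oscillates with a and reaches T = 1 at resonances.
    import math

    def transmission(E, V0, a):
        """Standard result: T = 1 / (1 + V0^2 sin^2(k2 a) / (4 E (E - V0)))."""
        k2 = math.sqrt(E - V0)
        s = math.sin(k2 * a)
        return 1.0 / (1.0 + (V0 ** 2 * s ** 2) / (4 * E * (E - V0)))

    E, V0 = 5.0, 4.0
    Ts = [transmission(E, V0, a / 10) for a in range(1, 200)]
    assert max(Ts) > 0.99     # resonant (near-total) transmission occurs
    assert min(Ts) < 0.6      # ... with partial reflection in between
    ```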

  2. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks: hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders on time-series data. The proposed approach consists of a two-step process. First, an efficient multi-user detection method is obtained by employing the recently introduced complexity-minimization approach as a generalization of standard ICA. Second, an unsupervised-learning neural network architecture based on Kohonen's Self-Organizing Map is identified for potential functional clustering. These two steps, working together adaptively, provide a pseudo-real-time novelty detection attribute to supplement current intrusion detection statistical methodology.
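
    A minimal winner-take-all sketch conveys the clustering half of the pipeline (illustrative only; a full Kohonen SOM also updates map neighbours, and the paper pairs it with complexity-minimization ICA): prototypes drift toward clusters of normal traffic features, so a novel input that lands far from every prototype can be flagged.

    ```python
    # Kohonen-style competitive learning on a 1-D feature (toy): the winner
    # prototype adapts toward each sample; novelty = distance to nearest
    # learned prototype.
    import random

    random.seed(2)
    weights = [random.uniform(0, 10) for _ in range(4)]   # 4 prototype units

    def train(samples, epochs=40, lr=0.3):
        for _ in range(epochs):
            for x in samples:
                bmu = min(range(len(weights)),
                          key=lambda i: abs(weights[i] - x))
                weights[bmu] += lr * (x - weights[bmu])    # winner adapts

    normal = [1.0, 1.2, 0.9, 8.0, 8.3, 7.9]   # two clusters of normal traffic
    train(normal)
    baseline = min(abs(w - 1.1) for w in weights)   # near a learned prototype
    novelty = min(abs(w - 20.0) for w in weights)   # attack-like outlier
    assert baseline < 1.0 and novelty > 5.0
    ```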

  3. Reconstructing mitochondrial genomes directly from genomic next-generation sequencing reads—a baiting and iterative mapping approach

    PubMed Central

    Hahn, Christoph; Bachmann, Lutz; Chevreux, Bastien

    2013-01-01

    We present an in silico approach for the reconstruction of complete mitochondrial genomes of non-model organisms directly from next-generation sequencing (NGS) data—mitochondrial baiting and iterative mapping (MITObim). The method is straightforward even if only (i) distantly related mitochondrial genomes or (ii) mitochondrial barcode sequences are available as starting-reference sequences or seeds, respectively. We demonstrate the efficiency of the approach in case studies using real NGS data sets of the two monogenean ectoparasite species Gyrodactylus thymalli and Gyrodactylus derjavinoides, including their respective teleost hosts, European grayling (Thymallus thymallus) and Rainbow trout (Oncorhynchus mykiss). MITObim appeared superior to existing tools in terms of accuracy, runtime and memory requirements, and fully automatically recovered mitochondrial genomes exceeding 99.5% accuracy from total genomic DNA derived NGS data sets in <24 h using a standard desktop computer. The approach overcomes the limitations of traditional strategies for obtaining mitochondrial genomes for species with little or no mitochondrial sequence information at hand and represents a fast and highly efficient in silico alternative to laborious conventional strategies relying on initial long-range PCR. We furthermore demonstrate the applicability of MITObim for metagenomic/pooled data sets using simulated data. MITObim is an easy-to-use tool even for biologists with modest bioinformatics experience. The software is made available as an open-source pipeline under the MIT license at https://github.com/chrishah/MITObim. PMID:23661685
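
    The bait-and-iterate loop can be caricatured in a few lines. This toy bears no resemblance to the real MIRA-based implementation; it only shows the principle: start from a short seed, pull in reads that overlap the current contig tail, extend, and repeat until no read maps.

    ```python
    # Toy baiting-and-iterative-mapping loop over exact string overlaps.
    def extend(contig, reads, k=4):
        """One baiting round: append a read overlapping the contig tail by >= k."""
        for r in sorted(reads):                       # sorted for determinism
            for ov in range(min(len(contig), len(r) - 1), k - 1, -1):
                if contig.endswith(r[:ov]):
                    reads.remove(r)
                    return contig + r[ov:]
        return contig

    genome = "ATGGCCTTAAGGCCAATTCCGG"
    reads = {genome[i:i + 8] for i in range(0, 15, 3)}  # overlapping 8-mers
    contig = genome[:6]                                 # short seed ("bait")
    while True:
        grown = extend(contig, reads)
        if grown == contig:
            break
        contig = grown
    assert genome.startswith(contig) and len(contig) >= 20
    ```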

  4. Experimental Approach to Controllably Vary Protein Oxidation While Minimizing Electrode Adsorption for Boron-Doped Diamond Electrochemical Surface Mapping Applications

    SciTech Connect

    McClintock, Carlee; Hettich, Robert L.

    2013-01-01

    Oxidative protein surface mapping has become a powerful approach for measuring the solvent accessibility of folded protein structures. A variety of techniques exist for generating the key reagent, hydroxyl radicals, for these measurements; however, many of these approaches require radioactive sources or caustic oxidizing chemicals. The purpose of this research was to evaluate and optimize the use of boron-doped diamond (BDD) electrochemistry as a highly accessible tool for producing hydroxyl radicals as a means to induce a controllable level of oxidation in a range of intact proteins. These experiments utilize relatively high flow rates to reduce protein residence time inside the electrochemical flow chamber, along with a unique cell activation approach to improve control over the intact protein oxidation yield. Studies were conducted to evaluate the level of protein adsorption onto the electrode surface. This report demonstrates a robust protocol for the use of BDD electrochemistry and high-performance LC-MS/MS as a high-throughput experimental pipeline for probing higher-order protein structure, and illustrates how it complements predictive computational modeling efforts.

  5. A Robust Approach for a Filter-Based Monocular Simultaneous Localization and Mapping (SLAM) System

    PubMed Central

    Munguía, Rodrigo; Castillo-Toledo, Bernardino; Grau, Antoni

    2013-01-01

    Simultaneous localization and mapping (SLAM) is an important problem to solve in robotics theory in order to build truly autonomous mobile robots. This work presents a novel method for implementing a SLAM system based on a single camera sensor. SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants. In this case, a single camera, which moves freely through its environment, represents the sole sensor input to the system. The sensors have a large impact on the algorithm used for SLAM. Cameras are used more frequently because they provide a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Nevertheless, and unlike range sensors, which provide range and angular information, a camera is a projective sensor providing only angular measurements of image features. Therefore, depth information (range) cannot be obtained in a single step. In this case, special techniques for feature system initialization are needed in order to enable the use of angular sensors (such as cameras) in SLAM systems. The main contribution of this work is a novel and robust scheme for incorporating and measuring visual features in filtering-based monocular SLAM systems. The proposed method is based on a two-step technique intended to exploit all the information available in angular measurements. Unlike previous schemes, the values of the parameters used by the initialization technique are derived directly from the sensor characteristics, thus simplifying the tuning of the system. The experimental results show that the proposed method surpasses the performance of previous schemes. PMID:23823972
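
    The reason a second step is needed for initialization can be shown geometrically: a single bearing constrains a feature only to a ray, while two bearings from known camera positions triangulate it. A minimal 2-D sketch (illustrative geometry, not the paper's scheme):

    ```python
    # Intersect two 2-D rays from known camera centres to recover a feature.
    import math

    def triangulate(c1, theta1, c2, theta2):
        """Solve c1 + t1*d1 = c2 + t2*d2 for the intersection point."""
        d1 = (math.cos(theta1), math.sin(theta1))
        d2 = (math.cos(theta2), math.sin(theta2))
        # 2x2 linear system via Cramer's rule: [d1, -d2] [t1, t2]^T = c2 - c1
        den = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
        rx, ry = c2[0] - c1[0], c2[1] - c1[1]
        t1 = (rx * (-d2[1]) - ry * (-d2[0])) / den
        return (c1[0] + t1 * d1[0], c1[1] + t1 * d1[1])

    feature = (4.0, 3.0)
    c1, c2 = (0.0, 0.0), (2.0, 0.0)
    th1 = math.atan2(3, 4)          # bearing measured from camera 1
    th2 = math.atan2(3, 2)          # bearing measured from camera 2
    x, y = triangulate(c1, th1, c2, th2)
    assert abs(x - 4) < 1e-9 and abs(y - 3) < 1e-9
    ```

    In a filter-based SLAM system the same idea appears as delayed or parallax-aware feature initialization: depth becomes observable only after the camera has moved.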

  6. Hierarchical Bayesian approach for estimating physical properties in spiral galaxies: Age Maps for M74

    NASA Astrophysics Data System (ADS)

    Sánchez Gil, M. Carmen; Berihuete, Angel; Alfaro, Emilio J.; Pérez, Enrique; Sarro, Luis M.

    2015-09-01

    One of the fundamental goals of modern astronomy is to estimate the physical parameters of galaxies from images in different spectral bands. We present a hierarchical Bayesian model for obtaining age maps from images in the Hα line (taken with the Taurus Tunable Filter (TTF)), the ultraviolet band (far UV or FUV, from GALEX) and infrared bands (24, 70 and 160 microns (μm), from Spitzer). As shown in [1], we present the burst ages for young stellar populations in the nearby, nearly face-on galaxy M74. As shown in that previous work, the Hα to FUV flux ratio gives a good relative indicator of very recent star formation history (SFH). As a nascent star-forming region evolves, the Hα line emission declines earlier than the UV continuum, leading to a decrease in the Hα/FUV ratio. Through a specific star-forming galaxy model (Starburst 99, SB99), we can obtain the corresponding theoretical Hα/FUV ratio to compare with our observed flux ratios, and thus estimate the ages of the observed regions. Due to the nature of the problem, it is necessary to propose a model of high complexity to take into account the mean uncertainties and the interrelationship between parameters when the Hα/FUV flux ratio mentioned above is obtained. To address this complexity, we propose a Bayesian hierarchical model, in which a joint probability distribution is defined to determine the parameters (age, metallicity, IMF) from the observed data, in this case the observed Hα/FUV flux ratios. The joint distribution of the parameters is described through an i.i.d. (independent and identically distributed) sample generated through MCMC (Markov chain Monte Carlo) techniques.

  7. A robust approach for a filter-based monocular simultaneous localization and mapping (SLAM) system.

    PubMed

    Munguía, Rodrigo; Castillo-Toledo, Bernardino; Grau, Antoni

    2013-07-03

    Simultaneous localization and mapping (SLAM) is an important problem to solve in robotics theory in order to build truly autonomous mobile robots. This work presents a novel method for implementing a SLAM system based on a single camera sensor. SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants. In this case, a single camera, which moves freely through its environment, represents the sole sensor input to the system. The sensors have a large impact on the algorithm used for SLAM. Cameras are used more frequently because they provide a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Nevertheless, and unlike range sensors, which provide range and angular information, a camera is a projective sensor providing only angular measurements of image features. Therefore, depth information (range) cannot be obtained in a single step. In this case, special techniques for feature system initialization are needed in order to enable the use of angular sensors (such as cameras) in SLAM systems. The main contribution of this work is a novel and robust scheme for incorporating and measuring visual features in filtering-based monocular SLAM systems. The proposed method is based on a two-step technique intended to exploit all the information available in angular measurements. Unlike previous schemes, the values of the parameters used by the initialization technique are derived directly from the sensor characteristics, thus simplifying the tuning of the system. The experimental results show that the proposed method surpasses the performance of previous schemes.

  8. A robust approach for a filter-based monocular simultaneous localization and mapping (SLAM) system.

    PubMed

    Munguía, Rodrigo; Castillo-Toledo, Bernardino; Grau, Antoni

    2013-01-01

    Simultaneous localization and mapping (SLAM) is an important problem to solve in robotics theory in order to build truly autonomous mobile robots. This work presents a novel method for implementing a SLAM system based on a single camera sensor. SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants. In this case, a single camera, which moves freely through its environment, represents the sole sensor input to the system. The sensors have a large impact on the algorithm used for SLAM. Cameras are used more frequently because they provide a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Nevertheless, and unlike range sensors, which provide range and angular information, a camera is a projective sensor providing only angular measurements of image features. Therefore, depth information (range) cannot be obtained in a single step. In this case, special techniques for feature system initialization are needed in order to enable the use of angular sensors (such as cameras) in SLAM systems. The main contribution of this work is a novel and robust scheme for incorporating and measuring visual features in filtering-based monocular SLAM systems. The proposed method is based on a two-step technique intended to exploit all the information available in angular measurements. Unlike previous schemes, the values of the parameters used by the initialization technique are derived directly from the sensor characteristics, thus simplifying the tuning of the system. The experimental results show that the proposed method surpasses the performance of previous schemes. PMID:23823972

  9. High-Resolution Association Mapping of Quantitative Trait Loci: A Population-Based Approach

    PubMed Central

    Fan, Ruzong; Jung, Jeesun; Jin, Lei

    2006-01-01

    In this article, population-based regression models are proposed for high-resolution linkage disequilibrium mapping of quantitative trait loci (QTL). Two regression models, the “genotype effect model” and the “additive effect model,” are proposed to model the association between the markers and the trait locus. The marker can be either diallelic or multiallelic. If only one marker is used, the method is similar to a classical setting by Nielsen and Weir, and the additive effect model is equivalent to the haplotype trend regression (HTR) method by Zaykin et al. If data from two or more markers with phase ambiguity are used in the analysis, the proposed models can be applied to the data directly. By analytical formulas, we show that the genotype effect model can be used to model the additive and dominance effects simultaneously; the additive effect model accounts for the additive effect only. On the basis of the two models, F-test statistics are proposed to test for association between the QTL and markers. By a simulation study, we show that the two models have reasonable type I error rates for a data set of moderate sample size. The noncentrality parameter approximations of the F-test statistics are derived to enable power calculations and comparisons. By a simulation study, it is found that the noncentrality parameter approximations of the F-test statistics work very well. Using the noncentrality parameter approximations, we compare the power of the two models with that of the HTR. In addition, a simulation study is performed to make a comparison on the basis of the haplotype frequencies of 10 SNPs of angiotensin-1 converting enzyme (ACE) genes. PMID:16172503
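
    The additive ("trend") flavour of the test can be sketched on synthetic data: regress a quantitative trait on the allele count (0/1/2) and form the regression F statistic. The paper's models generalise this to multi-allelic markers, genotype (dominance) effects, and phase-ambiguous haplotypes.

    ```python
    # Additive association test: F statistic for a 1-df regression of a
    # quantitative trait on allele counts. Synthetic data with a true effect.
    import random

    random.seed(3)
    n = 300
    geno = [random.choice([0, 1, 1, 2]) for _ in range(n)]     # allele counts
    trait = [0.5 * g + random.gauss(0, 1) for g in geno]        # additive QTL

    gbar = sum(geno) / n
    tbar = sum(trait) / n
    sxx = sum((g - gbar) ** 2 for g in geno)
    sxy = sum((g - gbar) * (t - tbar) for g, t in zip(geno, trait))
    beta = sxy / sxx
    ss_reg = beta * sxy                                         # 1-df model SS
    ss_tot = sum((t - tbar) ** 2 for t in trait)
    F = ss_reg / ((ss_tot - ss_reg) / (n - 2))
    assert F > 5.0    # strong association; under H0, E[F] is about 1
    ```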

  10. SBH and the integration of complementary approaches in the mapping, sequencing, and understanding of complex genomes

    SciTech Connect

    Drmanac, R.; Drmanac, S.; Labat, I.; Vicentic, A.; Gemmell, A.; Stavropoulos, N.; Jarvis, J.

    1992-12-01

    A variant of sequencing by hybridization (SBH) is being developed with the potential to inexpensively determine up to 100 million base pairs per year. The method comprises (1) arraying short clones in 864-well plates; (2) growth of the M13 clones or PCR of the inserts; (3) automated spotting of DNAs by corresponding pin-arrays; (4) hybridization of the dotted samples with 200-3000 ³²P- or ³³P-labeled 6- to 8-mer probes; and (5) scoring hybridization signals using storage phosphor plates. Some 200 7- to 8-mers can provide an inventory of the genes if cDNA clones are hybridized, or can define the order of 2-kb genomic clones, creating physical and structural maps with 100-bp resolution; the distribution of G+C, LINEs, SINEs, and gene families would be revealed. cDNAs that represent new genes, and genomic clones in regions of interest selected by SBH, can be sequenced by a gel method. Uniformly distributed clones from the previous step will be hybridized with 2000-3000 6- to 8-mers. As a result, approximately 50-60% of the genomic regions containing members of large repetitive and gene families and those families represented in GenBank would be completely sequenced. In the less redundant regions, every base pair is expected to be read with 3-4 probes, but the complete sequence cannot be reconstructed. Such partial sequences allow the inference of similarity and the recognition of coding, regulatory, and repetitive sequences, as well as study of evolutionary processes all the way up to species delineation.
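
    The reconstruction principle behind SBH can be caricatured with a k-mer spectrum (purely illustrative; real SBH scores radioactive probe hybridization signals, and repeats limit full reconstruction exactly as the abstract notes):

    ```python
    # Toy SBH: the probe "signals" are just k-mer membership; rebuild a
    # sequence by chaining k-mers whose (k-1)-prefix matches the current tail.
    def kmer_spectrum(seq, k):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def rebuild(spectrum, start, length):
        """Greedily chain k-mers; ambiguity (repeats) stops the walk."""
        seq = start
        while len(seq) < length:
            tail = seq[-(len(start) - 1):]
            nxt = [m for m in spectrum if m.startswith(tail)]
            if len(nxt) != 1:
                return seq        # ambiguous or dead end: repeats defeat SBH
            seq += nxt[0][-1]
        return seq

    target = "ATGCGTACGGA"
    spec = kmer_spectrum(target, 4)
    assert rebuild(spec, target[:4], len(target)) == target
    ```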

  11. SBH and the integration of complementary approaches in the mapping, sequencing, and understanding of complex genomes

    SciTech Connect

    Drmanac, R.; Drmanac, S.; Labat, I.; Vicentic, A.; Gemmell, A.; Stavropoulos, N.; Jarvis, J.

    1992-01-01

    A variant of sequencing by hybridization (SBH) is being developed with the potential to inexpensively determine up to 100 million base pairs per year. The method comprises (1) arraying short clones in 864-well plates; (2) growth of the M13 clones or PCR of the inserts; (3) automated spotting of DNAs by corresponding pin-arrays; (4) hybridization of the dotted samples with 200-3000 ³²P- or ³³P-labeled 6- to 8-mer probes; and (5) scoring hybridization signals using storage phosphor plates. Some 200 7- to 8-mers can provide an inventory of the genes if cDNA clones are hybridized, or can define the order of 2-kb genomic clones, creating physical and structural maps with 100-bp resolution; the distribution of G+C, LINEs, SINEs, and gene families would be revealed. cDNAs that represent new genes, and genomic clones in regions of interest selected by SBH, can be sequenced by a gel method. Uniformly distributed clones from the previous step will be hybridized with 2000-3000 6- to 8-mers. As a result, approximately 50-60% of the genomic regions containing members of large repetitive and gene families and those families represented in GenBank would be completely sequenced. In the less redundant regions, every base pair is expected to be read with 3-4 probes, but the complete sequence cannot be reconstructed. Such partial sequences allow the inference of similarity and the recognition of coding, regulatory, and repetitive sequences, as well as study of evolutionary processes all the way up to species delineation.

  12. Resonances at the LHC beyond the Higgs boson: The scalar/tensor case

    NASA Astrophysics Data System (ADS)

    Kilian, Wolfgang; Ohl, Thorsten; Reuter, Jürgen; Sekulla, Marco

    2016-02-01

    We study, in a bottom-up approach, the theoretically consistent description of additional resonances in the electroweak sector beyond the discovered Higgs boson as simplified models. We focus on scalar and tensor resonances. Our formalism is suited to strongly coupled models but can also be applied to weakly interacting theories. The spurious degrees of freedom of tensor resonances that would lead to bad high-energy behavior are treated using a generalization of the Stückelberg formalism. We calculate scattering amplitudes for vector-boson and Higgs-boson pairs. The high-energy region is regulated by the T-matrix unitarization procedure, leading to amplitudes that are well behaved over the whole phase space. We present numerical results for complete partonic processes that involve resonant vector-boson scattering for the current and upcoming runs of the LHC.
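
    The T-matrix unitarization step can be illustrated for a single partial-wave amplitude: the map a → a/(1 − ia) enforces elastic unitarity, Im a = |a|², no matter how large the input amplitude grows. A toy numerical check (the linearly growing "amplitude" below is illustrative, not the paper's model):

    ```python
    # T-matrix unitarisation of a single partial-wave amplitude a(s):
    # a_T = a / (1 - i a) satisfies Im(a_T) = |a_T|^2 and |a_T| <= 1,
    # even when the naive amplitude violates the unitarity bound.
    s_values = [0.5, 1.0, 5.0, 50.0]
    naive = [0.1 * s for s in s_values]      # grows with energy

    for a in naive:
        aT = a / (1 - 1j * a)                # T-matrix unitarisation
        assert abs(aT.imag - abs(aT) ** 2) < 1e-12   # elastic unitarity
        assert abs(aT) <= 1.0                # stays in the unitarity circle

    assert naive[-1] > 1.0                   # the naive amplitude did not
    ```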

  13. An ensemble pansharpening approach for finer-scale mapping of sugarcane with Landsat 8 imagery

    NASA Astrophysics Data System (ADS)

    Johnson, Brian A.; Scheyvens, Henry; Shivakoti, Binaya R.

    2014-12-01

    We tested the effects of three fast pansharpening methods - Intensity-Hue-Saturation (IHS), Brovey Transform (BT), and Additive Wavelet Transform (AWT) - on sugarcane classification in a Landsat 8 image (bands 1-7), and proposed two ensemble pansharpening approaches (band stacking and band averaging) which combine the pixel-level information of multiple pansharpened images for classification. To test the proposed ensemble pansharpening approaches, we classified “sugarcane” and “other” land cover in the unsharpened Landsat multispectral image, the individual pansharpened images, and the band-stacked and band-averaged ensemble images using Support Vector Machines (SVM), and assessed the classification accuracy of each image. Of the individual pansharpened images, the AWT image achieved higher classification accuracy than the unsharpened image, while the IHS and BT images did not. The band-stacked ensemble images achieved higher classification accuracies than the unsharpened and individual pansharpened images, with the IHS-BT-AWT band-stacked image producing the most accurate classification result, followed by the IHS-BT band-stacked image. The ensemble images containing averaged pixel values from multiple pansharpened images achieved lower classification accuracies than the band-stacked ensemble images, but most still had higher accuracies than the unsharpened and individual pansharpened results. Our results indicate that ensemble pansharpening approaches have the potential to increase classification accuracy, at least for relatively simple classification tasks. Based on the results of the study, we recommend further investigation of ensemble pansharpening for image analysis (e.g. classification and regression tasks) in agricultural and non-agricultural environments.
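    Of the three pansharpening methods compared, the Brovey Transform is the simplest to state: each multispectral band is rescaled by the ratio of the panchromatic value to the band intensity. A per-pixel sketch with invented reflectance values (the study itself operates on full Landsat 8 rasters):

```python
# Per-pixel sketch of the Brovey Transform (BT); values are invented.
def brovey(ms_bands, pan):
    """Rescale each multispectral band by the pan-to-intensity ratio."""
    intensity = sum(ms_bands) / len(ms_bands)
    return [b * pan / intensity for b in ms_bands]

# One pixel: three multispectral band values and the panchromatic value.
sharp = brovey([0.2, 0.4, 0.6], pan=0.5)
print([round(v, 3) for v in sharp])
```

A band-stacked ensemble in the sense of the paper would then concatenate the outputs of several such pansharpening methods into one feature vector per pixel before SVM classification.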

  14. Mapping Seasonal Evapotranspiration and Root Zone Soil Moisture using a Hybrid Modeling Approach over Vineyards

    NASA Astrophysics Data System (ADS)

    Geli, H. M. E.

    2015-12-01

    Estimates of actual crop evapotranspiration (ETa) at field scale over the growing season are required for improving agricultural water management, particularly in water-limited and drought-prone regions. Remote sensing data from multiple platforms such as airborne and Landsat-based sensors can be used to provide these estimates. Combining these data with surface energy balance models can provide ETa estimates at sub-field scale as well as information on vegetation stress and soil moisture conditions. However, the temporal resolution of airborne and Landsat data does not allow for continuous ETa monitoring over the course of the growing season. This study presents the application of a hybrid ETa modeling approach developed for monitoring daily ETa and root zone available water at high spatial resolutions. The hybrid ETa modeling approach couples a thermal-based energy balance model with a water balance-based scheme using data assimilation. The two-source energy balance (TSEB) model is used to estimate instantaneous ETa, which can be extrapolated to daily ETa using a water balance model modified to use the reflectance-based basal crop coefficient for interpolating ETa between airborne and/or Landsat overpass dates. Moreover, since it is a water balance model, the soil moisture profile is also estimated. The hybrid ETa approach is applied over vineyard fields in central California. High-resolution airborne and Landsat imagery were used to drive the hybrid model. These images were collected during periods that represented different vine phenological stages in the 2013 growing season. Estimates of daily ETa and surface energy balance fluxes will be compared with ground-based eddy covariance tower measurements. Estimates of soil moisture at multiple depths will be compared with measurements.

  15. A multivariate approach for mapping fire ignition risk: the example of the National Park of Cilento (southern Italy).

    PubMed

    Guglietta, Daniela; Migliozzi, Antonello; Ricotta, Carlo

    2015-07-01

    Recent advances in fire management led landscape managers to adopt an integrated fire fighting strategy in which fire suppression is supported by prevention actions and by knowledge of local fire history and ecology. In this framework, an accurate evaluation of fire ignition risk and its environmental drivers constitutes a basic step toward the optimization of fire management measures. In this paper, we propose a multivariate method for identifying and spatially portraying fire ignition risk across a complex and heterogeneous landscape such as the National Park of Cilento, Vallo di Diano, and Alburni (southern Italy). The proposed approach consists first in calculating the fire selectivity of several landscape features that are usually related to fire ignition, such as land cover or topography. Next, the fire selectivity values of single landscape features are combined with multivariate segmentation tools. The resulting fire risk map may constitute a valuable tool for optimizing fire prevention strategies and for efficiently allocating fire fighting resources.
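    The fire-selectivity step can be illustrated with a simple ratio index of the kind commonly used for this purpose (assumed here for illustration, not taken from the paper): the share of ignitions occurring in a land-cover class divided by that class's share of the landscape, with values above 1 indicating that fires preferentially ignite in that class. Class names and numbers are invented:

```python
# Hedged sketch of a ratio-based fire-selectivity index; data are invented.
def selectivity(ignitions, areas):
    """Ignition share of each class divided by its landscape-area share."""
    total_ign = sum(ignitions.values())
    total_area = sum(areas.values())
    return {c: (ignitions[c] / total_ign) / (areas[c] / total_area)
            for c in ignitions}

print(selectivity({"shrubland": 30, "forest": 10},
                  {"shrubland": 20.0, "forest": 80.0}))
```

Per-class values like these, computed for several landscape features, are the kind of input a multivariate segmentation can combine into a risk map.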

  16. Localization of causal locus in the genome of the brown macroalga Ectocarpus: NGS-based mapping and positional cloning approaches.

    PubMed

    Billoud, Bernard; Jouanno, Émilie; Nehr, Zofia; Carton, Baptiste; Rolland, Élodie; Chenivesse, Sabine; Charrier, Bénédicte

    2015-01-01

    Mutagenesis is the only process by which unpredicted biological gene functions can be identified. Although several macroalgal developmental mutants have been generated, their causal mutations were never identified, because the necessary experimental tools were not available at the time. Today, progress in macroalgal genomics and judicious choices of suitable genetic models make the identification of mutated genes possible. This article presents a comparative study of two methods aimed at identifying a genetic locus in the brown alga Ectocarpus siliculosus: positional cloning and Next-Generation Sequencing (NGS)-based mapping. Once the necessary preliminary experimental tools were assembled, we tested both analyses on an Ectocarpus morphogenetic mutant. We show how combining the two methods yields a narrower localization. The advantages and drawbacks of these two approaches, as well as their potential transfer to other macroalgae, are discussed.

  17. Exploring the Development of Existing Sex Education Programmes for People with Intellectual Disabilities: An Intervention Mapping Approach

    PubMed Central

    Schaafsma, Dilana; Stoffelen, Joke M T; Kok, Gerjo; Curfs, Leopold M G

    2013-01-01

    Background People with intellectual disabilities face barriers that affect their sexual health. Sex education programmes have been developed by professionals working in the field of intellectual disabilities with the aim of overcoming these barriers. The aim of this study was to explore the development of these programmes. Methods Sex education programmes geared to people with intellectual disabilities were examined in the context of the Intervention Mapping protocol. Data were obtained via interviews with the programme developers. Results All programmes lack specific programme outcomes, do not have a theoretical basis, did not involve members of relevant groups in the development process and lack systematic evaluation. Conclusions Based on our findings and the literature, we conclude that these programmes are unlikely to be effective. Future programmes should be developed using a more systematic, theory- and evidence-based approach. PMID:23280605

  18. A Multivariate Approach for Mapping Fire Ignition Risk: The Example of the National Park of Cilento (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Guglietta, Daniela; Migliozzi, Antonello; Ricotta, Carlo

    2015-07-01

    Recent advances in fire management led landscape managers to adopt an integrated fire fighting strategy in which fire suppression is supported by prevention actions and by knowledge of local fire history and ecology. In this framework, an accurate evaluation of fire ignition risk and its environmental drivers constitutes a basic step toward the optimization of fire management measures. In this paper, we propose a multivariate method for identifying and spatially portraying fire ignition risk across a complex and heterogeneous landscape such as the National Park of Cilento, Vallo di Diano, and Alburni (southern Italy). The proposed approach consists first in calculating the fire selectivity of several landscape features that are usually related to fire ignition, such as land cover or topography. Next, the fire selectivity values of single landscape features are combined with multivariate segmentation tools. The resulting fire risk map may constitute a valuable tool for optimizing fire prevention strategies and for efficiently allocating fire fighting resources.

  19. Intervention Mapping as a Participatory Approach to Developing an HIV prevention Intervention in Rural African American Communities

    PubMed Central

    Corbie-Smith, Giselle; Akers, Aletha; Blumenthal, Connie; Council, Barbara; Wynn, Mysha; Muhammad, Melvin; Stith, Doris

    2011-01-01

    Southeastern states are among the hardest hit by the HIV epidemic in this country, and racial disparities in HIV rates are high in this region. This is particularly true in our communities of interest in rural eastern North Carolina. Although most recent efforts to prevent HIV attempt to address multiple contributing factors, we have found few multilevel HIV interventions that have been developed, tailored or tested in rural communities for African Americans. We describe how Project GRACE integrated Intervention Mapping (IM) methodology with community based participatory research (CBPR) principles to develop a multi-level, multi-generational HIV prevention intervention. IM was carried out in a series of steps from review of relevant data through producing program components. Through the IM process, all collaborators agreed that we needed a family-based intervention involving youth and their caregivers. We found that the structured approach of IM can be adapted to incorporate the principles of CBPR. PMID:20528128

  20. Diffusion-Based Density-Equalizing Maps: an Interdisciplinary Approach to Visualizing Homicide Rates and Other Georeferenced Statistical Data

    NASA Astrophysics Data System (ADS)

    Mazzitello, Karina I.; Candia, Julián

    2012-12-01

    In every country, public and private agencies allocate extensive funding to collect large-scale statistical data, which in turn are studied and analyzed in order to determine local, regional, national, and international policies regarding all aspects relevant to the welfare of society. One important aspect of that process is the visualization of statistical data with embedded geographical information, which most often relies on archaic methods such as maps colored according to graded scales. In this work, we apply nonstandard visualization techniques based on physical principles. We illustrate the method with recent statistics on homicide rates in Brazil and their correlation to other publicly available data. This physics-based approach provides a novel tool that can be used by interdisciplinary teams investigating statistics and model projections in a variety of fields such as economics and gross domestic product research, public health and epidemiology, sociodemographics, political science, business and marketing, and many others.

  1. Localization of causal locus in the genome of the brown macroalga Ectocarpus: NGS-based mapping and positional cloning approaches

    PubMed Central

    Billoud, Bernard; Jouanno, Émilie; Nehr, Zofia; Carton, Baptiste; Rolland, Élodie; Chenivesse, Sabine; Charrier, Bénédicte

    2015-01-01

    Mutagenesis is the only process by which unpredicted biological gene functions can be identified. Although several macroalgal developmental mutants have been generated, their causal mutations were never identified, because the necessary experimental tools were not available at the time. Today, progress in macroalgal genomics and judicious choices of suitable genetic models make the identification of mutated genes possible. This article presents a comparative study of two methods aimed at identifying a genetic locus in the brown alga Ectocarpus siliculosus: positional cloning and Next-Generation Sequencing (NGS)-based mapping. Once the necessary preliminary experimental tools were assembled, we tested both analyses on an Ectocarpus morphogenetic mutant. We show how combining the two methods yields a narrower localization. The advantages and drawbacks of these two approaches, as well as their potential transfer to other macroalgae, are discussed. PMID:25745426

  2. A scalable and accurate method for classifying protein-ligand binding geometries using a MapReduce approach.

    PubMed

    Estrada, T; Zhang, B; Cicotti, P; Armen, R S; Taufer, M

    2012-07-01

    We present a scalable and accurate method for classifying protein-ligand binding geometries in molecular docking. Our method is a three-step process: the first step encodes the geometry of a three-dimensional (3D) ligand conformation into a single 3D point in the space; the second step builds an octree by assigning an octant identifier to every single point in the space under consideration; and the third step performs an octree-based clustering on the reduced conformation space and identifies the most dense octant. We adapt our method for MapReduce and implement it in Hadoop. The load-balancing, fault-tolerance, and scalability in MapReduce allow screening of very large conformation spaces not approachable with traditional clustering methods. We analyze results for docking trials for 23 protein-ligand complexes for HIV protease, 21 protein-ligand complexes for Trypsin, and 12 protein-ligand complexes for P38alpha kinase. We also analyze cross docking trials for 24 ligands, each docking into 24 protein conformations of the HIV protease, and receptor ensemble docking trials for 24 ligands, each docking in a pool of HIV protease receptors. Our method demonstrates significant improvement over energy-only scoring for the accurate identification of native ligand geometries in all these docking assessments. The advantages of our clustering approach make it attractive for complex applications in real-world drug design efforts. We demonstrate that our method is particularly useful for clustering docking results using a minimal ensemble of representative protein conformational states (receptor ensemble docking), which is now a common strategy to address protein flexibility in molecular docking. PMID:22658682
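    The octant-identifier step (the second step of the method) can be sketched as recursive bisection of a bounding cube: at each level a point receives a 3-bit digit from its position relative to the cube's center, and points sharing a digit prefix fall into the same octree node. Depth, bounds, and the toy points below are illustrative, not the paper's conformation data:

```python
# Sketch of octant-identifier encoding and density-based "clustering".
from collections import Counter

def octant_id(point, lo=(0.0, 0.0, 0.0), hi=(1.0, 1.0, 1.0), depth=3):
    """Assign a path of octant digits by recursive bisection of the cube."""
    digits = []
    for _ in range(depth):
        lo, hi = list(lo), list(hi)
        mid = [(a + b) / 2 for a, b in zip(lo, hi)]
        d = 0
        for axis in range(3):
            if point[axis] >= mid[axis]:
                d |= 1 << axis         # set this axis's bit
                lo[axis] = mid[axis]   # descend into the upper half
            else:
                hi[axis] = mid[axis]   # descend into the lower half
        digits.append(d)
    return tuple(digits)

# Count encoded conformations per octant and report the densest one.
points = [(0.1, 0.1, 0.1), (0.12, 0.09, 0.11), (0.9, 0.9, 0.9)]
print(Counter(octant_id(p, depth=2) for p in points).most_common(1))
```

In a MapReduce setting, the map phase would emit (octant id, 1) pairs and the reduce phase would sum them, so the densest octant falls out of a single aggregation pass.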

  3. Operational evapotranspiration mapping using remote sensing and weather datasets: a new parameterization for the SSEB approach

    USGS Publications Warehouse

    Senay, Gabriel B.; Bohms, Stefanie; Singh, Ramesh K.; Gowda, Prasanna H.; Velpuri, Naga Manohar; Alemu, Henok; Verdin, James P.

    2013-01-01

    The increasing availability of multi-scale remotely sensed data and global weather datasets is allowing the estimation of evapotranspiration (ET) at multiple scales. We present a simple but robust method that uses remotely sensed thermal data and model-assimilated weather fields to produce ET for the contiguous United States (CONUS) at monthly and seasonal time scales. The method is based on the Simplified Surface Energy Balance (SSEB) model, which is now parameterized for operational applications and renamed SSEBop. The innovative aspect of SSEBop is that it uses predefined boundary conditions, unique to each pixel, for the "hot" and "cold" reference conditions. The SSEBop model was used to compute ET for 12 years (2000-2011) using the MODIS and Global Data Assimilation System (GDAS) data streams. SSEBop ET results compared reasonably well with monthly eddy covariance ET data, explaining 64% of the observed variability across diverse ecosystems in the CONUS during 2005. Twelve annual ET anomalies (2000-2011) depicted the spatial extent and severity of the commonly known drought years in the CONUS. More research is required to improve the representation of the predefined boundary conditions in complex terrain at small spatial scales. The SSEBop model was found to be a promising approach for conducting water use studies in the CONUS, with a similar opportunity in other parts of the world. The approach can also be applied with other thermal sensors such as Landsat.
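    The predefined boundary conditions translate into a simple per-pixel scaling, sketched here under the assumption that the ET fraction varies linearly between the hot and cold reference temperatures (the paper's parameterization of those boundaries is more involved). The temperatures and reference ET below are illustrative:

```python
# Hedged sketch of the SSEB-style ET-fraction scaling; numbers are invented.
def et_fraction(ts, t_cold, t_hot):
    """ET fraction: 1 at the cold boundary, 0 at the hot boundary."""
    frac = (t_hot - ts) / (t_hot - t_cold)
    return max(0.0, min(1.0, frac))  # clamp outside the boundary conditions

def actual_et(ts, t_cold, t_hot, et_ref):
    """Scale a reference ET by the land-surface-temperature-based fraction."""
    return et_fraction(ts, t_cold, t_hot) * et_ref

print(actual_et(ts=300.0, t_cold=295.0, t_hot=305.0, et_ref=6.0))  # mm/day
```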

  4. Mapping anhedonia-specific dysfunction in a transdiagnostic approach: an ALE meta-analysis

    PubMed Central

    Zhang, Bei; Lin, Pan; Shi, Huqing; Öngür, Dost; Auerbach, Randy P.; Wang, Xiaosheng; Yao, Shuqiao

    2015-01-01

    Anhedonia is a prominent symptom in neuropsychiatric disorders, most markedly in major depressive disorder (MDD) and schizophrenia (SZ). Emerging evidence indicates an overlap in the neural substrates of anhedonia between MDD and SZ, supporting a transdiagnostic approach. Therefore, we used activation likelihood estimation (ALE) meta-analysis of functional magnetic resonance imaging studies in MDD and SZ to examine the neural bases of three subdomains of anhedonia: consummatory anhedonia, anticipatory anhedonia and emotional processing. ALE analyses restricted to MDD or SZ were then used to dissociate specific anhedonia-related neurobiological impairments from potential disease-general impairments. ALE results revealed that consummatory anhedonia was associated with decreased activation in ventral basal ganglia areas, while anticipatory anhedonia was associated with more substrates in frontal-striatal networks except the ventral striatum, including the dorsal anterior cingulate, middle frontal gyrus and medial frontal gyrus. MDD and SZ patients showed similar neurobiological impairments in anticipatory and consummatory anhedonia, but differences in the emotional experience task, which may also involve general affective/mood processing. These results indicate that anhedonia is characterized by alterations in reward processing and relies on frontal-striatal brain circuitry. The transdiagnostic approach is a promising way to reveal the overall neurobiological framework that contributes to anhedonia and could help to improve targeted treatment strategies. PMID:26487590
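    The ALE combination itself can be sketched in one dimension: each reported focus contributes a Gaussian modeled-activation value, and the ALE score at a voxel is the probabilistic union of those contributions, 1 - prod(1 - p_i). The foci and kernel width below are invented, and real ALE uses properly normalized 3D kernels:

```python
# Toy 1D sketch of the ALE score; foci and kernel width are invented.
import math

def modeled_activation(voxel, focus, sigma=2.0):
    """Gaussian modeled-activation contribution of one reported focus."""
    return math.exp(-((voxel - focus) ** 2) / (2 * sigma ** 2))

def ale_score(voxel, foci, sigma=2.0):
    """Probabilistic union of the per-focus contributions."""
    prod = 1.0
    for f in foci:
        prod *= 1.0 - modeled_activation(voxel, f, sigma)
    return 1.0 - prod

foci = [0.0, 1.0, 10.0]  # two nearby foci reinforce each other
print(round(ale_score(0.5, foci), 3), round(ale_score(10.0, foci), 3))
```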

  5. Fixed target combined with spectral mapping: approaching 100% hit rates for serial crystallography.

    PubMed

    Oghbaey, Saeed; Sarracini, Antoine; Ginn, Helen M; Pare-Labrosse, Olivier; Kuo, Anling; Marx, Alexander; Epp, Sascha W; Sherrell, Darren A; Eger, Bryan T; Zhong, Yinpeng; Loch, Rolf; Mariani, Valerio; Alonso-Mori, Roberto; Nelson, Silke; Lemke, Henrik T; Owen, Robin L; Pearson, Arwen R; Stuart, David I; Ernst, Oliver P; Mueller-Werkmeister, Henrike M; Miller, R J Dwayne

    2016-08-01

    The advent of ultrafast highly brilliant coherent X-ray free-electron laser sources has driven the development of novel structure-determination approaches for proteins, and promises visualization of protein dynamics on sub-picosecond timescales with full atomic resolution. Significant efforts are being applied to the development of sample-delivery systems that allow these unique sources to be most efficiently exploited for high-throughput serial femtosecond crystallography. Here, the next iteration of a fixed-target crystallography chip designed for rapid and reliable delivery of up to 11 259 protein crystals with high spatial precision is presented. An experimental scheme for predetermining the positions of crystals in the chip by means of in situ spectroscopy using a fiducial system for rapid, precise alignment and registration of the crystal positions is presented. This delivers unprecedented performance in serial crystallography experiments at room temperature under atmospheric pressure, giving a raw hit rate approaching 100% with an effective indexing rate of approximately 50%, increasing the efficiency of beam usage and allowing the method to be applied to systems where the number of crystals is limited.

  6. Fast approach to evaluate map reconstruction for lesion detection and localization

    SciTech Connect

    Qi, Jinyi; Huesman, Ronald H.

    2004-02-01

    Lesion detection is an important task in emission tomography. Localization ROC (LROC) studies are often used to analyze lesion detection and localization performance. Most researchers rely on Monte Carlo reconstruction samples to obtain LROC curves, which can be very time-consuming for iterative algorithms. In this paper we develop a fast approach to obtain LROC curves that does not require Monte Carlo reconstructions. We use a channelized Hotelling observer model to search for lesions, and the results can be easily extended to other numerical observers. We theoretically analyzed the mean and covariance of the observer output. Assuming the observer outputs are multivariate Gaussian random variables, an LROC curve can be directly generated by integrating the conditional probability density functions. The high-dimensional integrals are calculated using a Monte Carlo method. The proposed approach is very fast because no iterative reconstruction is involved. Computer simulations show that the results of the proposed method match well with those obtained using the traditional LROC analysis.
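    The observer output at the heart of the method can be sketched for a two-channel case: the Hotelling template is the inverse of the channel covariance applied to the mean signal difference, and the test statistic is the template's inner product with the channelized image data. The 2x2 statistics below are invented for illustration:

```python
# Two-channel sketch of a (channelized) Hotelling observer; data invented.
def hotelling_template(mean_diff, cov):
    """Template w = cov^-1 * mean_diff, with an explicit 2x2 inverse."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    return [sum(inv[i][j] * mean_diff[j] for j in range(2)) for i in range(2)]

def observer_output(template, channel_values):
    """Scalar test statistic: inner product of template and channel data."""
    return sum(t * v for t, v in zip(template, channel_values))

w = hotelling_template(mean_diff=[1.0, 0.5], cov=[[2.0, 0.0], [0.0, 1.0]])
print(w, observer_output(w, [1.2, 0.4]))
```

Once the mean and covariance of this scalar output are known analytically, curves can be generated by integrating Gaussian densities rather than by repeated reconstruction, which is the paper's speedup.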

  7. Fixed target combined with spectral mapping: approaching 100% hit rates for serial crystallography.

    PubMed

    Oghbaey, Saeed; Sarracini, Antoine; Ginn, Helen M; Pare-Labrosse, Olivier; Kuo, Anling; Marx, Alexander; Epp, Sascha W; Sherrell, Darren A; Eger, Bryan T; Zhong, Yinpeng; Loch, Rolf; Mariani, Valerio; Alonso-Mori, Roberto; Nelson, Silke; Lemke, Henrik T; Owen, Robin L; Pearson, Arwen R; Stuart, David I; Ernst, Oliver P; Mueller-Werkmeister, Henrike M; Miller, R J Dwayne

    2016-08-01

    The advent of ultrafast highly brilliant coherent X-ray free-electron laser sources has driven the development of novel structure-determination approaches for proteins, and promises visualization of protein dynamics on sub-picosecond timescales with full atomic resolution. Significant efforts are being applied to the development of sample-delivery systems that allow these unique sources to be most efficiently exploited for high-throughput serial femtosecond crystallography. Here, the next iteration of a fixed-target crystallography chip designed for rapid and reliable delivery of up to 11 259 protein crystals with high spatial precision is presented. An experimental scheme for predetermining the positions of crystals in the chip by means of in situ spectroscopy using a fiducial system for rapid, precise alignment and registration of the crystal positions is presented. This delivers unprecedented performance in serial crystallography experiments at room temperature under atmospheric pressure, giving a raw hit rate approaching 100% with an effective indexing rate of approximately 50%, increasing the efficiency of beam usage and allowing the method to be applied to systems where the number of crystals is limited. PMID:27487825

  8. An Unbiased Systems Genetics Approach to Mapping Genetic Loci Modulating Susceptibility to Severe Streptococcal Sepsis

    PubMed Central

    Abdeltawab, Nourtan F.; Aziz, Ramy K.; Kansal, Rita; Rowe, Sarah L.; Su, Yin; Gardner, Lidia; Brannen, Charity; Nooh, Mohammed M.; Attia, Ramy R.; Abdelsamed, Hossam A.; Taylor, William L.; Lu, Lu; Williams, Robert W.; Kotb, Malak

    2008-01-01

    Striking individual differences in the severity of group A streptococcal (GAS) sepsis have been noted, even among patients infected with the same bacterial strain. We previously provided evidence that HLA class II allelic variation contributes significantly to differences in systemic disease severity by modulating host responses to streptococcal superantigens. Inasmuch as the bacteria produce additional virulence factors that participate in the pathogenesis of this complex disease, we sought to identify additional gene networks modulating GAS sepsis. Accordingly, we applied a systems genetics approach using a panel of advanced recombinant inbred mice. By analyzing disease phenotypes in the context of mouse genotypes we identified a highly significant quantitative trait locus (QTL) on Chromosome 2 between 22 and 34 Mb that strongly predicts disease severity, accounting for 25%-30% of the variance. This QTL harbors several polymorphic genes known to regulate immune responses to bacterial infections. We evaluated candidate genes within this QTL using multiple parameters that included linkage, gene ontology, variation in gene expression, cocitation networks, and biological relevance, and identified the interleukin-1 alpha and prostaglandin E synthase pathways as key networks involved in modulating GAS sepsis severity. The association of GAS sepsis with multiple pathways underscores the complexity of traits modulating GAS sepsis and provides a powerful approach for analyzing interactive traits affecting outcomes of other infectious diseases. PMID:18421376

  9. Landau-Yang theorem and decays of a Z' boson into two Z bosons.

    PubMed

    Keung, Wai-Yee; Low, Ian; Shu, Jing

    2008-08-29

    We study the decay of a Z' boson into two Z bosons by extending the Landau-Yang theorem to a parent particle decaying into two Z bosons. For a spin-1 parent the theorem predicts that (1) there are only two possible couplings and (2) the normalized differential cross section depends on kinematics only through a phase shift in the azimuthal angle between the two decay planes of the Z bosons. When the parent is a Z' the two possible couplings are anomaly induced and CP violating, respectively. At the CERN Large Hadron Collider their effects could be disentangled when both Z bosons decay leptonically. PMID:18851602

  10. Fuzzy cognitive map in differential diagnosis of alterations in urinary elimination: A nursing approach

    PubMed Central

    de Moraes Lopes, Maria Helena Baena; Ortega, Neli Regina Siqueira; Silveira, Paulo Sérgio Panse; Massad, Eduardo; Higa, Rosângela; de Fátima Marin, Heimar

    2013-01-01

    Purpose To develop a decision support system to discriminate the diagnoses of alterations in urinary elimination, according to the nursing terminology of NANDA International (NANDA-I). Methods A fuzzy cognitive map (FCM) was structured considering six possible diagnoses: stress urinary incontinence, reflex urinary incontinence, urge urinary incontinence, functional urinary incontinence, total urinary incontinence and urinary retention; and 39 signals associated with them. The model was implemented in Microsoft Visual C++® Edition 2005 and applied to 195 real cases. Its performance was evaluated through an agreement test, comparing its results with the diagnoses determined by three experts (nurses). The sensitivity and specificity of the model were calculated considering the experts' opinion as a gold standard. To compute the kappa values we considered two situations, since more than one diagnosis was possible: overestimation of the accordance, in which a case was considered concordant when at least one diagnosis was equal; and underestimation of the accordance, in which a case was considered discordant when at least one diagnosis was different. Results The overestimation of the accordance showed an excellent agreement (kappa = 0.92, p < 0.0001), and the underestimation provided a moderate agreement (kappa = 0.42, p < 0.0001). In general the FCM model showed high sensitivity and specificity, of 0.95 and 0.92, respectively, but provided a low specificity value in determining the diagnosis of urge urinary incontinence (0.43) and a low sensitivity value for total urinary incontinence (0.42). Conclusions The decision support system developed presented a good performance compared to other types of expert systems for differential diagnosis of alterations in urinary elimination. Since there are few similar studies in the literature, we are convinced of the importance of investing in this kind of modeling, both from the theoretical and from
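    The agreement statistic used here, Cohen's kappa, compares observed agreement with the agreement expected by chance. A two-rater, yes/no sketch with an invented agreement table (not the study's data):

```python
# Sketch of Cohen's kappa for a 2x2 agreement table; counts are invented.
def cohens_kappa(both_yes, a_only, b_only, both_no):
    """Kappa = (observed agreement - chance agreement) / (1 - chance)."""
    n = both_yes + a_only + b_only + both_no
    p_obs = (both_yes + both_no) / n
    p_a = (both_yes + a_only) / n          # rater A says "yes"
    p_b = (both_yes + b_only) / n          # rater B says "yes"
    p_exp = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (p_obs - p_exp) / (1 - p_exp)

print(round(cohens_kappa(40, 5, 5, 50), 3))
```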

  11. Using dose-surface maps to predict radiation-induced rectal bleeding: a neural network approach.

    PubMed

    Buettner, Florian; Gulliford, Sarah L; Webb, Steve; Partridge, Mike

    2009-09-01

    The incidence of late toxicities after radiotherapy can be modelled based on the dose delivered to the organ under consideration. Most predictive models reduce the dose distribution to a set of dose-volume parameters and do not take the spatial distribution of the dose into account. The aim of this study was to develop a classifier predicting radiation-induced rectal bleeding using all available information on the dose to the rectal wall. The dose was projected onto a two-dimensional dose-surface map (DSM) by virtual rectum-unfolding. These DSMs were used as inputs for a classification method based on locally connected neural networks. In contrast to fully connected conventional neural nets, locally connected nets take the topology of the input into account. In order to train the nets, data from 329 patients from the RT01 trial (ISRCTN 47772397) were split into ten roughly equal parts. By using nine of these parts as a training set and the remaining part as an independent test set, a ten-fold cross-validation was performed. Ensemble learning was used and 250 nets were built from randomly selected patients from the training set. Out of these 250 nets, an ensemble of expert nets was chosen. The performances of the full ensemble and of the expert ensemble were quantified by using receiver operating characteristic (ROC) curves. In order to quantify the predictive power of the shape, ensembles of fully connected conventional neural nets based on dose-surface histograms (DSHs) were generated and their performances were quantified. The expert ensembles performed at least as well as the full ensembles. The area under the ROC curve for the DSM-based expert ensemble was 0.64. The area under the ROC curve for the DSH-based expert ensemble equalled 0.59. This difference in performance indicates that not only volumetric, but also morphological aspects of the dose distribution are correlated to rectal bleeding after radiotherapy. Thus, the shape of the dose
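    The ROC areas on which the comparison rests can be computed directly from classifier scores via the rank (Mann-Whitney) formulation, without building the curve explicitly. The bleeding/non-bleeding scores below are invented:

```python
# Sketch of AUC via the Mann-Whitney formulation; scores are invented.
def auc(pos_scores, neg_scores):
    """Fraction of positive/negative pairs ranked correctly (ties = 0.5)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Invented observer outputs for bleeding (positive) and non-bleeding cases.
print(round(auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2]), 3))
```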

  12. A Self-Organizing Maps approach to assess the wave climate of the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Barbariol, Francesco; Marcello Falcieri, Francesco; Scotton, Carlotta; Benetazzo, Alvise; Bergamasco, Andrea; Bergamasco, Filippo; Bonaldo, Davide; Carniel, Sandro; Sclavo, Mauro

    2015-04-01

    The assessment of wave conditions at sea is fruitful for many research fields in marine and atmospheric sciences and for the human activities in the marine environment. To this end, in the last decades the observational network, that mostly relies on buoys, satellites and other probes from fixed platforms, has been integrated with numerical models outputs, which allow to compute the parameters of sea states (e.g. the significant wave height, the mean and peak wave periods, the mean and peak wave directions) over wider regions. Apart from the collection of wave parameters observed at specific sites or modeled on arbitrary domains, the data processing performed to infer the wave climate at those sites is a crucial step in order to provide high quality data and information to the community. In this context, several statistical techniques has been used to model the randomness of wave parameters. While univariate and bivariate probability distribution functions (pdf) are routinely used, multivariate pdfs that model the probability structure of more than two wave parameters are hardly managed. Recently, the Self-Organizing Maps (SOM) technique has been successfully applied to represent the multivariate random wave climate at sites around the Iberian peninsula and the South America continent. Indeed, the visualization properties offered by this technique allow to get the dependencies between the different parameters by visual inspection. In this study, carried out in the frame of the Italian National Flagship Project "RITMARE", we take advantage of the SOM technique to assess the multivariate wave climate over the Adriatic Sea, a semi-enclosed basin in the north-eastern Mediterranean Sea, where winds from North-East (called "Bora") and South-East (called "Sirocco") mainly blow causing sea storms. By means of the SOM techniques we can observe the multivariate character of the typical Bora and Sirocco wave features in the Adriatic Sea. 
To this end, we used both observed and
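A SOM of the kind applied above can be sketched in a few lines; this is a minimal, self-contained illustration, and the grid size, learning schedule and the synthetic "Bora-like"/"Sirocco-like" sea states (Hs, Tm, direction triples) are assumptions for demonstration, not values from the study:

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal online SOM: each sample updates the best-matching unit (BMU)
    and its grid neighbours with a Gaussian neighbourhood function."""
    rng = np.random.default_rng(seed)
    # Grid coordinates of each node, used for the neighbourhood distance.
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)
    w = rng.normal(size=(grid[0] * grid[1], data.shape[1]))
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)            # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))
            w += lr * h[:, None] * (x - w)
    return w

# Synthetic sea states (Hs [m], Tm [s], direction [deg]); purely illustrative.
rng = np.random.default_rng(1)
bora = rng.normal([2.5, 6.0, 60.0], [0.5, 0.5, 10.0], size=(200, 3))
sirocco = rng.normal([1.5, 7.5, 160.0], [0.4, 0.5, 10.0], size=(200, 3))
data = np.vstack([bora, sirocco])
data = (data - data.mean(0)) / data.std(0)  # standardize before training

nodes = train_som(data, grid=(4, 4))
print(nodes.shape)  # (16, 3): each node is a prototype multivariate sea state
```

Inspecting the trained prototypes on the 4x4 grid is what makes the parameter dependencies visible at a glance, which is the visualization property the abstract refers to.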

  13. New Maps for Old: a Topological Approach to "the Faerie Queene" and Shakespeare's History Plays

    NASA Astrophysics Data System (ADS)

    Graney, Kathleen M.

    1994-01-01

    When Nicholas Copernicus published De revolutionibus in 1543, his announced discoveries both displaced humankind from its former place at the center of the universe and enlarged the boundaries of that universe beyond anything that had been imagined before. These discoveries evoked in men and women of the late sixteenth century a new consciousness of both cosmic space and of psychological spaces within themselves, spaces for self-definition made available by the breakdown of the traditional, hierarchical world view. This re-vision of space is evident in almost every aspect of the culture of Elizabethan England, from its science and art to the accounts of New World voyagers. In the works of Edmund Spenser and William Shakespeare, this spatial awareness manifests itself "topologically" --that is, in the relationship between places in their epic and dramatic works that can be identified as "inside" or "outside" and in the kinds of actions associated with each place. In Books One and Two of The Faerie Queene Spenser uses space both topographically and topologically. He maps the journeys of his knights through Fairyland by means of references to allegorical structures and features of the mythical landscape. At the same time, he contrasts inside spaces, where the knights struggle psychologically to define themselves in terms of certain moral virtues, and outside spaces, where that "self" intersects with Spenser's myth of English history. In his earliest chronicle plays of the 1580s and '90s, Shakespeare also depicts English history topographically, as a series of epic confrontations enacted in outside, public spaces bearing familiar place-names. With Richard III, however, he begins to dramatize that history as related to moments of self-discovery achieved by the central character within the privacy of inside spaces and involving some conflict between the values of public and private life.
Unlike Spenser, whose characters ultimately define themselves in terms of some value

  14. Using dose-surface maps to predict radiation-induced rectal bleeding: a neural network approach

    NASA Astrophysics Data System (ADS)

    Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Partridge, Mike

    2009-09-01

    The incidence of late toxicities after radiotherapy can be modelled based on the dose delivered to the organ under consideration. Most predictive models reduce the dose distribution to a set of dose-volume parameters and do not take the spatial distribution of the dose into account. The aim of this study was to develop a classifier predicting radiation-induced rectal bleeding using all available information on the dose to the rectal wall. The dose was projected on a two-dimensional dose-surface map (DSM) by virtual rectum-unfolding. These DSMs were used as inputs for a classification method based on locally connected neural networks. In contrast to fully connected conventional neural nets, locally connected nets take the topology of the input into account. In order to train the nets, data from 329 patients from the RT01 trial (ISRCTN 47772397) were split into ten roughly equal parts. By using nine of these parts as a training set and the remaining part as an independent test set, a ten-fold cross-validation was performed. Ensemble learning was used and 250 nets were built from randomly selected patients from the training set. Out of these 250 nets, an ensemble of expert nets was chosen. The performances of the full ensemble and of the expert ensemble were quantified by using receiver operating characteristic (ROC) curves. In order to quantify the predictive power of the shape, ensembles of fully connected conventional neural nets based on dose-surface histograms (DSHs) were generated and their performances were quantified. The expert ensembles performed at least as well as the full ensembles. The area under the ROC curve for the DSM-based expert ensemble was 0.64; the area under the ROC curve for the DSH-based expert ensemble was 0.59. This difference in performance indicates that not only volumetric, but also morphological aspects of the dose distribution are correlated to rectal bleeding after radiotherapy. Thus, the shape of the dose
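The performance figures above are areas under the ROC curve. A minimal sketch of how AUC can be computed from classifier scores, using the rank-sum (Mann-Whitney U) identity; the scores and labels are made-up toy data, not trial results:

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC as the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative one (ties count half)."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

labels = np.array([1, 1, 1, 0, 0, 0])
scores = np.array([0.9, 0.7, 0.4, 0.8, 0.3, 0.2])
auc = roc_auc(scores, labels)
print(auc)  # 7 of the 9 positive/negative pairs are ranked correctly
```

An AUC of 0.5 corresponds to chance-level discrimination, which puts the quoted 0.64 versus 0.59 comparison in context.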

  15. Mapping mantle-melting anomalies in Baja California: a combined helium-seismology approach

    NASA Astrophysics Data System (ADS)

    Negrete-Aranda, R.; Spelz, R. M.; Hilton, D. R.; Tellez, M.; González-Yahimovich, O.

    2015-12-01

    In active tectonic settings, the presence of helium in aqueous fluids with 3He/4He ratios greater than in-situ production values (~0.05 RA, where RA = the air He ratio of 1.4 x 10^-6) indicates the contribution of mantle-derived volatiles to the total volatile inventory. This is indicative of the presence of mantle-derived melts, which act to transfer volatiles from the solid Earth towards the surface. Thus, He has the potential to map regions of the underlying mantle that are undergoing partial melting - a phenomenon which should also be evident in the seismic record. Reports of high 3He/4He in hot springs in Baja California (BC) have prompted us to initiate a survey of the region to assess relationship(s) between He isotopes and geophysical images of the underlying mantle. Previous studies report 3He/4He ratios of 0.54 RA for submarine hot springs (Punta Banda, 108 °C; Vidal, 1982) and 1.3 RA for spring waters (81 °C) at Bahia Concepcion (Forrest et al., 2005). Our new survey of hot springs in northern BC has revealed that all six localities sampled to date show the presence of mantle He, with the highest ratio being 1.74 RA (21% mantle-derived) at Puertecitos on the Gulf coast. He ratios are generally lower on the Pacific coast, with the minimum mantle He contribution being 5% at Sierra Juárez (0.11 RA). Thus, the preliminary trend is a west-to-east increase in the mantle He signal across the peninsula. The He results presented in this study correlate well with high-resolution Rayleigh wave tomography images by Forsythe et al. (2007). Shear velocity variations in the BC crust and upper mantle have been interpreted as low-velocity anomalies associated with dynamic upwelling and active melt production. More extensive sampling throughout BC coupled with analysis of other geochemical indicators of mantle degassing (e.g. CO2) will allow more detailed characterization of the extent and distribution of mantle melts in the region, facilitating assessment of the region's geothermal
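Mantle-derived percentages of this kind are conventionally obtained from a two-component mixing calculation. The sketch below is a simplification under stated assumptions: the crustal endmember is the ~0.05 RA in-situ production value quoted above, the MORB-like mantle endmember of ~8 RA is a textbook value not given in the abstract, and simple linear mixing of ratios neglects the He-concentration weighting a full treatment requires (the Pacific-coast 5% figure evidently rests on different assumptions):

```python
def mantle_he_fraction(r_sample, r_crust=0.05, r_mantle=8.0):
    """Fraction of He attributable to the mantle endmember, from a simple
    two-component 3He/4He mixing model (all ratios in units of R_A)."""
    return (r_sample - r_crust) / (r_mantle - r_crust)

# Puertecitos hot spring, 1.74 RA: roughly 21% mantle-derived helium,
# consistent with the figure quoted in the abstract.
print(round(100 * mantle_he_fraction(1.74)))  # -> 21
```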

  16. SDG fermion-pair algebraic SO(12) and Sp(10) models and their boson realizations

    SciTech Connect

    Navatil, P.; Geyer, H.B.; Dobes, J.

    1995-11-01

    It is shown how the boson mapping formalism may be applied as a useful many-body tool to solve a fermion problem. This is done in the context of generalized Ginocchio models for which the authors introduce S-, D-, and G-pairs of fermions and subsequently construct the sdg-boson realizations of the generalized Dyson type. The constructed SO(12) and Sp(10) fermion models are solved beyond the explicit symmetry limits. Phase transitions to rotational structures are obtained also in situations where there is no underlying SU(3) symmetry. 34 refs., 5 figs., 2 tabs.

  17. Scale of the equilibration volume in eclogites: insights from a new micro-mapping approach - Example of Atbashi range, Kyrgyzstan

    NASA Astrophysics Data System (ADS)

    Loury, Chloé; Lanari, Pierre; Rolland, Yann; Guillot, Stéphane; Ganino, Clément

    2014-05-01

    Understanding geodynamic processes in subduction zones and mountain belts relies on the reconstruction of precise pressure-temperature paths (P-T paths) from metamorphic rocks. Most P-T paths are obtained using quantitative thermobarometry such as forward thermodynamic models. The question of the scale of the equilibration volume is of prime importance because its chemistry is used as input for the calculation of P-T sections. In chemically homogeneous rocks the bulk composition may be obtained by either ICP-MS or XRF analysis of the whole rock. For chemically heterogeneous rocks, containing different mineral assemblages and/or a high proportion of zoned minerals, the concept of the local effective bulk (LEB) is essential. In the last 10 years, X-ray micro-mapping methods have been developed to this end. Here we show how standardized X-ray maps can be used to estimate the equilibration volume at the pressure peak in an eclogite sample. The study area lies in the Atbashi range, in Kyrgyzstan, along the South Tianshan Carboniferous suture of the Central Asian Orogenic Belt with the Tarim block. We use the micro-mapping approach to unravel the P-T path of a mafic eclogite containing mm-scale garnet porphyroblasts. Quantitative compositional maps of a garnet and its surrounding matrix are obtained from standardized X-ray maps processed with the XMapTools program (Lanari et al., 2014). Using these maps, we measured the LEB corresponding to the different stages of garnet growth. The equilibration volume is then modeled using the local compositions (extrapolated in 3D) combined with Gibbs free energy minimization. Our model suggests that equilibrium conditions are attained for a chemistry composed of 90% garnet and 10% matrix. P-T sections are calculated from the core of the garnet to the rim, taking into account the fractionation at each stage of garnet growth by changing the bulk composition.
We obtained the following P-T path: (1) garnet core crystallization during prograde stage
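The 90%-garnet/10%-matrix equilibration volume described above amounts to a phase-proportion-weighted average of the two local compositions. A sketch of that weighting, using hypothetical oxide analyses (the wt% values below are illustrative, not the sample's measured chemistry):

```python
import numpy as np

# Hypothetical local compositions (wt%): SiO2, Al2O3, FeO, MgO, CaO.
garnet = np.array([38.0, 21.5, 24.0, 8.5, 8.0])
matrix = np.array([52.0, 15.0, 10.0, 9.0, 11.0])

# Local effective bulk for the modelled equilibration volume:
# 90% garnet + 10% matrix, as in the study's best-fit proportions.
leb = 0.9 * garnet + 0.1 * matrix
print(leb)  # this composition would feed the Gibbs minimization step
```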

  18. A Scalable Genome-Editing-Based Approach for Mapping Multiprotein Complexes in Human Cells.

    PubMed

    Dalvai, Mathieu; Loehr, Jeremy; Jacquet, Karine; Huard, Caroline C; Roques, Céline; Herst, Pauline; Côté, Jacques; Doyon, Yannick

    2015-10-20

    Conventional affinity purification followed by mass spectrometry (AP-MS) analysis is a broadly applicable method used to decipher molecular interaction networks and infer protein function. However, it is sensitive to perturbations induced by ectopically overexpressed target proteins and does not reflect multilevel physiological regulation in response to diverse stimuli. Here, we developed an interface between genome editing and proteomics to isolate native protein complexes produced from their natural genomic contexts. We used CRISPR/Cas9 and TAL effector nucleases (TALENs) to tag endogenous genes and purified several DNA repair and chromatin-modifying holoenzymes to near homogeneity. We uncovered subunits and interactions among well-characterized complexes and report the isolation of MCM8/9, highlighting the efficiency and robustness of the approach. These methods improve and simplify both small- and large-scale explorations of protein interactions as well as the study of biochemical activities and structure-function relationships.

  19. Growing Neural Gas approach for obtaining homogeneous maps by restricting the insertion of new nodes.

    PubMed

    Quintana-Pacheco, Yuri; Ruiz-Fernández, Daniel; Magrans-Rico, Agustín

    2014-06-01

    The Growing Neural Gas model is used widely in artificial neural networks. However, its application is limited in some contexts by the proliferation of nodes in dense areas of the input space. In this study, we introduce some modifications to address this problem by imposing three restrictions on the insertion of new nodes. Each restriction aims to maintain the homogeneous values of selected criteria. One criterion is related to the square error of classification and an alternative approach is proposed for avoiding additional computational costs. Three parameters are added that allow the regulation of the restriction criteria. The resulting algorithm allows models to be obtained that suit specific needs by specifying meaningful parameters. PMID:24675222
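One way to picture an error-homogeneity restriction of this kind (a schematic reading only; the paper's actual three criteria and parameters are not reproduced here): growth proceeds while some unit's accumulated square error is disproportionately large, and stops once errors are homogeneous, which curbs node proliferation in dense, already well-covered regions. The threshold `alpha` below is an illustrative stand-in for the paper's regulation parameters:

```python
import numpy as np

def insertion_allowed(errors, q, alpha=1.5):
    """Schematic restriction: permit a new node near unit q only while its
    accumulated square error stands out against the network-wide mean."""
    return errors[q] > alpha * np.mean(errors)

errors = np.array([0.2, 0.3, 2.5, 0.25])
q = int(np.argmax(errors))           # GNG inserts near the highest-error unit
print(insertion_allowed(errors, q))  # heterogeneous errors -> growth proceeds

homogeneous = np.array([0.4, 0.45, 0.5, 0.42])
print(insertion_allowed(homogeneous, int(np.argmax(homogeneous))))  # -> False
```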

  20. Integrating mapping-, assembly- and haplotype-based approaches for calling variants in clinical sequencing applications.

    PubMed

    Rimmer, Andy; Phan, Hang; Mathieson, Iain; Iqbal, Zamin; Twigg, Stephen R F; Wilkie, Andrew O M; McVean, Gil; Lunter, Gerton

    2014-08-01

    High-throughput DNA sequencing technology has transformed genetic research and is starting to make an impact on clinical practice. However, analyzing high-throughput sequencing data remains challenging, particularly in clinical settings where accuracy and turnaround times are critical. We present a new approach to this problem, implemented in a software package called Platypus. Platypus achieves high sensitivity and specificity for SNPs, indels and complex polymorphisms by using local de novo assembly to generate candidate variants, followed by local realignment and probabilistic haplotype estimation. It is an order of magnitude faster than existing tools and generates calls from raw aligned read data without preprocessing. We demonstrate the performance of Platypus in clinically relevant experimental designs by comparing with SAMtools and GATK on whole-genome and exome-capture data, by identifying de novo variation in 15 parent-offspring trios with high sensitivity and specificity, and by estimating human leukocyte antigen genotypes directly from variant calls. PMID:25017105