Science.gov

Sample records for boson mapping approach

  1. A general approach to bosonization

    NASA Astrophysics Data System (ADS)

    Setlur, Girish S.; Meera, V.

    2007-10-01

    We summarize recent developments in the field of higher-dimensional bosonization made by Setlur and collaborators and propose a general formula for the field operator in terms of currents and densities in one dimension using a new ingredient known as a `singular complex number'. Using this formalism, we compute the Green function of the homogeneous electron gas in one spatial dimension with a short-range interaction, leading to the Luttinger liquid, and also with long-range interactions that lead to a Wigner crystal, whose recently computed momentum distribution exhibits essential singularities. We generalize the formalism to finite temperature by combining it with the author's hydrodynamic approach. The one-particle Green function of this system with essential singularities cannot be easily computed using the traditional approach to bosonization, which involves the introduction of momentum cutoffs; hence the more general approach of the present formalism is proposed as a suitable alternative.

  2. Schematic microscopic approach to the description of M1 transitions between mixed-symmetry and fully symmetric collective states in γ-soft nuclei based on RPA-IBM boson mapping

    SciTech Connect

    Jolos, R. V.; Shirikova, N. Yu.; Voronov, V. V.; Pietralla, N.

    2011-07-15

    A schematic microscopic method is developed to calculate the M1 transition probabilities between the mixed-symmetry and the fully symmetric states in γ-soft nuclei. The method is based on the random-phase approximation-interacting boson model (RPA-IBM) boson mapping of the most collective isoscalar boson. All other boson modes with higher excitation energies, including the mixed-symmetry boson, are described in the framework of the RPA. As an example, the M1 transition probabilities are calculated for the 124-134Xe isotopes and compared with the experimental data. The results agree well with the data for the ratio B(M1; 1⁺_ms → 2⁺_2)/B(M1; 1⁺_ms → 0⁺_1). However, the calculated ratio B(M1; 2⁺_ms → 2⁺_1)/B(M1; 1⁺_ms → 0⁺_1) shows a significantly weaker dependence on the mass number than the experimental data.

  3. Similarity-transformed dyson mapping and SDG-interacting boson hamiltonian

    NASA Astrophysics Data System (ADS)

    Navrátil, P.; Dobeš, J.

    1991-10-01

    The sdg-interacting boson hamiltonian is constructed from the fermion shell-model input. The seniority boson mapping as given by the similarity-transformed Dyson boson mapping is used. The s, d, and g collective boson amplitudes are determined consistently from the mapped hamiltonian. The influence of the starting shell-model parameters is discussed. Calculations for the Sm isotopic chain and for the 148Sm, 150Nd, and 196Pt nuclei are presented. Calculated energy levels as well as E2 and E4 properties agree rather well with experimental ones. To obtain such agreement, the input shell-model parameters cannot be fixed at a single constant set for several nuclei but have to be varied somewhat, especially in the deformed region. Possible reasons for this variation are discussed. Effects of explicitly including the g boson are shown.

  4. Bosonization approach for "atomic collapse" in graphene

    NASA Astrophysics Data System (ADS)

    Kagimura, Aya; Onogi, Tetsuya

    2016-02-01

    We study quantum electrodynamics with a 2+1-dimensional massless Dirac fermion around a Coulomb impurity. Around a large charge with atomic number Z > 137, the QED vacuum is expected to collapse due to the strong Coulomb force. While relativistic quantum mechanics fails to make reliable predictions for the fate of the vacuum, heavy-ion collision experiments also do not give a clear understanding of this system. Recently, the "atomic collapse" resonances were observed on graphene, where artificial nuclei can be made. In this paper, we present a nonperturbative study of the vacuum structure of the quasiparticles in graphene with a charge impurity, including many-body effects through the bosonization method.

  5. TFD Approach to Bosonic Strings and Dp-Branes

    NASA Astrophysics Data System (ADS)

    Abdalla, M. C. B.; Gadelha, A. L.; Vancea, I. V.

    In this work we explain the construction of the thermal vacuum for the bosonic string, as well as that of the thermal boundary state interpreted as a Dp-brane at finite temperature. In both cases we calculate the respective entropy using the entropy operator of Thermo Field Dynamics. We show that the contribution of the thermal string entropy is explicitly present in the Dp-brane entropy. Furthermore, we show that the Thermo Field approach is suitable for introducing temperature in boundary states.

  6. Bosonic Dp-branes at finite temperature in TFD approach

    NASA Astrophysics Data System (ADS)

    Abdalla, M. C. B.; Gadelha, A. L.; Vancea, I. V.

    2004-02-01

    A general formulation of Thermo Field Dynamics using transformation generators that form the SU(1, 1) group is presented and applied to the closed bosonic string and to the bosonic Dp-brane with an external field.

  7. Self-consistent Hartree-Fock approach for interacting bosons in optical lattices

    NASA Astrophysics Data System (ADS)

    Lü, Qin-Qin; Patton, Kelly R.; Sheehy, Daniel E.

    2014-12-01

    A theoretical study of interacting bosons in a periodic optical lattice is presented. Instead of the commonly used tight-binding approach (applicable near the Mott-insulating regime of the phase diagram), the present work starts from the exact single-particle states of bosons in a cubic optical lattice, satisfying the Mathieu equation, an approach that can be particularly useful at large boson fillings. The effects of short-range interactions are incorporated using a self-consistent Hartree-Fock approximation, and predictions for experimental observables such as the superfluid transition temperature, condensate fraction, and boson momentum distribution are presented.
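    The exact single-particle starting point described above can be sketched numerically: for a 1D lattice potential of Mathieu form, the Bloch bands follow from a small plane-wave diagonalization. This is a minimal 1D sketch in recoil units; the depth v0 and the basis size are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def lattice_bands(q, v0, n_basis=11):
    """Bloch bands of H = -d^2/dx^2 + (v0/2) cos(2x) (Mathieu form) in
    recoil units, at quasimomentum q, via plane-wave diagonalization."""
    m = np.arange(-(n_basis // 2), n_basis // 2 + 1)
    # kinetic energy (q + 2m)^2 on the diagonal
    h = np.diag((q + 2.0 * m) ** 2)
    # cos(2x) couples plane waves differing by one reciprocal-lattice vector
    off = (v0 / 4.0) * np.ones(n_basis - 1)
    h = h + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(h)

# lowest gap at the zone edge (q = 1); opens as roughly v0/2 for shallow lattices
bands_edge = lattice_bands(q=1.0, v0=4.0)
gap = bands_edge[1] - bands_edge[0]
```

The near-v0/2 gap at the zone edge is a quick first-order-perturbation sanity check on the diagonalization.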

  8. Map Projections: Approaches and Themes

    ERIC Educational Resources Information Center

    Steward, H. J.

    1970-01-01

    Map projections take on new meaning with location systems needed for satellites, other planets and space. A classroom approach deals first with the relationship between the earth and the globe, then with transformations to flat maps. Problems of preserving geometric qualities (distance, angles, and directions) are dealt with in some detail, as are…

  9. Boson mapping techniques applied to constant gauge fields in QCD

    NASA Technical Reports Server (NTRS)

    Hess, Peter Otto; Lopez, J. C.

    1995-01-01

    Pairs of coordinates and derivatives of the constant gluon modes are mapped to new gluon-pair fields and their derivatives. Applying this mapping to the Hamiltonian of constant gluon fields results, for large coupling constants, in an effective Hamiltonian which separates into one part describing a scalar field and another for a field with spin two. The ground state is dominated by pairs of gluons coupled to color and spin zero, with slight admixtures of color-zero and spin-two pairs. As the color group we used SU(2).

  10. Mean-field plus various types of pairing models and an exact boson mapping of the standard pairing model

    SciTech Connect

    Pan Feng; Wang Yin; Guan Xin; Jia Lu; Chen Xiangrong; Draayer, J. P.

    2011-06-28

    Exact solutions of the Nilsson mean field with various pairing interactions are reviewed. Some even-odd mass differences and moments of inertia of low-lying states for rare-earth and actinide nuclei are calculated in the nearest-orbit pairing approximation as well as in the extended pairing model and compared to available experimental data. An exact boson mapping of the standard pairing Hamiltonian is also reported. Under the mapping, fermion pair operators are mapped exactly onto corresponding bosons. The image of the mapping is a Bose-Hubbard model with orbit-dependent hopping.

  11. W± bosons production in the quantum statistical parton distributions approach

    NASA Astrophysics Data System (ADS)

    Bourrely, Claude; Buccella, Franco; Soffer, Jacques

    2013-10-01

    We consider W± gauge boson production in connection with recent results from BNL-RHIC and FNAL-Tevatron and interesting predictions from the statistical parton distributions. These concern relevant aspects of the structure of the nucleon sea and the high-x region of the valence quark distributions. We also give predictions in view of future proton-neutron collision experiments at BNL-RHIC.

  12. Nonunitary and unitary approach to Eigenvalue problem of Boson operators and squeezed coherent states

    NASA Technical Reports Server (NTRS)

    Wunsche, A.

    1993-01-01

    The eigenvalue problem of the operator a + ζa† (where a† is the boson creation operator) is solved for arbitrary complex ζ by applying a nonunitary operator to the vacuum state. This nonunitary approach is compared with the unitary approach, which for |ζ| < 1 leads to squeezed coherent states.

  13. Coherent state approach to the interacting boson model: Test of its validity in the transitional region

    SciTech Connect

    Inci, I.; Alonso, C. E.; Arias, J. M.; Fortunato, L.; Vitturi, A.

    2009-09-15

    The predictive power of the coherent state (CS) approach to the interacting boson model (IBM) is tested far from the IBM dynamical symmetry limits. The transitional region along the γ-unstable path from U(5) to O(6) is considered. The excitation energy of the excited β band and the intraband and interband transitions obtained within the CS approach are compared with the exact results as a function of the boson number N. We find that the CS formalism provides approximations to the exact results that are correct up to order 1/N in the transitional region, except in a narrow region close to the critical point.

  14. Mapping between the classical and pseudoclassical models of a relativistic spinning particle in external bosonic and fermionic fields. I

    NASA Astrophysics Data System (ADS)

    Markov, Yu. A.; Markova, M. A.

    2015-06-01

    The problem of the mapping between two Lagrangian descriptions (using either a commuting c-number spinor ψα or anticommuting pseudovector ξμ and pseudoscalar ξ5 variables) of the spin degrees of freedom of a color-spinning massive particle interacting with a background non-Abelian gauge field is considered. A general analysis of the mapping between a pair of Majorana spinors (ψα, θα) (where θα is an auxiliary anticommuting spinor) and a real anticommuting tensor aggregate (S, Vμ, *Tμν, Aμ, P) is presented, and a complete system of bilinear relations between the tensor quantities is obtained. This analysis is then applied to the above problem of the equivalence of the two ways of describing the spin degrees of freedom of the relativistic particle. The mapping of the kinetic term (iħ/2)(θ̄θ)(ψ̄̇ψ − ψ̄ψ̇), of the term (1/e)(θ̄θ)ẋμ(ψ̄γμψ) that couples the spinning variable ψ to the particle velocity ẋμ, and of the interaction term ħ(θ̄θ)Qa Fμνa(ψ̄σμνψ) with an external non-Abelian gauge field is considered in detail. In the first case a corresponding system of bilinear identities including both the tensor variables and their derivatives (Ṡ, V̇μ, *Ṫμν, Ȧμ, Ṗ) is defined. A detailed analysis of the local bosonic symmetry of the Lagrangian with the commuting spinor ψα is carried out, and the connection of this symmetry with the local SUSY transformation of the Lagrangian containing the anticommuting pseudovector and pseudoscalar variables is considered. An approach to obtaining a supersymmetric Lagrangian in terms of the even ψα and odd θα spinors is offered.

  15. Usage-Oriented Topic Maps Building Approach

    NASA Astrophysics Data System (ADS)

    Ellouze, Nebrasse; Lammari, Nadira; Métais, Elisabeth; Ben Ahmed, Mohamed

    In this paper, we present a collaborative and incremental construction approach for multilingual Topic Maps based on enrichment and merging techniques. In recent years, several Topic Map building approaches have been proposed, endowed with different characteristics. Generally, they are dedicated to particular data types such as text, semi-structured data, relational data, etc. We note also that most of these approaches take monolingual documents as input to build the Topic Map. The problem is that the large majority of resources available today are written in various languages, and these resources could be relevant even to non-native speakers. Thus, our work is driven towards a collaborative and incremental method for Topic Map construction from textual documents available in different languages. To enrich the Topic Map, we take a domain thesaurus as input, and we also propose to explore Topic Map usage, that is, the potential questions users may ask of the source documents.

  16. Variational cluster approach for strongly correlated lattice bosons in the superfluid phase

    SciTech Connect

    Knap, Michael; Arrigoni, Enrico; Linden, Wolfgang von der

    2011-04-01

    We extend the variational cluster approach to deal with strongly correlated lattice bosons in the superfluid phase. To this end, we reformulate the approach within a pseudoparticle formalism, whereby cluster excitations are described by particlelike excitations. The approximation amounts to solving a multicomponent noninteracting bosonic system by means of a multimode Bogoliubov approximation. A source-and-drain term is introduced in order to break U(1) symmetry at the cluster level. We provide an expression for the grand potential, the single-particle normal and anomalous Green's functions, the condensate density, and other static quantities. As a first nontrivial application of the method we choose the two-dimensional Bose-Hubbard model and evaluate results in both the Mott and the superfluid phases. Our results show an excellent agreement with quantum Monte Carlo calculations.
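    The multimode Bogoliubov approximation mentioned above has a minimal single-mode caricature: a para-unitary diagonalization with the bosonic metric. The sketch below illustrates that one step only, under assumed toy parameters; it is not the paper's pseudoparticle or cluster machinery.

```python
import numpy as np

def bogoliubov_mode(eps, delta):
    """Quasiparticle energy of the single-mode quadratic boson Hamiltonian
    H = eps * a^dag a + (delta/2) * (a^dag a^dag + a a), stable for
    eps > |delta|. For bosons one diagonalizes sigma_z @ H_BdG, where
    sigma_z = diag(1, -1) is the bosonic metric (para-unitary step)."""
    h_bdg = np.array([[eps, delta], [delta, eps]], dtype=float)
    sigma_z = np.diag([1.0, -1.0])
    # eigenvalues come in +/-E pairs; the + branch is the mode energy
    ev = np.linalg.eigvals(sigma_z @ h_bdg)
    return float(np.max(ev.real))
```

The positive branch of the spectrum of sigma_z @ H_BdG reproduces the textbook result E = sqrt(eps^2 - delta^2).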

  17. A Tangible Approach to Concept Mapping

    NASA Astrophysics Data System (ADS)

    Tanenbaum, Karen; Antle, Alissa N.

    2009-05-01

    The Tangible Concept Mapping project investigates using a tangible user interface to engage learners in concept map creation. This paper describes a prototype implementation of the system, presents some preliminary analysis of its ease of use and effectiveness, and discusses how elements of tangible interaction support concept mapping by helping users organize and structure their knowledge about a domain. The role of physical engagement and embodiment in supporting the mental activity of creating the concept map is explored as one of the benefits of a tangible approach to learning.

  18. The Higgs boson masses and mixings of the complex MSSM in the Feynman-diagrammatic approach

    NASA Astrophysics Data System (ADS)

    Frank, Meikel; Hahn, Thomas; Heinemeyer, Sven; Hollik, Wolfgang; Rzehak, Heidi; Weiglein, Georg

    2007-02-01

    New results for the complete one-loop contributions to the masses and mixing effects in the Higgs sector are obtained for the MSSM with complex parameters using the Feynman-diagrammatic approach. The full dependence on all relevant complex phases is taken into account, and all the imaginary parts appearing in the calculation are treated in a consistent way. The renormalization is discussed in detail, and a hybrid on-shell/DR-bar scheme is adopted. We also derive the wave function normalization factors needed in processes with external Higgs bosons and discuss effective couplings incorporating leading higher-order effects. The complete one-loop corrections, supplemented by the available two-loop corrections in the Feynman-diagrammatic approach for the MSSM with real parameters and a resummation of the leading (s)bottom corrections for complex parameters, are implemented into the public Fortran code FeynHiggs 2.5. In our numerical analysis the full results for the Higgs-boson masses and couplings are compared with various approximations, and CP-violating effects in the mixing of the heavy Higgs bosons are analyzed in detail. We find sizable deviations in comparison with the approximations often made in the literature.

  19. Reprint of : Scattering theory approach to bosonization of non-equilibrium mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Sukhorukov, Eugene V.

    2016-08-01

    Among the many prominent contributions of Markus Büttiker to mesoscopic physics, the scattering theory approach to electron transport and noise stands out for its elegance, simplicity, universality, and popularity among theorists working in this field. It offers an efficient way to theoretically investigate open electron systems far from equilibrium. However, this method is limited to situations where interactions between electrons can be ignored or treated perturbatively. Fortunately, this is the case in a broad class of metallic systems, which are commonly described by Fermi liquid theory. Yet there exists another broad class of electron systems of reduced dimensionality, the so-called Tomonaga-Luttinger liquids, where interactions are effectively strong and cannot be neglected even at low energies. Nevertheless, strong interactions can be accounted for exactly using the bosonization technique, which exploits the free-bosonic character of collective excitations in these systems. In the present work, we use this fact to develop the scattering theory approach to the bosonization of open quasi-one-dimensional electron systems far from equilibrium.

  20. A new approach to shortest paths on networks based on the quantum bosonic mechanism

    NASA Astrophysics Data System (ADS)

    Jiang, Xin; Wang, Hailong; Tang, Shaoting; Ma, Lili; Zhang, Zhanli; Zheng, Zhiming

    2011-01-01

    This paper presents quantum bosonic shortest path searching (QBSPS), a natural, practical and highly heuristic physical algorithm for reasoning about the recognition of network structure via quantum dynamics. QBSPS is based on an Anderson-like itinerant bosonic system in which a boson's Green function is used as a navigation pointer to accurately approach the terminals. QBSPS is demonstrated by rigorous mathematical and physical proofs and extensive simulations, showing how it can be used as a greedy routing scheme to seek the shortest path between different locations. In methodology, it is an interesting new algorithm rooted in quantum mechanics rather than combinatorics. In practice, for the all-pairs shortest-path problem in a random scale-free network with N vertices, QBSPS runs in O(μ(N) ln ln N) time. In application, we suggest that the corresponding experimental realizations are feasible by considering path searching in quantum optical communication networks; in this situation, the method performs a pure local search on networks without requiring the global structure that is necessary for current graph algorithms.
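    The navigation-pointer idea can be illustrated with an ordinary graph resolvent standing in for the paper's itinerant-boson Green function (a hedged toy sketch, not the QBSPS algorithm itself): with E above the spectral radius, (E - A)^{-1} counts walks weighted by length, so its entries grow as one nears the target, and greedy ascent on them follows shortest paths.

```python
import numpy as np

def greedy_route(adj, source, target):
    """Greedy routing guided by the resolvent G = (E - A)^{-1}. With E
    above the spectral radius of A, G[u, v] = sum_k (A^k)[u, v] / E^(k+1)
    weights walks by length, so G[u, target] grows as u nears target."""
    n = adj.shape[0]
    e = float(np.max(np.abs(np.linalg.eigvals(adj)))) + 1.0
    g = np.linalg.inv(e * np.eye(n) - adj)
    path, current = [source], source
    while current != target and len(path) <= n:
        neighbors = np.flatnonzero(adj[current])
        # local move only: step to the neighbor 'closest' to the target
        current = int(max(neighbors, key=lambda v: g[v, target]))
        path.append(current)
    return path

# 5-node chain 0-1-2-3-4: greedy ascent on G recovers the shortest path
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
```

Note that each step uses only the Green-function column of the target, mirroring the pure local search described in the abstract.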

  1. Non-equilibrium slave bosons approach to quantum pumping in interacting quantum dots

    NASA Astrophysics Data System (ADS)

    Citro, Roberta; Romeo, Francesco

    2016-03-01

    We review a time-dependent slave-boson approach within the non-equilibrium Green's function technique to analyze charge and spin pumping in a strongly interacting quantum dot. We study the pumped current as a function of the pumping phase and of the dot energy level and show that a parasitic current arises, beyond the pure pumping one, as an effect of the dynamical constraints. We finally illustrate an all-electrical means of spin pumping and discuss its relevance for spintronics applications.

  2. Double occupancy in dynamical mean-field theory and the dual boson approach

    NASA Astrophysics Data System (ADS)

    van Loon, Erik G. C. P.; Krien, Friedrich; Hafermann, Hartmut; Stepanov, Evgeny A.; Lichtenstein, Alexander I.; Katsnelson, Mikhail I.

    2016-04-01

    We discuss the calculation of the double occupancy using dynamical mean-field theory in finite dimensions. The double occupancy can be determined from the susceptibility of the auxiliary impurity model or from the lattice susceptibility. The former method typically overestimates, whereas the latter underestimates the double occupancy. We illustrate this for the square-lattice Hubbard model. We propose an approach for which both methods lead to identical results by construction and which resolves this ambiguity. This self-consistent dual boson scheme results in a double occupancy that is numerically close to benchmarks available in the literature.

  3. A Statistical Approach for Ambiguous Sequence Mappings

    Technology Transfer Automated Retrieval System (TEKTRAN)

    When attempting to map RNA sequences to a reference genome, high percentages of short sequence reads are often assigned to multiple genomic locations. One approach to handling these “ambiguous mappings” has been to discard them. This results in a loss of data, which can sometimes be as much as 45% o...

  4. An automated approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Sun, Weihua; Mckeown, Donald M.; Messinger, David W.

    2012-10-01

    Heavy rain from Tropical Storm Lee resulted in a major flood event for the southern tier of New York State in early September 2011 causing evacuation of approximately 20,000 people in and around the city of Binghamton. In support of the New York State Office of Emergency Management, a high resolution multispectral airborne sensor (WASP) developed by RIT was deployed over the flooded area to collect aerial images. One of the key benefits of these images is their provision for flood inundation area mapping. However, these images require a significant amount of storage space and the inundation mapping process is conventionally carried out using manual digitization. In this paper, we design an automated approach for flood inundation mapping from the WASP airborne images. This method employs Spectral Angle Mapper (SAM) for color RGB or multispectral aerial images to extract the flood binary map; then it uses a set of morphological processing and a boundary vectorization technique to convert the binary map into a shapefile. This technique is relatively fast and only requires the operator to select one pixel on the image. The generated shapefile is much smaller than the original image and can be imported to most GIS software packages. This enables critical flood information to be shared with and by disaster response managers very rapidly, even over cellular phone networks.
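    The single-pixel-seeded SAM step described above can be sketched as follows. This is a minimal sketch: the angle threshold and the toy spectra in the test are assumptions, and the paper's pipeline additionally applies morphological cleanup and boundary vectorization to produce the shapefile.

```python
import numpy as np

def sam_flood_mask(image, seed_rc, max_angle=0.10):
    """Spectral Angle Mapper flood map: label a pixel as water when the
    angle between its spectrum and the spectrum of one operator-selected
    seed pixel is below max_angle (radians, a tunable assumption).
    image has shape (rows, cols, bands); seed_rc is the picked pixel."""
    ref = image[seed_rc]                              # reference water spectrum
    dots = np.tensordot(image, ref, axes=([2], [0]))
    norms = np.linalg.norm(image, axis=2) * np.linalg.norm(ref)
    cosines = np.clip(dots / np.maximum(norms, 1e-12), -1.0, 1.0)
    return np.arccos(cosines) <= max_angle            # boolean flood mask
```

Because SAM compares angles rather than magnitudes, it is insensitive to overall illumination scaling; a morphological opening (e.g. scipy.ndimage.binary_opening) would be a natural next step for the cleanup stage, followed by contour extraction for the vector boundary.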

  5. Analytical approach to a bosonic ladder subject to a magnetic field

    NASA Astrophysics Data System (ADS)

    Uchino, Shun

    2016-05-01

    We examine a bosonic two-leg ladder model subject to a magnetic flux and especially focus on a regime where the lower-energy band has two minima. By using a low-energy field theory approach, we study several issues discussed in the system: the existence of local patterns in density and current, chiral-current reversal, and the effect of a nearest-neighbor interaction along the rung direction. In our formalism, the local patterns are interpreted as a result of breaking of discrete symmetry. The chiral-current reversal occurs through a competition between a current component determined at a commensurate vortex density causing an enlargement of the unit cell and another component, which is proportional to the magnetic-field doping from the corresponding commensurate flux. The nearest-neighbor interaction along the rung direction available with the technique on a synthetic dimension is shown to favor a population-imbalance solution in an experimentally relevant regime.

  6. Microscopic calculation of interacting boson model parameters by potential-energy surface mapping

    SciTech Connect

    Bentley, I.; Frauendorf, S.

    2011-06-15

    A coherent state technique is used to generate an interacting boson model (IBM) Hamiltonian energy surface which is adjusted to match a mean-field energy surface. This technique allows the calculation of IBM Hamiltonian parameters, prediction of properties of low-lying collective states, as well as the generation of probability distributions of various shapes in the ground state of transitional nuclei, the last two of which are of astrophysical interest. The results for krypton, molybdenum, palladium, cadmium, gadolinium, dysprosium, and erbium nuclei are compared with experiment.

  7. Constructing Linkage Disequilibrium Map with Iterative Approach

    NASA Astrophysics Data System (ADS)

    Ao, S. I.

    2008-05-01

    With recent advances in genotyping single nucleotide polymorphisms (SNPs) at high density in a candidate region of the human genome, linkage disequilibrium (LD) analysis can offer a much higher resolution of the biological samples than traditional linkage maps. We have formulated this LD mapping problem as a constrained unidimensional scaling problem. Our method, which is based directly on the measurement of LD among SNPs, is non-parametric; it therefore differs from LD maps derived from the given Malecot model. We solve this constrained unidimensional scaling problem with a quadratic programming approach. Unlike the classical metric unidimensional scaling problem, the constrained problem is not an NP-hard combinatorial problem, and the optimal solution is determined using a quadratic programming solver. Nevertheless, because of the large memory requirement at running time, which may cause out-of-memory problems, and the high computational cost of the quadratic programming algorithm, an iterative algorithm has been developed for solving this LD-constrained unidimensional scaling problem.
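    The iterative alternative to the full quadratic program can be sketched as follows. This is a hedged illustration of the idea only, assuming a fixed marker order and an additive target distance matrix; it is not the authors' algorithm. It runs Gauss-Seidel sweeps on the least-squares placement objective with a monotonicity projection in place of explicit QP constraints.

```python
import numpy as np

def ld_scale(d, n_iter=500):
    """Place n ordered markers on a line so that pairwise gaps fit the
    target distance matrix d: minimize sum_{i<j} (x_j - x_i - d_ij)^2
    by coordinate (Gauss-Seidel) sweeps, projecting onto nondecreasing
    positions after each sweep and pinning x[0] = 0."""
    n = d.shape[0]
    x = np.zeros(n)
    for _ in range(n_iter):
        for k in range(n):
            left = x[:k] + d[:k, k]            # positions implied by earlier markers
            right = x[k + 1:] - d[k, k + 1:]   # positions implied by later markers
            x[k] = np.concatenate([left, right]).mean()
        x = np.maximum.accumulate(x)           # ordering (monotonicity) constraint
        x -= x[0]                              # fix the translation freedom
    return x
```

For a consistent additive distance matrix the sweeps recover the generating positions up to the overall shift; for noisy LD-derived distances they return a least-squares compromise.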

  8. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments exhibit a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. Until now, however, only small-scale experiments with a few photons have been performed successfully. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to the deterministic photon sources and the scalability. Therefore, a certification protocol for large-scale boson sampling experiments is needed to complete this exciting story. We propose, in this presentation, a computational protocol towards the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier components can show the fingerprint of large-scale boson sampling. This work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and the Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.

  9. Hydrochromic Approaches to Mapping Human Sweat Pores.

    PubMed

    Park, Dong-Hoon; Park, Bum Jun; Kim, Jong-Man

    2016-06-21

    …colorimetric change near body temperature. This feature enables the use of this technique to generate high-quality images of sweat pores. This Account also focuses on the results of the most recent phase of this investigation, which led to the development of a simple yet efficient and reliable technique for sweat pore mapping. The method utilizes a hydrophilic polymer composite film containing fluorescein, a commercially available dye that undergoes a fluorometric response as a result of water-dependent interconversion between its ring-closed spirolactone (nonfluorescent) and ring-opened fluorone (fluorescent) forms. Surface-modified carbon nanodots (CDs) have also been found to be efficient for hydrochromic mapping of human sweat pores. The results discovered by Lou et al. [Adv. Mater. 2015, 27, 1389] are also included in this Account. Sweat pore maps obtained from fingertips using these materials were found to be useful for fingerprint analysis. In addition, this hydrochromism-based approach is sufficiently sensitive to enable differentiation between sweat-secreting active pores and inactive pores. As a result, the techniques can be applied to clinical diagnosis of malfunctioning sweat pores. The directions that future research in this area will follow are also discussed. PMID:27159417

  10. Self-consistent dual boson approach to single-particle and collective excitations in correlated systems

    NASA Astrophysics Data System (ADS)

    Stepanov, E. A.; van Loon, E. G. C. P.; Katanin, A. A.; Lichtenstein, A. I.; Katsnelson, M. I.; Rubtsov, A. N.

    2016-01-01

    We propose an efficient dual boson scheme, which extends the dynamical mean-field theory paradigm to collective excitations in correlated systems. The theory is fully self-consistent both on the one- and on the two-particle level, thus describing the formation of collective modes as well as the renormalization of electronic and bosonic spectra on equal footing. The method employs an effective impurity model comprising both fermionic and bosonic hybridization functions. Only single- and two-electron Green's functions of the reference problem enter the theory, due to the optimal choice of the self-consistency condition for the effective bosonic bath. We show that the theory is naturally described by a dual Luttinger-Ward functional and obeys the relevant conservation laws.

  11. Learning topological maps: An alternative approach

    SciTech Connect

    Buecken, A.; Thrun, S.

    1996-12-31

    Our goal is autonomous real-time control of a mobile robot. In this paper we show a way to learn topological maps of a large-scale indoor environment autonomously. In the literature there are two paradigms for storing information about the environment of a robot: as a grid-based (geometric) map or as a topological map. While grid-based maps are comparatively easy to learn and maintain, topological maps are quite compact and facilitate fast motion planning.

  12. One-loop perturbative unitarity and the Higgs-boson mass: A new approach

    NASA Astrophysics Data System (ADS)

    Durand, Loyal; Johnson, James M.; Lopez, Jorge L.

    1990-03-01

    We reexamine the unitarity constraints on the high-energy scattering of longitudinal W's and Z's and Higgs bosons in the standard model including one-loop corrections, and make an Argand-diagram analysis of the j=0 scattering amplitudes. We find that the theory is approximately unitary and weakly interacting at O(λ²) for Higgs-boson couplings λ < λ_c = 1.5-2 (equivalent to M_H < 350-400 GeV), but that O(λ³) or higher corrections must be included to restore perturbative unitarity for larger values of λ or M_H.

  13. Boson core compressibility

    NASA Astrophysics Data System (ADS)

    Khorramzadeh, Y.; Lin, Fei; Scarola, V. W.

    2012-04-01

    Strongly interacting atoms trapped in optical lattices can be used to explore phase diagrams of Hubbard models. Spatial inhomogeneity due to trapping typically obscures distinguishing observables. We propose that measures using boson double occupancy avoid trapping effects to reveal two key correlation functions. We define a boson core compressibility and core superfluid stiffness in terms of double occupancy. We use quantum Monte Carlo on the Bose-Hubbard model to empirically show that these quantities intrinsically eliminate edge effects to reveal correlations near the trap center. The boson core compressibility offers a generally applicable tool that can be used to experimentally map out phase transitions between compressible and incompressible states.

  14. Julia-Toulouse approach to (d+1)-dimensional bosonized Schwinger model with an application to large N QCD

    NASA Astrophysics Data System (ADS)

    Guimaraes, M. S.; Rougemont, R.; Wotzasek, C.; Zarro, C. A. D.

    2012-12-01

    The Julia-Toulouse approach for condensation of charges and defects is used to show that the bosonized Schwinger model can be obtained through a condensation of electric charges in 1+1 dimensions. The massive model is derived by taking into account the presence of vortices over the electric condensate, while the massless model is obtained when these vortices are absent. This construction is then straightforwardly generalized for arbitrary d+1 spacetime dimensions. The d=3 case corresponds to the large N chiral dynamics of SU(N) QCD in the limit N→∞.

  15. Quantitative Genetic Interaction Mapping Using the E-MAP Approach

    PubMed Central

    Collins, Sean R.; Roguev, Assen; Krogan, Nevan J.

    2010-01-01

    Genetic interactions represent the degree to which the presence of one mutation modulates the phenotype of a second mutation. In recent years, approaches for measuring genetic interactions systematically and quantitatively have proven to be effective tools for unbiased characterization of gene function and have provided valuable data for analyses of evolution. Here, we present protocols for systematic measurement of genetic interactions with respect to organismal growth rate for two yeast species. PMID:20946812

  16. A Canonical Ensemble Approach to the Fermion/Boson Random Point Processes and Its Applications

    NASA Astrophysics Data System (ADS)

    Tamura, H.; Ito, K. R.

    2006-04-01

    We introduce the boson and the fermion point processes from the elementary quantum mechanical point of view. That is, we consider quantum statistical mechanics of the canonical ensemble for a fixed number of particles which obey Bose-Einstein, Fermi-Dirac statistics, respectively, in a finite volume. Focusing on the distribution of positions of the particles, we have point processes of the fixed number of points in a bounded domain. By taking the thermodynamic limit such that the particle density converges to a finite value, the boson/fermion processes are obtained. This argument is a realization of the equivalence of ensembles, since resulting processes are considered to describe a grand canonical ensemble of points. Random point processes corresponding to para-particles of order two are discussed as an application of the formulation. Statistics of a system of composite particles at zero temperature are also considered as a model of determinantal random point processes.

  17. Energy fluctuation of a finite number of interacting bosons: A correlated many-body approach

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Satadal; Lekala, M. L.; Chakrabarti, Barnali; Rampho, G. J.

    2016-03-01

    We calculate the energy fluctuation of a truly finite number of interacting bosons and study the role of interaction. Although the ideal Bose gas in the thermodynamic limit is an exactly solvable problem and analytic expressions for various fluctuation measures exist, the experimental Bose-Einstein condensate (BEC) is a nontrivial many-body problem. We employ a two-body correlated basis function and utilize the realistic van der Waals interaction. We calculate the energy fluctuation (ΔE²) of the interacting trapped bosons and plot ΔE²/(k_B²T²) as a function of T/T_c. In the classical limit ΔE² is related to the specific heat per particle c_v through the relation ΔE² = k_B T² c_v. We obtain a distinct hump in ΔE²/(k_B²T²) around the condensation point for a three-dimensional harmonically trapped Bose gas when the particle number N ≃ 5000 and above, which corresponds to the second-order phase transition. However, for finite-size interacting bosons (N ≃ a few hundred) the hump is not sharp, and the maximum in ΔE²/(k_B²T²) can be interpreted as a smooth increase in the scaled fluctuation below T_c followed by a decrease above T_c. To justify this interpretation we also calculate c_v, which exhibits the same feature, leading to the conjecture that for finite-sized interacting bosons the phase transition is ruled out.
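The classical-limit relation between energy fluctuation and specific heat quoted above follows from standard canonical-ensemble manipulations. As a reminder (with β = 1/(k_B T), Z the canonical partition function, and ⟨E⟩ and c_v taken per particle):

```latex
\Delta E^{2} \equiv \langle E^{2}\rangle - \langle E\rangle^{2}
  = \frac{\partial^{2} \ln Z}{\partial \beta^{2}}
  = -\frac{\partial \langle E\rangle}{\partial \beta}
  = k_{B} T^{2}\,\frac{\partial \langle E\rangle}{\partial T}
  = k_{B} T^{2}\, c_{v}
```

where the last-but-one step uses ∂/∂β = −k_B T² ∂/∂T.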

  18. An integrative approach to genomic introgression mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Near-Isogenic Lines (NILs) are valuable genetic resources for many crop species, including soybean. The development of new molecular platforms promises to accelerate the mapping of genetic introgressions in these materials. Here we compare some existing and emerging methodologies for genetic intro...

  19. FEASIBILITY AND APPROACH FOR MAPPING RADON POTENTIALS IN FLORIDA

    EPA Science Inventory

    The report gives results of an analysis of the feasibility and approach for developing statewide maps of radon potentials in Florida. The maps would provide a geographic basis for implementing new radon-protective building construction standards to reduce public health risks from ...

  20. ModMAP: A Systematic Approach to Individualized Teacher Education

    ERIC Educational Resources Information Center

    Kranyik, Robert D.; Kielty, Joseph W.

    1974-01-01

    A description of a competency based, individualized graduate degree program, Modular Multiple Alternative Program (ModMAP). The program focuses on the training of elementary teachers, and offers an alternative approach to graduate studies. (Author)

  1. Recent developments in MAP - MODULAR APPROACH to PHYSICS

    NASA Astrophysics Data System (ADS)

    Rae, Jennifer; Austen, Dave; Brouwer, Wytze

    2002-05-01

    We present recent developments in MAP - MODULAR APPROACH to PHYSICS - JAVA enhanced modules to be used as aids in teaching the first 3 terms of university physics. The MAP project is very comprehensive and consists of a modular approach to physics that utilizes JAVA applets, FLASH animations and HTML based tutorials. The overall instructional philosophy of MAP is constructivist and the project emphasizes active learner participation. In this talk we will provide a quick overview of the project and the results of recent pilot testing at several Canadian universities. It will also include a discussion of the VIDEO LAB aspect of MAP. This is a component that is integrated into MAP and permits students to capture and evaluate otherwise difficult to study phenomena on video.

  2. Improving long time behavior of Poisson bracket mapping equation: A non-Hamiltonian approach

    SciTech Connect

    Kim, Hyun Woo; Rhee, Young Min

    2014-05-14

    Understanding nonadiabatic dynamics in complex systems is a challenging subject. A series of semiclassical approaches have been proposed to tackle the problem in various settings. The Poisson bracket mapping equation (PBME) utilizes a partial Wigner transform and a mapping representation for its formulation, and has been developed to describe nonadiabatic processes in an efficient manner. Operationally, it is expressed as a set of Hamilton's equations of motion, similar to more conventional classical molecular dynamics. However, this original Hamiltonian PBME sometimes suffers from a large deviation in accuracy, especially in the long-time limit. Here, we propose a non-Hamiltonian variant of PBME to improve its behavior in that limit. As a benchmark, we simulate spin-boson and photosynthetic model systems and find that it consistently outperforms the original PBME and its Ehrenfest-style variant. We explain the source of this improvement by decomposing the components of the mapping Hamiltonian and by assessing the energy flow between the system and the bath. We discuss the strengths and weaknesses of our scheme and offer future prospects.

  3. Multilocal bosonization

    NASA Astrophysics Data System (ADS)

    Anguelova, Iana I.

    2015-12-01

    We present a bilocal isomorphism between the algebra generated by a single real twisted boson field and the algebra of the boson βγ ghost system. As a consequence of this twisted vertex algebra isomorphism, we show that each of these two algebras possesses both untwisted and twisted Heisenberg bosonic currents, as well as three separate families of Virasoro fields. We show that this bilocal isomorphism generalizes to an isomorphism between the algebra generated by the twisted boson field with 2n points of localization and the algebra of the 2n symplectic bosons.

  4. Combinatorial approach to generalized Bell and Stirling numbers and boson normal ordering problem

    SciTech Connect

    Mendez, M.A.; Blasiak, P.; Penson, K.A.

    2005-08-01

    We consider the numbers arising in the problem of normal ordering of expressions in the boson creation a† and annihilation a operators ([a, a†] = 1). We treat a general form of a boson string (a†)^{r_n} a^{s_n} ⋯ (a†)^{r_2} a^{s_2} (a†)^{r_1} a^{s_1}, which is shown to be associated with generalizations of Stirling and Bell numbers. The recurrence relations and closed-form expressions (Dobinski-type formulas) are obtained for these quantities by both algebraic and combinatorial methods. By extensive use of methods of combinatorial analysis we prove the equivalence of the aforementioned problem to the enumeration of special families of graphs. This link provides a combinatorial interpretation of the numbers arising in this normal ordering problem.
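The simplest instance of this correspondence is the string (a†a)^n, whose normally ordered form is Σ_k S(n,k) (a†)^k a^k, with S(n,k) the ordinary Stirling numbers of the second kind and the Bell numbers as their row sums. A minimal sketch of the standard recurrence (illustrative only; not the generalized numbers of the paper):

```python
def stirling2(n, k):
    # Stirling numbers of the second kind via the classical recurrence
    # S(n, k) = k*S(n-1, k) + S(n-1, k-1), with S(0, 0) = 1.
    if n == 0 and k == 0:
        return 1
    if n == 0 or k == 0:
        return 0
    return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

def bell(n):
    # Bell number: number of set partitions, B(n) = sum_k S(n, k)
    return sum(stirling2(n, k) for k in range(n + 1))

# Normal ordering of (a†a)^4 gives sum_k S(4, k) (a†)^k a^k:
print([stirling2(4, k) for k in range(5)])  # → [0, 1, 7, 6, 1]
print(bell(4))                              # → 15
```

The generalized Stirling numbers for an arbitrary boson string satisfy analogous (but more involved) recurrences treated in the record above.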

  5. Two-fluid behavior of the Kondo lattice in the 1/N slave boson approach

    NASA Astrophysics Data System (ADS)

    Barzykin, Victor

    2006-03-01

    It has been recently shown by Nakatsuji, Pines, and Fisk [S. Nakatsuji, D. Pines, and Z. Fisk, Phys. Rev. Lett. 92, 016401 (2004)] from the phenomenological analysis of experiments in Ce1-xLaxCoIn5 and CeIrIn5 that thermodynamic and transport properties of Kondo lattices below coherence temperature can be very successfully described in terms of a two-fluid model, with Kondo impurity and heavy electron Fermi liquid contributions. We analyze thermodynamic properties of Kondo lattices using 1/N slave boson treatment of the periodic Anderson model and show that these two contributions indeed arise below the coherence temperature. We find that the Kondo impurity contribution to thermodynamics corresponds to thermal excitations into the flat portion of the energy spectrum.

  6. Exact results in a slave boson saddle point approach for a strongly correlated electron model

    SciTech Connect

    Fresard, Raymond; Kopp, Thilo

    2008-08-15

    We revisit the Kotliar-Ruckenstein (KR) slave boson saddle point evaluation for a two-site correlated electron model. As the model can be solved analytically, it is possible to compare the KR saddle point results with the exact many-particle levels. The considered two-site cluster mimics an infinite-U single-impurity Anderson model with a nearest-neighbor Coulomb interaction: one site is strongly correlated with an infinite local Coulomb repulsion, which hybridizes with the second site, on which the local Coulomb repulsion vanishes. Making use of the flexibility of the representation, we introduce appropriate weight factors in the KR saddle point scheme. Ground-state and all excitation levels agree with the exact diagonalization results. Thermodynamics and correlation functions may be recovered in a suitably renormalized saddle point evaluation.

  7. An Incremental Map Building Approach via Static Stixel Integration

    NASA Astrophysics Data System (ADS)

    Muffert, M.; Anzt, S.; Franke, U.

    2013-10-01

    This paper presents a stereo-vision-based incremental mapping approach for urban regions. As input, we use the 3D representation called the multi-layered Stixel World, which is computed from dense disparity images. Researchers of driver assistance systems increasingly rely on efficient and compact 3D representations like the Stixel World. The developed mapping approach takes into account the motion state of obstacles, as well as free-space information obtained from the Stixel World. The presented work is based on the well-known occupancy grid mapping technique and is formulated with evidential theory. A detailed sensor model is described which is used to determine whether a grid cell is occupied, free or in an unknown state. The map update is solved in a time-recursive manner by using Dempster's Rule of Combination. 3D results of complex inner-city regions are shown and compared with Google Earth images.
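The evidential update described above can be sketched for a single grid cell with Dempster's rule of combination. A minimal Python illustration over the frame {free, occupied}; the mass values and the two-letter encoding of the full frame are assumptions for the sketch, not the paper's sensor model:

```python
def dempster(m1, m2):
    # Dempster's rule of combination for two basic belief assignments.
    # Focal elements are strings over {'F', 'O'}; 'FO' denotes the
    # full frame, i.e. the "unknown" state of a grid cell.
    combined = {}
    conflict = 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            inter = ''.join(sorted(set(a) & set(b)))
            if inter:
                combined[inter] = combined.get(inter, 0.0) + pa * pb
            else:
                conflict += pa * pb          # contradictory evidence
    # Renormalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

cell_prior = {'F': 0.1, 'O': 0.2, 'FO': 0.7}   # mostly-unknown cell
measurement = {'O': 0.6, 'FO': 0.4}            # sensor says "occupied"
print(dempster(cell_prior, measurement))
```

Repeating this update shot by shot gives the time-recursive map update mentioned in the abstract.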

  8. Look before you leap: a new approach to mapping QTL.

    PubMed

    Huang, B Emma; George, Andrew W

    2009-09-01

    In this paper, we present an innovative and powerful approach for mapping quantitative trait loci (QTL) in experimental populations. This deviates from the traditional approach of (composite) interval mapping which uses a QTL profile to simultaneously determine the number and location of QTL. Instead, we look before we leap by employing separate detection and localization stages. In the detection stage, we use an iterative variable selection process coupled with permutation to identify the number and synteny of QTL. In the localization stage, we position the detected QTL through a series of one-dimensional interval mapping scans. Results from a detailed simulation study and real analysis of wheat data are presented. We achieve impressive increases in the power of QTL detection compared to composite interval mapping. We also accurately estimate the size and position of QTL. An R library, DLMap, implements the methods described here and is freely available from CRAN ( http://cran.r-project.org/ ). PMID:19585099

  9. A Nonparametric Approach for Mapping Quantitative Trait Loci

    PubMed Central

    Kruglyak, L.; Lander, E. S.

    1995-01-01

    Genetic mapping of quantitative trait loci (QTLs) is typically performed using a parametric approach, based on the assumption that the phenotype follows a normal distribution. Many traits of interest, however, are not normally distributed. In this paper, we present a nonparametric approach to QTL mapping applicable to any phenotypic distribution. The method is based on a statistic Z(w), which generalizes the nonparametric Wilcoxon rank-sum test to the situation of a whole-genome search by interval mapping. We determine the appropriate significance level for the statistic Z(w) by showing that its asymptotic null distribution follows an Ornstein-Uhlenbeck process. These results provide a robust, distribution-free method for mapping QTLs. PMID:7768449
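The Wilcoxon rank-sum statistic underlying Z(w) can be sketched as follows. This is a plain normal approximation without tie correction; the phenotype values for the two marker genotype groups are hypothetical, and this is not the authors' genome-scan implementation:

```python
def rank_sum_z(x, y):
    # Wilcoxon rank-sum statistic for two samples, standardized with
    # the usual normal approximation (no tie correction, for clarity).
    pooled = sorted((v, i) for i, v in enumerate(x + y))
    ranks = {idx: r + 1 for r, (_, idx) in enumerate(pooled)}
    n, m = len(x), len(y)
    w = sum(ranks[i] for i in range(n))      # rank sum of sample x
    mean = n * (n + m + 1) / 2.0
    var = n * m * (n + m + 1) / 12.0
    return (w - mean) / var ** 0.5

# Hypothetical phenotype values for two marker genotype groups
print(rank_sum_z([2.1, 2.4, 3.0, 3.2], [3.5, 3.9, 4.1]))
```

In the interval-mapping setting this statistic is computed at each genome position, which is what gives rise to the Ornstein-Uhlenbeck null process mentioned above.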

  10. Concept maps and nursing theory: a pedagogical approach.

    PubMed

    Hunter Revell, Susan M

    2012-01-01

    Faculty seek to teach nursing students how to link clinical and theoretical knowledge with the intent of improving patient outcomes. The author discusses an innovative 9-week concept mapping activity as a pedagogical approach to teach nursing theory in a graduate theory course. Weekly concept map building increased student engagement and fostered theoretical thinking. Unexpectedly, this activity also benefited students through group work and its ability to enhance theory-practice knowledge. PMID:22513774

  11. Tank Update System: A novel asset mapping approach for verifying and updating lakes using Google Maps

    NASA Astrophysics Data System (ADS)

    Reddy Pulsani, Bhaskar

    2016-06-01

    Mission Kakatiya is one of the prestigious programs of the Telangana state government, under which the restoration of tanks across ten districts is being implemented. As part of the program, the government plans to restore about 9,000 lakes. Therefore, to compile a comprehensive list of lakes existing in Telangana state, the Samagra Tank Survey was carried out. The data collected in this survey covered about 45,000 tanks. Since the data were collected in a non-standard format using Excel, a web interface was created to fill the gaps and standardise the data. A new approach for spatially identifying the lakes through Google Maps was successfully implemented by developing a web interface. This approach is less common since it applies asset mapping to the lakes of Telangana state and shows the advantages of using online mapping applications such as Google Maps for identifying and cross-checking lakes already present on them.

  12. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy. PMID:26601164
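The computational hardness claimed above rests on the fact that boson sampling output probabilities are proportional to squared permanents of submatrices of the interferometer unitary, and the permanent is #P-hard. A small classical reference implementation via Ryser's formula (illustrative only; its O(2^n · n) cost is exactly why large instances are classically intractable):

```python
def permanent(m):
    # Permanent of a square matrix by Ryser's inclusion-exclusion
    # formula: perm(A) = (-1)^n * sum over nonempty column subsets S
    # of (-1)^|S| * prod_i sum_{j in S} a_ij.
    n = len(m)
    total = 0.0
    for mask in range(1, 1 << n):             # nonempty subsets of columns
        row_sums = [sum(m[i][j] for j in range(n) if mask >> j & 1)
                    for i in range(n)]
        prod = 1.0
        for s in row_sums:
            prod *= s
        bits = bin(mask).count('1')
        total += (-1) ** bits * prod
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))  # → 10.0  (= 1*4 + 2*3)
```

For an interferometer unitary U, the probability of a given output configuration is proportional to |permanent(U_S)|² for the corresponding submatrix U_S.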

  13. Quasiparticle random-phase approximation and {beta}-decay physics: Higher-order approximations in a boson formalism

    SciTech Connect

    Sambataro, M.; Suhonen, J.

    1997-08-01

    The quasiparticle random-phase approximation (QRPA) is reviewed and higher-order approximations are discussed with reference to β-decay physics. The approach is fully developed in a boson formalism. Working within a schematic model, we first illustrate a fermion-boson mapping procedure and apply it to construct boson images of the fermion Hamiltonian at different levels of approximation. The quality of these images is tested through a comparison between approximate and exact spectra. Standard QRPA equations are derived in correspondence with the quasi-boson limit of the first-order boson Hamiltonian. The use of higher-order Hamiltonians is seen to improve considerably the stability of the approximate solutions. The mapping procedure is also applied to Fermi β operators: exact and approximate transition amplitudes are discussed together with the Ikeda sum rule. The range of applicability of the QRPA formalism is analyzed. © 1997 The American Physical Society

  14. Dark matter coupling to electroweak gauge and Higgs bosons: An effective field theory approach

    NASA Astrophysics Data System (ADS)

    Chen, Jing-Yuan; Kolb, Edward W.; Wang, Lian-Tao

    2013-12-01

    If dark matter is a new species of particle produced in the early universe as a cold thermal relic (a weakly interacting massive particle, or WIMP), its present abundance, its scattering with matter in direct-detection experiments, its present-day annihilation signature in indirect-detection experiments, and its production and detection at colliders depend crucially on the WIMP coupling to standard-model (SM) particles. It is usually assumed that the WIMP couples to the SM sector through its interactions with quarks and leptons. In this paper we explore the possibility that the WIMP couples to the SM sector via electroweak gauge and Higgs bosons. In the absence of an ultraviolet-complete particle-physics model, we employ effective field theory to describe the WIMP-SM coupling. We consider both scalars and Dirac fermions as possible dark-matter candidates. Starting with an exhaustive list of operators up to dimension 8, we present a detailed calculation of dark-matter annihilations to all possible final states, including γγ, γZ, γh, ZZ, Zh, W⁺W⁻, hh, and f f̄, and demonstrate the correlations among them. We compute the mass scale of the effective field theory necessary to obtain the correct dark-matter mass density, as well as the resulting photon line signals.

  15. Dynamics of hadronic molecule in one-boson exchange approach and possible heavy flavor molecules

    SciTech Connect

    Ding Guijun; Liu Jiafeng; Yan Mulin

    2009-03-01

    We extend the one-pion-exchange model at the quark level to include the short-distance contributions coming from η, σ, ρ and ω exchange. This formalism is applied to discuss the possible molecular states of DD*/DD*, BB*/BB*, DD*, BB*, and the pseudoscalar-vector systems with C=B=1 and C=-B=1, respectively. The "δ function" term contribution and the S-D mixing effects have been taken into account. We find the conclusions reached after including the heavier meson exchanges are qualitatively the same as those in the one-pion-exchange model. The previous suggestion that a 1^{++} BB*/BB* molecule should exist is confirmed in the one-boson-exchange model, whereas a DD* bound state should not exist. The DD*/DD* system can accommodate a 1^{++} molecule close to the threshold; the mixing between the molecule and the conventional charmonium has to be considered to identify this state with X(3872). For the BB* system and the pseudoscalar-vector systems with C=B=1 and C=-B=1, near-threshold molecular states may exist. These bound states should be rather narrow; isospin is violated and the I=0 component is dominant. Experimental search channels for these states are suggested.

  16. Hyperspherical approach to a three-boson problem in two dimensions with a magnetic field

    NASA Astrophysics Data System (ADS)

    Rittenhouse, Seth T.; Wray, Andrew; Johnson, B. L.

    2016-01-01

    We examine a system of three bosons confined to two dimensions in the presence of a perpendicular magnetic field within the framework of the adiabatic hyperspherical method. For the case of zero-range, regularized pseudopotential interactions, we find that the system is nearly separable in hyperspherical coordinates and that, away from a set of narrow avoided crossings, the full energy eigenspectrum as a function of the two-dimensional (2D) s-wave scattering length is well described by ignoring coupling between adiabatic hyperradial potentials. In the case of weak attractive or repulsive interactions, we find the lowest three-body energy states exhibit even-odd parity oscillations as a function of total internal 2D angular momentum, and that for weak repulsive interactions the universal lowest-energy interacting state has an internal angular momentum of M = 3. With the inclusion of repulsive higher angular momentum we surmise that the origin of a set of "magic number" states (states with anomalously low energy) might emerge as the result of a combination of even-odd parity oscillations and the pattern of degeneracy in the noninteracting lowest Landau level states.

  17. Technology Mapping: An Approach for Developing Technological Pedagogical Content Knowledge

    ERIC Educational Resources Information Center

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    Technology mapping[TM] is proposed as an approach for developing technological pedagogical content knowledge (TPCK). The study discusses in detail instructional design guidelines in relation to the enactment of TM, and reports on empirical findings from a study with 72 pre-service primary teachers within the context of teaching them how to teach…

  18. Mapping between the classical and pseudoclassical models of a relativistic spinning particle in external bosonic and fermionic fields. II

    NASA Astrophysics Data System (ADS)

    Markov, Yu. A.; Markova, M. A.

    2016-06-01

    The exact solution of a system of bilinear identities derived in the first part of our work [1] for the case of a real Grassmann-odd tensor aggregate of the type (S, Vμ, *Tμν, Aμ, P) is obtained. The consistency of the solution with a corresponding system of bilinear identities including both the tensor variables and their derivatives (Ṡ, V̇μ, *Ṫμν, Ȧμ, Ṗ) is considered. An alternative approach to solving the algebraic system, based on introducing complex tensor quantities, is discussed. This solution is used in constructing the mapping of the interaction terms of a spinning particle with a background (Majorana) fermion field ΨMαi(x). A way of extending the obtained results to the case of the Dirac spinors (ψDα, θDα) and a background Dirac field ΨDαi(x) is suggested. It is shown that for the construction of a one-to-one correspondence between the most general spinors and the tensor variables, we need a four-fold increase of the number of the tensor ones. A connection with the higher-order derivative Lagrangians for a point particle, and in particular with the Lagrangian suggested by A.M. Polyakov, is proposed.

  19. One-Boson Approach to Dilepton Production in Nucleon-Nucleon Collisions.

    NASA Astrophysics Data System (ADS)

    Haglin, Kevin Lee

    1990-01-01

    We calculate energy-dependent nucleon-nucleon total elastic cross sections and invariant-mass-dependent electron-positron pair production differential cross sections for the processes pp → pp, np → np and pp → pp e⁺e⁻, pn → pn e⁺e⁻ at laboratory kinetic energies in the 1-5 GeV range. These calculations are based on relativistic quantum field theory in the one-boson-exchange (π, ρ, ω, σ, δ, η) approximation to the nucleon-nucleon scattering problem. There are several independent Feynman diagrams for each process (twenty-five for np → np e⁺e⁻ and forty-eight for pp → pp e⁺e⁻), which, for evaluation, require taking the trace of as many as ten gamma matrices and evaluating an angular integral of a quotient of polynomial functions of initial and final energies, particle masses, coupling constants and so on. These mathematical operations are carried out with the aid of the following algebraic manipulators: for the trace operations we use REDUCE 3.3 on the VAX at the ACS facility, and for testing the angular integration algorithms we use MAPLE on the Cray-2 at the Minnesota Supercomputer Institute. Finally, we use Cray-2 Fortran for the resulting numerical substitutions. Gauge invariance is strictly observed while including strong and electromagnetic form factors. The numerical results of these calculations are compared with existing data from the Particle Data Group Booklet and with recently released data from the Dilepton Spectrometer (DLS) at the Bevalac for protons on beryllium. For the latter comparison, the spectrometer's finite acceptance function is introduced before a rapidity and transverse momentum integration.

  20. An improved probability mapping approach to assess genome mosaicism

    PubMed Central

    Zhaxybayeva, Olga; Gogarten, J Peter

    2003-01-01

    Background Maximum likelihood and posterior probability mapping are useful visualization techniques that are used to ascertain the mosaic nature of prokaryotic genomes. However, posterior probabilities, especially when calculated for four-taxon cases, tend to overestimate the support for tree topologies. Furthermore, because of poor taxon sampling four-taxon analyses suffer from sensitivity to the long branch attraction artifact. Here we extend the probability mapping approach by improving taxon sampling of the analyzed datasets, and by using bootstrap support values, a more conservative tool to assess reliability. Results Quartets of orthologous proteins were complemented with homologs from selected reference genomes. The mapping of bootstrap support values from these extended datasets gives results similar to the original maximum likelihood and posterior probability mapping. The more conservative nature of the plotted support values allows to focus further analyses on those protein families that strongly disagree with the majority or plurality of genes present in the analyzed genomes. Conclusion Posterior probability is a non-conservative measure for support, and posterior probability mapping only provides a quick estimation of phylogenetic information content of four genomes. This approach can be utilized as a pre-screen to select genes that might have been horizontally transferred. Better taxon sampling combined with subtree analyses prevents the inconsistencies associated with four-taxon analyses, but retains the power of visual representation. Nevertheless, a case-by-case inspection of individual multi-taxon phylogenies remains necessary to differentiate unrecognized paralogy and shared phylogenetic reconstruction artifacts from horizontal gene transfer events. PMID:12974984

  1. An automated approach to mapping corn from Landsat imagery

    USGS Publications Warehouse

    Maxwell, S.K.; Nuckols, J.R.; Ward, M.H.; Hoffer, R.M.

    2004-01-01

    Most land cover maps generated from Landsat imagery involve classification of a wide variety of land cover types, whereas some studies may only need spatial information on a single cover type. For example, we required a map of corn in order to estimate exposure to agricultural chemicals for an environmental epidemiology study. Traditional classification techniques, which require the collection and processing of costly ground reference data, were not feasible for our application because of the large number of images to be analyzed. We present a new method that has the potential to automate the classification of corn from Landsat satellite imagery, resulting in a more timely product for applications covering large geographical regions. Our approach uses readily available agricultural areal estimates to enable automation of the classification process, resulting in a map identifying land cover as 'highly likely corn,' 'likely corn' or 'unlikely corn.' To demonstrate the feasibility of this approach, we produced a map consisting of the three corn likelihood classes using a Landsat image in south central Nebraska. Overall classification accuracy of the map was 92.2% when compared to ground reference data. © 2003 Elsevier B.V. All rights reserved.

  2. Comparison of mapping approaches of design annual maximum daily precipitation

    NASA Astrophysics Data System (ADS)

    Szolgay, J.; Parajka, J.; Kohnová, S.; Hlavčová, K.

    2009-05-01

    In this study, 2-year and 100-year annual maximum daily precipitation were mapped for rainfall-runoff studies and flood hazard estimation. Daily precipitation measurements at 23 climate stations from 1961-2000 were used in the upper Hron basin in central Slovakia. The choice of data preprocessing and interpolation methods was guided by their practical applicability and acceptance in the engineering hydrologic community. The main objective was to discuss the quality and properties of maps of design precipitation with a given return period with respect to the expectations of the end user. Four approaches to the preprocessing of annual maximum 24-hour precipitation data were used, and three interpolation methods were employed. The first approach is the direct mapping of at-site estimates of distribution function quantiles; the second is the direct mapping of local estimates of the three parameters of the GEV distribution. In the third, the daily precipitation totals were interpolated onto a regular grid, and the time series of the maximum daily precipitation totals in each grid point of the selected region were then statistically analysed. In the fourth, the spatial distribution of the design precipitation was modeled by quantiles predicted by regional precipitation frequency analysis using the Hosking and Wallis procedure. The three interpolation methods used were inverse distance weighting, nearest neighbor and kriging. Visual inspection and jackknife cross-validation were used to compare the combinations of approaches.
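Of the three interpolation methods compared, inverse distance weighting is the simplest to sketch. A minimal Python illustration; the station coordinates and 100-year daily maxima below are made-up values, not the study's data:

```python
def idw(points, values, target, power=2.0):
    # Inverse distance weighting: the estimate at `target` is a
    # weighted mean of station values with weights ~ 1/d^power.
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v                  # exact hit on a station
        w = d2 ** (-power / 2.0)      # = d^(-power)
        num += w * v
        den += w
    return num / den

stations = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
p100 = [80.0, 95.0, 60.0]   # hypothetical 100-year daily maxima (mm)
print(idw(stations, p100, (0.25, 0.25)))  # ≈ 79.29 mm
```

Nearest-neighbor assignment corresponds to the power → ∞ limit of the same weighting; kriging instead derives the weights from a fitted variogram.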

  3. Noise pollution mapping approach and accuracy on landscape scales.

    PubMed

    Iglesias Merchan, Carlos; Diaz-Balteiro, Luis

    2013-04-01

    Noise mapping allows the characterization of environmental variables, such as noise pollution or soundscape, depending on the task. Strategic noise mapping (as per Directive 2002/49/EC, 2002) is a tool intended for the assessment of noise pollution at the European level every five years. These maps are based on common methods and procedures intended for human exposure assessment in the European Union that could also be adapted for assessing environmental noise pollution in natural parks. However, given the size of such areas, there could be an alternative approach to soundscape characterization rather than using human noise exposure procedures. It is possible to optimize the size of the mapping grid used for such work by taking into account the attributes of the area to be studied and the desired outcome. This would then optimize the mapping time and the cost. This type of optimization is important in noise assessment as well as in the study of other environmental variables. This study compares 15 models, using different grid sizes, to assess the accuracy of road traffic noise mapping at a landscape scale, with respect to noise and landscape indicators. In a study area located in the Manzanares High River Basin Regional Park in Spain, different accuracy levels (Kappa index values from 0.725 to 0.987) were obtained depending on the terrain and noise source properties. The time taken for the calculations and the noise mapping accuracy results reveal the potential for setting the map resolution in line with decision-makers' criteria and budget considerations. PMID:23416205
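
    The Kappa index used above to grade map accuracy compares observed agreement against the agreement expected by chance. A minimal sketch, with a made-up two-class confusion matrix for illustration:

```python
def cohens_kappa(matrix):
    """Cohen's kappa from a square confusion matrix (rows: reference,
    columns: predicted): kappa = (p_o - p_e) / (1 - p_e), where p_o is
    observed agreement and p_e is chance agreement."""
    total = sum(sum(row) for row in matrix)
    p_o = sum(matrix[i][i] for i in range(len(matrix))) / total
    p_e = sum(
        (sum(matrix[i]) / total) * (sum(row[i] for row in matrix) / total)
        for i in range(len(matrix))
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical confusion matrix for a two-class noise map
print(cohens_kappa([[20, 5], [10, 15]]))  # ≈ 0.4
```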

  4. Comparing the performance of various digital soil mapping approaches to map physical soil properties

    NASA Astrophysics Data System (ADS)

    Laborczi, Annamária; Takács, Katalin; Pásztor, László

    2015-04-01

    Spatial information on physical soil properties is in high demand to support environment-related and land use management decisions. One of the most widely used properties to characterize soils physically is particle size distribution (PSD), which determines soil water management and cultivability. According to their size, different particles can be categorized as clay, silt, or sand. The size intervals are defined by national or international textural classification systems. The relative percentage of sand, silt, and clay in the soil defines textural classes, which are likewise specified differently in various national and/or specialty systems. The most commonly used is the classification system of the United States Department of Agriculture (USDA). Soil texture information is essential input data in meteorological, hydrological and agricultural prediction modelling. Although Hungary has a great deal of legacy soil maps and other relevant soil information, it often occurs that maps of a certain characteristic do not exist with the required thematic and/or spatial representation. The recent developments in digital soil mapping (DSM), however, provide wide opportunities for the elaboration of object specific soil maps (OSSM) with predefined parameters (resolution, accuracy, reliability etc.). Due to the simultaneous richness of available Hungarian legacy soil data, spatial inference methods and auxiliary environmental information, there is a wide variety of possible approaches for the compilation of a given soil map. This suggests the opportunity of optimization. For the creation of an OSSM one might intend to identify the optimum set of soil data, method and auxiliary co-variables optimized for the resources (data costs, computation requirements etc.). We started a comprehensive analysis of the effects of the various DSM components on the accuracy of the output maps on pilot areas. The aim of this study is to compare and evaluate different

  5. Spiral magnetism in the single-band Hubbard model: the Hartree-Fock and slave-boson approaches

    NASA Astrophysics Data System (ADS)

    Igoshev, P. A.; Timirgazin, M. A.; Gilmutdinov, V. F.; Arzhnikov, A. K.; Irkhin, V. Yu

    2015-11-01

    The ground-state magnetic phase diagram is investigated within the single-band Hubbard model for square and different cubic lattices. The results of employing the generalized non-correlated mean-field (Hartree-Fock) approximation and generalized slave-boson approach by Kotliar and Ruckenstein with correlation effects included are compared. We take into account commensurate ferromagnetic, antiferromagnetic, and incommensurate (spiral) magnetic phases, as well as phase separation into magnetic phases of different types, which was often lacking in previous investigations. It is found that the spiral states and especially ferromagnetism are generally strongly suppressed up to non-realistically large Hubbard U by the correlation effects if nesting is absent and van Hove singularities are well away from the paramagnetic phase Fermi level. The magnetic phase separation plays an important role in the formation of magnetic states, the corresponding phase regions being especially wide in the vicinity of half-filling. The details of non-collinear and collinear magnetic ordering for different cubic lattices are discussed.

  7. Uncertainty in Coastal Inundation Mapping: A Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Leon, J. X.; Callaghan, D. P.; Heuvelink, G.; Mills, M.; Phinn, S. R.

    2014-12-01

    Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly as extreme high sea levels and associated erosion are forecasted to increase in magnitude. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and spatial analysis propagate into the inundation mapping. Error propagation within spatial modelling can be appropriately analysed using, for example, a probabilistic framework based on geostatistical simulations. Geostatistical modelling takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The aim of this study was to produce probability maps incorporating the impacts of spatially variable and spatially correlated elevation errors in high-resolution DEMs combined with sea level rise uncertainties. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. Sea level rise uncertainty was non-parametrically modelled using 1000 Monte Carlo estimations, which were processed to provide the probability density function numerically. The sea level rise uncertainties were modelled using a Weibull distribution with 0.95 scale and 2.2 shape parameters. These uncertainties were combined through addition (i.e., assuming they are independent), which, when working with probability density functions, requires a convolution. This probabilistic approach can be used in a risk-aversive decision making process by planning for
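
    The addition of independent uncertainty sources is easy to reproduce by direct Monte Carlo sampling. The Weibull parameters below (scale 0.95, shape 2.2) are the ones quoted in the abstract; the Gaussian DEM error model and the elevation threshold are illustrative assumptions only.

```python
import random

random.seed(42)

N = 100_000
threshold = 1.5   # hypothetical ground elevation above present sea level (m)

exceed = 0
for _ in range(N):
    slr = random.weibullvariate(0.95, 2.2)   # sea level rise: Weibull(scale=0.95, shape=2.2)
    dem_err = random.gauss(0.0, 0.3)         # assumed DEM elevation error: N(0, 0.3 m)
    # Independent sources combine by simple addition of the sampled values
    if slr + dem_err > threshold:
        exceed += 1

print(f"P(inundation) ~ {exceed / N:.3f}")
```

    Sampling sidesteps the explicit convolution of the two density functions: the histogram of the summed samples converges to the convolved density.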

  8. Wildfire susceptibility mapping: comparing deterministic and stochastic approaches

    NASA Astrophysics Data System (ADS)

    Pereira, Mário; Leuenberger, Michael; Parente, Joana; Tonini, Marj

    2016-04-01

    Estimating the probability of wildfire occurrence in a certain area under particular environmental conditions represents a modern tool to support forest protection plans and to reduce the consequences of fires. This can be performed by the implementation of wildfire susceptibility mapping, normally achieved by employing more or less sophisticated models which combine the predisposing variables (as raster datasets) within a geographic information system (GIS). The selection of the appropriate variables includes the evaluation of success and the implementation of prediction curves, as well as independent probabilistic validations for different scenarios. These methods make it possible to define the spatial pattern of wildfire occurrences, characterize the susceptibility of the territory, namely for specific fire causes/types, and can also account for other factors such as human behavior and social aspects. We selected Portugal as the study region which, due to its favorable climatic, topographic and vegetation conditions, is by far the European country most affected by wildfires. In addition, Verde and Zêzere (2010) performed a first assessment and validation of wildfire susceptibility and hazard in Portugal which can be used as a benchmark. The objectives of the present study comprise: (1) assessing the structural forest fire risk in Portugal using updated datasets, namely, with higher spatial resolution (80 m to 25 m), most recent vegetation cover (Corine Land Cover), and a longer fire history (1975-2013); and (2) comparing linear vs non-linear approaches for wildfire susceptibility mapping. The data we used includes: (i) a DEM derived from the Shuttle Radar Topographic Mission at a resolution of 1 arc-second (DEM-SRTM 25 m) to assess elevation and slope; (ii) the Corine Land Cover inventory provided by the European Environment Agency (http://www.eea.europa.eu/pt) to produce the land use land cover map; (iii) the National Mapping Burnt Areas (NMBA) provided by the Institute for the

  9. Extended self-energy functional approach for strongly correlated lattice bosons in the superfluid phase

    SciTech Connect

    Arrigoni, Enrico; Knap, Michael; Linden, Wolfgang von der

    2011-07-01

    Among the various numerical techniques to study the physics of strongly correlated quantum many-body systems, the self-energy functional approach (SFA) has become increasingly important. In its previous form, however, SFA is not applicable to Bose-Einstein condensation or superfluidity. In this paper, we show how to overcome this shortcoming. To this end, we identify an appropriate quantity, which we term D, which represents the correlation correction of the condensate order parameter, just as the self-energy does for the Green's function. An appropriate functional is derived, which is stationary at the exact physical realization of D and of the self-energy. Its derivation is based on a functional-integral representation of the grand potential followed by an appropriate sequence of Legendre transformations. The approach is not perturbative and, therefore, applicable to a wide range of models with local interactions. We show that the variational cluster approach based on the extended self-energy functional is equivalent to the 'pseudoparticle' approach proposed in Phys. Rev. B 83, 134507 (2011). We present results for the superfluid density in the two-dimensional Bose-Hubbard model, which show remarkable agreement with those of quantum Monte Carlo calculations.

  10. A covariance fitting approach for correlated acoustic source mapping.

    PubMed

    Yardibi, Tarik; Li, Jian; Stoica, Petre; Zawodny, Nikolas S; Cattafesta, Louis N

    2010-05-01

    Microphone arrays are commonly used for noise source localization and power estimation in aeroacoustic measurements. The delay-and-sum (DAS) beamformer, which is the most widely used beamforming algorithm in practice, suffers from low resolution and high sidelobe level problems. Therefore, deconvolution approaches, such as the deconvolution approach for the mapping of acoustic sources (DAMAS), are often used for extracting the actual source powers from the contaminated DAS results. However, most deconvolution approaches assume that the sources are uncorrelated. Although deconvolution algorithms that can deal with correlated sources, such as DAMAS for correlated sources, do exist, these algorithms are computationally impractical even for small scanning grid sizes. This paper presents a covariance fitting approach for the mapping of acoustic correlated sources (MACS), which can work with uncorrelated, partially correlated or even coherent sources with a reasonably low computational complexity. MACS minimizes a quadratic cost function in a cyclic manner by making use of convex optimization and sparsity, and is guaranteed to converge at least locally. Simulations and experimental data acquired at the University of Florida Aeroacoustic Flow Facility with a 63-element logarithmic spiral microphone array in the absence of flow are used to demonstrate the performance of MACS. PMID:21117743
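
    The delay-and-sum beamformer that these deconvolution methods start from can be sketched for a single-frequency, far-field, uniform linear array; the array geometry and source angle below are invented for illustration.

```python
import cmath
import math

M, d, f, c = 8, 0.05, 2000.0, 343.0   # mics, spacing (m), frequency (Hz), sound speed (m/s)
k = 2 * math.pi * f / c               # acoustic wavenumber
theta_src = math.radians(20)          # true source direction

# Simulated noise-free array snapshot for a unit-amplitude plane wave
x = [cmath.exp(1j * k * d * m * math.sin(theta_src)) for m in range(M)]

def das_power(theta):
    """Delay-and-sum output power for steering angle theta: compensate
    the assumed inter-mic phase delays, sum, and normalize by M."""
    s = sum(x[m] * cmath.exp(-1j * k * d * m * math.sin(theta)) for m in range(M))
    return abs(s / M) ** 2

# Scan a 1-degree grid; the DAS map peaks at the source direction
angles = [math.radians(a) for a in range(-90, 91)]
best = max(angles, key=das_power)
print(round(math.degrees(best)))  # 20
```

    The high sidelobes of this power map are exactly what DAMAS-style deconvolution (and MACS) set out to remove.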

  11. Teaching Map Skills: An Inductive Approach. Part Three.

    ERIC Educational Resources Information Center

    Anderson, Jeremy

    1985-01-01

    These learning activities involve secondary geography students in making a turf map, using map grids, solving problems dealing with map scale, and making a map scale. Complete teacher instructions are provided. (RM)

  12. Phase diagram of ultracold atoms in optical lattices: Comparative study of slave fermion and slave boson approaches to Bose-Hubbard model

    SciTech Connect

    Yu Yue; Chui, S. T.

    2005-03-01

    We perform a comparative study of the finite temperature behavior of ultracold Bose atoms in optical lattices by the slave fermion and the slave boson approaches to the Bose-Hubbard model. The phase diagram of the system is presented. Although both approaches are equivalent without approximations, the mean field theory based on the slave fermion technique is quantitatively more appropriate. Conceptually, the slave fermion approach automatically excludes the double occupancy of two identical fermions on the same lattice site. By comparing to known results in limiting cases, we find the slave fermion approach better than the slave boson approach. For example, in the non-interacting limit, the critical temperature of the superfluid-normal liquid transition calculated by the slave fermion approach is closer to the well-known ideal Bose gas result. The zero-temperature limit of the critical interaction strength from the slave fermion approach is also closer to that obtained from a direct calculation using a zero-temperature mean field theory.

  13. The Effects of Reciprocal Teaching and Direct Instruction Approaches on Knowledge Map (K-Map) Generation Skill

    ERIC Educational Resources Information Center

    Görgen, Izzet

    2014-01-01

    The primary purpose of the present study is to investigate whether reciprocal teaching approach or direct instruction approach is more effective in the teaching of k-map generation skill. Secondary purpose of the study is to determine which of the k-map generation principles are more challenging for students to apply. The results of the study…

  15. Canonical map approach to channeling stability in crystals. II

    NASA Astrophysics Data System (ADS)

    Sáenz, A. W.

    1987-11-01

    A nonrelativistic and a relativistic classical Hamiltonian model of two degrees of freedom are considered describing the plane motion of a particle in a potential V(x1,x2) [(x1,x2) = Cartesian coordinates]. Suppose V(x1,x2) is real analytic in its arguments in a neighborhood of the line x2=0, one-periodic in x1 there, and such that the average value of ∂V(x1,0)/∂x2 vanishes. It is proved that, under these conditions and provided that the particle energy E is sufficiently large, there exist for all time two distinguished solutions, one satisfying the equations of motion of the nonrelativistic model and the other those of the relativistic model, whose corresponding configuration-space orbits are one-periodic in x1 and approach the line x2=0 as E→∞. The main theorem is that these solutions are (future) orbitally stable at large enough E if V satisfies the above conditions, as well as natural requirements of linear and nonlinear stability. To prove their existence, one uses a well-known theorem, for which a new and simpler proof is provided, and properties of certain natural canonical maps appropriate to these respective models. It is shown that such solutions are orbitally stable by reducing the maps in question to Birkhoff canonical form and then applying a version of the Moser twist theorem. The approach used here greatly lightens the labor of deriving key estimates for the above maps, these estimates being needed to effect this reduction. The present stability theorem is physically interesting because it is the first rigorous statement on the orbital stability of certain channeling motions of fast charged particles in rigid two-dimensional lattices, within the context of models of the stated degree of generality.

  16. A filtering approach to edge preserving MAP estimation of images.

    PubMed

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions. Each region is modelled with a wide-sense stationary (WSS) Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and means of determining and refining the segmentation are described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means to deal with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint as it provides a continuum of solutions between Wiener filtering and inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing. PMID:21078580
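
    The Wiener-to-inverse-filtering continuum can be seen per frequency: for a blur frequency response B and signal-to-noise power ratio SNR, the classical Wiener deconvolution filter is W = conj(B) / (|B|² + 1/SNR), which approaches the inverse filter 1/B as noise vanishes and backs off where noise dominates. The numbers below are illustrative, not from the paper.

```python
def wiener_gain(B, snr):
    """Per-frequency Wiener deconvolution filter for complex blur
    response B and signal-to-noise power ratio snr."""
    return B.conjugate() / (abs(B) ** 2 + 1.0 / snr)

B = 0.5 + 0.2j   # hypothetical blur response at one spatial frequency

low_noise = wiener_gain(B, snr=1e9)    # ≈ inverse filter 1/B
high_noise = wiener_gain(B, snr=0.1)   # strongly attenuated

print(abs(low_noise * B))   # ≈ 1: the blur is almost fully undone
print(abs(high_noise * B))  # < 1: restoration is damped where noise dominates
```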

  17. From the shell model to the interacting boson model

    SciTech Connect

    Ginocchio, J.N.; Johnson, C.W.

    1994-07-01

    Starting from a general, microscopic fermion-pair-to-boson mapping of a complete fermion space that preserves Hermitian conjugation, we show that the resulting infinite and non-convergent boson Hamiltonian can be factored into a finite image Hamiltonian (e.g., a 1 + 2-body fermion Hamiltonian is mapped to a 1 + 2-body boson Hamiltonian) times the norm operator; it is the norm operator that is infinite and non-convergent. We then truncate to a collective boson space and give conditions under which the exact boson images of finite fermion operators are also finite in the truncated basis.

  18. Einstein's Gravitational Field Approach to Dark Matter and Dark Energy-Geometric Particle Decay into the Vacuum Energy Generating Higgs Boson and Heavy Quark Mass

    NASA Astrophysics Data System (ADS)

    Christensen, Walter James

    2015-08-01

    During an interview at the Niels Bohr Institute David Bohm stated, "according to Einstein, particles should eventually emerge as singularities, or very strong regions of stable pulses of (the gravitational) field" [1]. Starting from this premise, we show that spacetime indeed manifests stable pulses (n-valued gravitons) that decay into the vacuum energy to generate all three boson masses (including the Higgs), as well as heavy-quark mass, all in precise agreement with the 2010 CODATA report on fundamental constants. Furthermore, our relativized quantum physics approach (RQP) answers the mysteries surrounding dark energy, dark matter, accelerated spacetime, and why ordinary matter dominates over antimatter.

  19. Teaching Population Health: A Competency Map Approach to Education

    PubMed Central

    Kaprielian, Victoria S.; Silberberg, Mina; McDonald, Mary Anne; Koo, Denise; Hull, Sharon K.; Murphy, Gwen; Tran, Anh N.; Sheline, Barbara L.; Halstater, Brian; Martinez-Bianchi, Viviana; Weigle, Nancy J.; de Oliveira, Justine Strand; Sangvai, Devdutta; Copeland, Joyce; Tilson, Hugh H.; Scutchfield, F. Douglas; Michener, J. Lloyd

    2013-01-01

    A 2012 Institute of Medicine report is the latest in the growing number of calls to incorporate a population health approach in health professionals’ training. Over the last decade, Duke University, particularly its Department of Community and Family Medicine, has been heavily involved with community partners in Durham, North Carolina to improve the local community’s health. Based on these initiatives, a group of interprofessional faculty began tackling the need to fill the curriculum gap to train future health professionals in public health practice, community engagement, critical thinking, and team skills to improve population health effectively in Durham and elsewhere. The Department of Community and Family Medicine has spent years in care delivery redesign and curriculum experimentation, design, and evaluation to distinguish the skills trainees and faculty need for population health improvement and to integrate them into educational programs. These clinical and educational experiences have led to a set of competencies that form an organizational framework for curricular planning and training. This framework delineates which learning objectives are appropriate and necessary for each learning level, from novice through expert, across multiple disciplines and domains. The resulting competency map has guided Duke’s efforts to develop, implement, and assess training in population health for learners and faculty. In this article, the authors describe the competency map development process as well as examples of its application and evaluation at Duke and limitations to its use with the hope that other institutions will apply it in different settings. PMID:23524919

  1. Mapping African animal trypanosomosis risk: the landscape approach.

    PubMed

    Guerrini, Laure; Bouyer, Jérémy

    2007-01-01

    African animal trypanosomosis (AAT) is a major hindrance to cattle breeding in the Mouhoun River Basin of Burkina Faso. The authors describe a landscape approach that enables the mapping of tsetse densities and AAT risk along the Mouhoun River loop (702 km long) in Burkina Faso. Three epidemiological landscapes were described: the first and most dangerous corresponded to protected forests and their border areas, with an apparent density of infectious flies per trap per day (ADTi) of 0.74; the second to a partially disturbed vegetal formation, with an ADTi of 0.20; and the third to a completely disturbed landscape, with an ADTi of 0.08. By this risk indicator, the first landscape was 3.92 times more risky than the second, which in turn was 3.13 times more risky than the last. Similar infection rates were found in all landscapes (approximately 8%), but tsetse apparent densities dropped significantly (p<0.001) in half-disturbed (2.66) and disturbed landscapes (0.80) in comparison to the natural and border landscapes (11.77). Females were significantly younger in the most disturbed landscape (mean physiological age of 29 days) than in the other two (41 days; p<0.05). In light of these results, the practical implications of stratifying AAT risk and mapping tsetse densities in vector control campaigns are discussed. PMID:20422544

  2. Mapping the distribution of malaria: current approaches and future directions

    USGS Publications Warehouse

    Johnson, Leah R.; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.

    2015-01-01

    Mapping the distribution of malaria has received substantial attention because the disease is a major source of illness and mortality in humans, especially in developing countries. It also has a defined temporal and spatial distribution. The distribution of malaria is most influenced by its mosquito vector, which is sensitive to extrinsic environmental factors such as rainfall and temperature. Temperature also affects the development rate of the malaria parasite in the mosquito. Here, we review the range of approaches used to model the distribution of malaria, from spatially explicit to implicit, mechanistic to correlative. Although current methods have significantly improved our understanding of the factors influencing malaria transmission, significant gaps remain, particularly in incorporating nonlinear responses to temperature and temperature variability. We highlight new methods to tackle these gaps and to integrate new data with models.
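
    A common way to capture the nonlinear temperature responses the review calls for is a unimodal trait curve such as the Brière function, f(T) = c·T·(T − T0)·sqrt(Tm − T) between lower and upper thermal limits T0 and Tm. The parameter values below are illustrative, not fitted to any mosquito trait.

```python
import math

def briere(T, c=2.5e-4, T0=10.0, Tm=38.0):
    """Brière thermal response: zero outside (T0, Tm), unimodal inside,
    with a sharp decline toward the upper thermal limit Tm."""
    if T <= T0 or T >= Tm:
        return 0.0
    return c * T * (T - T0) * math.sqrt(Tm - T)

# The trait peaks at an intermediate temperature, not at the upper limit,
# which is why responses to temperature variability are nonlinear.
rates = {T: briere(T) for T in range(5, 45)}
print(max(rates, key=rates.get))
```

    Because the curve is concave near its peak, averaging over a fluctuating temperature gives a different transmission rate than evaluating at the mean temperature, which is exactly the gap the review highlights.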

  3. A statistical approach for validating eSOTER and digital soil maps in front of traditional soil maps

    NASA Astrophysics Data System (ADS)

    Bock, Michael; Baritz, Rainer; Köthe, Rüdiger; Melms, Stephan; Günther, Susann

    2015-04-01

    During the European research project eSOTER, three different Digital Soil Maps (DSM) were developed for the pilot area Chemnitz 1:250,000 (FP7 eSOTER project, grant agreement nr. 211578). The core task of the project was to revise the SOTER method for the interpretation of soil and terrain data. One of the working hypotheses was that eSOTER does not only provide terrain data with typical soil profiles, but that the new products actually perform like a conceptual soil map. The three eSOTER maps for the pilot area differed considerably in spatial representation and content of soil classes. In this study we compare the three eSOTER maps against existing reconnaissance soil maps, keeping in mind that traditional soil maps have many subjective issues and intentional bias regarding the overestimation and emphasis of certain features. Hence, a true validation of the proper representation of modeled soil maps is hardly possible; rather, a statistical comparison between modeled and empirical approaches is possible. If eSOTER data represent conceptual soil maps, then different eSOTER, DSM and conventional maps from various sources and different regions could be harmonized towards consistent new data sets for large areas including the whole European continent. One of the eSOTER maps has been developed closely to the traditional SOTER method: terrain classification data (derived from the SRTM DEM) were combined with lithology data (a re-interpreted geological map); the corresponding terrain units were then extended with soil information: a very dense regional soil profile data set was used to define soil mapping units based on a statistical grouping of terrain units. The second map is a pure DSM map using continuous terrain parameters instead of terrain classification; radiospectrometric data were used to supplement parent material information from geology maps. The classification method Random Forest was used. The third approach predicts soil diagnostic properties based on

  4. A multi-model ensemble approach to seabed mapping

    NASA Astrophysics Data System (ADS)

    Diesing, Markus; Stephens, David

    2015-06-01

    Seabed habitat mapping based on swath acoustic data and ground-truth samples is an emergent and active marine science discipline. Significant progress could be achieved by transferring techniques and approaches that have been successfully developed and employed in such fields as terrestrial land cover mapping. One such promising approach is the multiple classifier system, which aims at improving classification performance by combining the outputs of several classifiers. Here we present results of a multi-model ensemble applied to multibeam acoustic data covering more than 5000 km² of seabed in the North Sea with the aim to derive accurate spatial predictions of seabed substrate. A suite of six machine learning classifiers (k-Nearest Neighbour, Support Vector Machine, Classification Tree, Random Forest, Neural Network and Naïve Bayes) was trained with ground-truth sample data classified into seabed substrate classes, and their prediction accuracy was assessed with an independent set of samples. The three and five best performing models were combined into classifier ensembles. Both ensembles led to increased prediction accuracy as compared to the best performing single classifier. The improvements were, however, not statistically significant at the 5% level. Although the three-model ensemble did not perform significantly better than its individual component models, we noticed that the five-model ensemble did perform significantly better than three of the five component models. A classifier ensemble might therefore be an effective strategy to improve classification performance. Another advantage is the fact that the agreement in predicted substrate class between the individual models of the ensemble can be used as a measure of confidence. We propose a simple and spatially explicit measure of confidence that is based on model agreement and prediction accuracy.
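
    The ensemble idea — combine class predictions and use inter-model agreement as a confidence measure — can be sketched without any particular classifiers; the substrate labels and per-model predictions below are invented.

```python
from collections import Counter

def ensemble_predict(predictions):
    """Majority vote over per-model class predictions for one grid cell,
    returning (winning class, agreement fraction used as confidence)."""
    votes = Counter(predictions)
    winner, count = votes.most_common(1)[0]
    return winner, count / len(predictions)

# Five hypothetical models predicting seabed substrate at two grid cells
cell_a = ["sand", "sand", "sand", "mud", "sand"]
cell_b = ["gravel", "mud", "gravel", "mud", "sand"]

print(ensemble_predict(cell_a))  # ('sand', 0.8): high-agreement cell
print(ensemble_predict(cell_b))  # low-agreement cell, confidence 0.4
```

    Mapping the agreement fraction alongside the predicted class gives the spatially explicit confidence layer the abstract proposes.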

  5. Symmetry-improved 2PI approach to the Goldstone-boson IR problem of the SM effective potential

    NASA Astrophysics Data System (ADS)

    Pilaftsis, Apostolos; Teresi, Daniele

    2016-05-01

    The effective potential of the Standard Model (SM), from three loop order and higher, suffers from infrared (IR) divergences arising from quantum effects due to massless would-be Goldstone bosons associated with the longitudinal polarizations of the W± and Z bosons. Such IR pathologies also hinder accurate evaluation of the two-loop threshold corrections to electroweak quantities, such as the vacuum expectation value of the Higgs field. However, these divergences are an artifact of perturbation theory, and therefore need to be consistently resummed in order to obtain an IR-safe effective potential. The so-called Two-Particle-Irreducible (2PI) effective action provides a rigorous framework to consistently perform such resummations, without the need to resort to ad hoc subtractions or running into the risk of over-counting contributions. By considering the recently proposed symmetry-improved 2PI formalism, we address the problem of the Goldstone-boson IR divergences of the SM effective potential in the gaugeless limit of the theory. In the same limit, we evaluate the IR-safe symmetry-improved 2PI effective potential, after taking into account quantum loops of chiral fermions, as well as the renormalization of spurious custodially breaking effects triggered by fermionic Yukawa interactions. Finally, we compare our results with those obtained with other methods presented in the literature.

  6. Pure P2P mediation system: A mappings discovery approach

    NASA Astrophysics Data System (ADS)

    selma, El yahyaoui El idrissi; Zellou, Ahmed; Idri, Ali

    2015-02-01

Information integration systems offer a uniform interface providing access to a set of autonomous and distributed information sources. Their most important advantage is that they allow users to specify what they want rather than how to obtain it. Work in this area has led to two major classes of integration systems: mediation systems based on the mediator/adapter paradigm, and peer-to-peer (P2P) systems. Combining the two has produced a third type: P2P mediation systems. P2P systems are large-scale, self-organized and distributed, and allow resource management in a completely decentralized way. However, integrating structured, heterogeneous and distributed information sources proves to be a complex problem. The objective of this work is to propose an approach to resolve conflicts and establish mappings between heterogeneous elements. The approach is based on clustering, which groups similar peers that share common information into the same subnet. To cope with heterogeneity, we introduce three additional layers into our hierarchy of peers: internal schema, external schema and schema directory peer. We use linguistic techniques, specifically the name correspondence technique, which proposes correspondences based on the similarity of names.

  7. Uncertainty propagation in a cascade modelling approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Rodríguez-Rincón, J. P.; Pedrozo-Acuña, A.; Breña Naranjo, J. A.

    2014-07-01

The purpose of this investigation is to study the propagation of meteorological uncertainty within a cascade modelling approach to flood mapping. The methodology comprises a Numerical Weather Prediction (NWP) model, a distributed rainfall-runoff model and a standard 2-D hydrodynamic model. The cascade of models is used to reproduce an extreme flood event that took place in southeast Mexico in November 2009. The event was selected because high-quality field data (e.g. rain gauges; discharge) and satellite imagery are available. Uncertainty in the meteorological model (the Weather Research and Forecasting model) is evaluated through the use of a multi-physics ensemble technique, which considers twelve parameterization schemes for determining precipitation. The resulting precipitation fields are used as input to a distributed hydrological model, enabling the determination of different hydrographs associated with this event. Lastly, by means of a standard 2-D hydrodynamic model, the hydrographs are used as forcing conditions to study the propagation of the meteorological uncertainty to an estimated flooded area. Results show the utility of the selected modelling approach for investigating error propagation within a cascade of models. Moreover, the error associated with the determination of the runoff is shown to be lower than that obtained in the precipitation estimation, suggesting that uncertainty does not necessarily increase within a model cascade.

  8. A geostatistical approach to mapping site response spectral amplifications

    USGS Publications Warehouse

    Thompson, E.M.; Baise, L.G.; Kayen, R.E.; Tanaka, Y.; Tanaka, H.

    2010-01-01

If quantitative estimates of the seismic properties do not exist at a location of interest, then the site response spectral amplifications must be estimated from data collected at other locations. Currently, the most common approach employs correlations of site class with maps of surficial geology. Analogously, correlations of site class with topographic slope can be employed where the surficial geology is unknown. Our goal is to identify and validate a method to estimate site response with greater spatial resolution and accuracy for regions where additional effort is warranted. This method consists of three components: region-specific data collection, a spatial model for interpolating seismic properties, and a theoretical method for computing spectral amplifications from the interpolated seismic properties. We consider three spatial interpolation schemes: correlations with surficial geology, termed the geologic trend (GT), ordinary kriging (OK), and kriging with a trend (KT). We estimate the spectral amplifications from seismic properties using the square root of impedance method, thereby linking the frequency-dependent spectral amplifications to the depth-dependent seismic properties. Thus, the range of periods for which this method is applicable is limited by the depth of exploration. A dense survey of near-surface S-wave slowness (Ss) throughout Kobe, Japan shows that the geostatistical methods give more accurate estimates of Ss than the topographic slope and GT methods, and the OK and KT methods perform equally well. We prefer the KT model because it can be seamlessly integrated with geologic maps that cover larger regions. Empirical spectral amplifications show that the region-specific data achieve more accurate estimates of observed median short-period amplifications than the topographic slope method. © 2010 Elsevier B.V.
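The square root of impedance method mentioned above reduces, in its simplest form, to comparing the seismic impedance (density × S-wave velocity) of the near-surface material with that of a reference rock: A = sqrt(Z_ref / Z_site). The densities and velocities below are illustrative assumptions, not values from the Kobe survey.

```python
import math

def sri_amplification(rho_site, vs_site, rho_ref, vs_ref):
    """Square-root-of-impedance amplification:
    A = sqrt((rho_ref * vs_ref) / (rho_site * vs_site)).
    Densities in kg/m^3, S-wave velocities in m/s."""
    return math.sqrt((rho_ref * vs_ref) / (rho_site * vs_site))

# Hypothetical soft site (Vs = 200 m/s) relative to reference rock (Vs = 760 m/s):
A = sri_amplification(rho_site=1800.0, vs_site=200.0,
                      rho_ref=2500.0, vs_ref=760.0)
print(round(A, 2))  # → 2.3
```

Slowness-based surveys like the one in the abstract supply Vs as 1/Ss, and the frequency dependence enters through which depth interval is averaged for a given period.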

  9. High School Biology: A Group Approach to Concept Mapping.

    ERIC Educational Resources Information Center

    Brown, David S.

    2003-01-01

    Explains concept mapping as an instructional method in cooperative learning environments, and describes a study investigating the effectiveness of concept mapping on student learning during a photosynthesis and cellular respiration unit. Reports on the positive effects of concept mapping in the experimental group. (Contains 16 references.) (YDS)

  10. Teaching Map Skills: An Inductive Approach. Part Four.

    ERIC Educational Resources Information Center

    Anderson, Jeremy

    1985-01-01

    Satisfactory completion of this self-contained map exercise will demonstrate student ability to use symbols, legends, scale, orientation, index, and grid in map reading and map use to give directions for way-finding. The exercise should take one class period to complete. (RM)

  11. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

Mapping millions of short DNA sequences to a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise, resulting in a suboptimal mapping. This mapping process ...

  12. Comparison of Mixed-Model Approaches for Association Mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Association-mapping methods promise to overcome the limitations of linkage-mapping methods. The main objectives of this study were to (i) evaluate various methods for association mapping in the autogamous species wheat using an empirical data set, (ii) determine a marker-based kinship matrix using a...

  13. Mapping diffusion in a living cell via the phasor approach.

    PubMed

    Ranjit, Suman; Lanzano, Luca; Gratton, Enrico

    2014-12-16

Diffusion of a fluorescent protein within a cell has been measured using either fluctuation-based techniques (fluorescence correlation spectroscopy (FCS) or raster-scan image correlation spectroscopy) or particle tracking. However, none of these methods enables measurement of the diffusion of the fluorescent particle at each pixel of the image. Measurement using conventional single-point FCS at every individual pixel results in continuous long exposure of the cell to the laser and eventual bleaching of the sample. To overcome this limitation, we have developed what we believe to be a new method of scanning with simultaneous construction of a fluorescence image of the cell. In this modified raster scanning, as the image is acquired, the laser scans each individual line multiple times before moving to the next line, continuing until the entire area is scanned. This differs from the original raster-scan image correlation spectroscopy approach, where data are acquired by scanning each frame once and then scanning the image multiple times. The total data-acquisition time needed for this method is much shorter than that required for traditional FCS analysis at each pixel. However, at a single pixel the acquired intensity time sequence is short, requiring nonconventional analysis of the correlation function to extract information about the diffusion. These correlation data have been analyzed using the phasor approach, a fit-free method originally developed for analysis of FLIM images. Analysis using this method yields an estimate of the average diffusion coefficient of the fluorescent species at each pixel of an image, and thus a detailed diffusion map of the cell can be created. PMID:25517145
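The fit-free phasor transform underlying the approach maps any decay or correlation curve to a coordinate pair (g, s); for a single-exponential curve these approach the analytic values g = 1/(1+(ωτ)²) and s = ωτ/(1+(ωτ)²). A minimal numerical sketch, with illustrative curve parameters rather than anything from the paper:

```python
import math

# Hypothetical single-exponential curve I(t) = exp(-t/tau); omega is the
# phasor harmonic frequency. Values are illustrative only.
tau, omega = 2.0, 1.0
dt = 0.01
ts = [dt * k for k in range(4000)]
I = [math.exp(-t / tau) for t in ts]

def phasor(ts, I, omega):
    """First-harmonic phasor coordinates of a curve I(t):
    g = sum(I cos(wt)) / sum(I),  s = sum(I sin(wt)) / sum(I)."""
    norm = sum(I)
    g = sum(y * math.cos(omega * t) for t, y in zip(ts, I)) / norm
    s = sum(y * math.sin(omega * t) for t, y in zip(ts, I)) / norm
    return g, s

g, s = phasor(ts, I, omega)
# For omega*tau = 2 the analytic values are g = 0.2 and s = 0.4.
print(round(g, 2), round(s, 2))
```

Plotting (g, s) per pixel places single-component behaviour on a semicircle, which is what makes the method fit-free.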

  14. Current Approaches Toward Quantitative Mapping of the Interactome

    PubMed Central

    Buntru, Alexander; Trepte, Philipp; Klockmeier, Konrad; Schnoegl, Sigrid; Wanker, Erich E.

    2016-01-01

    Protein–protein interactions (PPIs) play a key role in many, if not all, cellular processes. Disease is often caused by perturbation of PPIs, as recently indicated by studies of missense mutations. To understand the associations of proteins and to unravel the global picture of PPIs in the cell, different experimental detection techniques for PPIs have been established. Genetic and biochemical methods such as the yeast two-hybrid system or affinity purification-based approaches are well suited to high-throughput, proteome-wide screening and are mainly used to obtain qualitative results. However, they have been criticized for not reflecting the cellular situation or the dynamic nature of PPIs. In this review, we provide an overview of various genetic methods that go beyond qualitative detection and allow quantitative measuring of PPIs in mammalian cells, such as dual luminescence-based co-immunoprecipitation, Förster resonance energy transfer or luminescence-based mammalian interactome mapping with bait control. We discuss the strengths and weaknesses of different techniques and their potential applications in biomedical research. PMID:27200083

  15. Interacting boson model from energy density functionals: {gamma}-softness and the related topics

    SciTech Connect

    Nomura, K.

    2012-10-20

A comprehensive way of deriving the Hamiltonian of the interacting boson model (IBM) is described. Based on the fact that the multi-nucleon induced surface deformation in a finite nucleus can be simulated by effective boson degrees of freedom, the potential energy surface calculated with the self-consistent mean-field method employing a given energy density functional (EDF) is mapped onto the IBM analog, and thereby the excitation spectra and transition rates with good symmetry quantum numbers are calculated. Recent applications of the proposed approach are reported: (i) an alternative robust interpretation of the γ-soft nuclei and (ii) shape coexistence in lead isotopes.

  16. A faster and economical approach to floodplain mapping using the SSURGO soil database

    NASA Astrophysics Data System (ADS)

    Sangwan, N.; Merwade, V.

    2014-12-01

Floods are the most damaging of all natural disasters, adversely affecting millions of lives and causing financial losses worth billions of dollars every year across the globe. Flood inundation maps play a key role in the assessment and mitigation of potential flood hazards. However, there are several communities in the United States where flood risk maps are not available because of the lack of resources needed to create such maps through the conventional modeling approach. The objective of this study is to develop and examine an economical alternative approach to floodplain mapping using widely available SSURGO soil data in the United States. Using the state of Indiana as a test case, floodplain maps are developed for the entire state by identifying flood-prone soil map units based on their attributes recorded in the SSURGO database. For validation, the flood extents obtained from the soil data are compared with the extents predicted by other floodplain maps, including the Federal Emergency Management Agency (FEMA) Flood Insurance Rate Maps (FIRMs), flood extents observed during past floods, and other flood maps derived from Digital Elevation Models (DEMs). In general, the SSURGO-based floodplain maps are found to be largely in agreement with the flood inundation maps created by FEMA. Comparison between the FEMA maps and the SSURGO-derived floodplain maps shows an overlap ranging from 65 to 90 percent. Similar results are found when the SSURGO-derived floodplain maps are compared with FEMA maps for recent flood events in other states, including Minnesota, Washington and Wisconsin. Although not in perfect conformance with reference flood maps, the SSURGO soil data approach offers an economical and faster alternative to floodplain mapping in areas where detailed flood modeling and mapping have not been conducted.
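The overlap percentages quoted above amount to intersecting two binary flood-extent rasters and expressing the intersection as a share of the reference extent. A minimal sketch with invented 3×4 masks (not the study's data):

```python
import numpy as np

# Hypothetical binary flood-extent rasters (True = flooded): one from a
# SSURGO-derived map, one from a reference map (e.g. a FEMA FIRM).
ssurgo = np.array([[1, 1, 0, 0],
                   [1, 1, 1, 0],
                   [0, 1, 1, 0]], dtype=bool)
fema   = np.array([[1, 1, 1, 0],
                   [1, 1, 0, 0],
                   [0, 1, 1, 1]], dtype=bool)

def overlap_percent(a, b):
    """Share of the reference extent b that is also mapped flooded in a."""
    return 100.0 * np.logical_and(a, b).sum() / b.sum()

print(round(overlap_percent(ssurgo, fema), 1))  # → 75.0
```

In practice both maps would first be rasterized to a common grid and projection before the cell-wise comparison.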

  17. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.
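A genetic algorithm for mapping tasks to heterogeneous resources, of the kind the paper distributes hierarchically, can be sketched in a few lines: a chromosome is a task-to-node assignment and fitness is the makespan implied by a cost matrix. The cost matrix, operators and parameters below are illustrative assumptions, not the authors' three-phase algorithm.

```python
import random

random.seed(0)

# Hypothetical heterogeneous costs: COST[t][n] = run time of task t on node n.
COST = [[4, 2, 8], [3, 6, 2], [5, 4, 3], [2, 7, 6], [6, 3, 4]]
N_TASKS, N_NODES = len(COST), len(COST[0])

def makespan(mapping):
    """Completion time of the most loaded node under this assignment."""
    load = [0] * N_NODES
    for t, n in enumerate(mapping):
        load[n] += COST[t][n]
    return max(load)

def evolve(pop_size=30, gens=60):
    """Elitist GA: keep the best half, refill with crossover + mutation."""
    pop = [[random.randrange(N_NODES) for _ in range(N_TASKS)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=makespan)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_TASKS)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:                   # point mutation
                child[random.randrange(N_TASKS)] = random.randrange(N_NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
print(best, makespan(best))
```

The paper's contribution is distributing exactly this kind of search across grid nodes via a scheduler tree, so the mapping overhead itself scales.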

  18. Quantitative Architectural Analysis: A New Approach to Cortical Mapping

    ERIC Educational Resources Information Center

    Schleicher, Axel; Morosan, Patricia; Amunts, Katrin; Zilles, Karl

    2009-01-01

    Results from functional imaging studies are often still interpreted using the classical architectonic brain maps of Brodmann and his successors. One obvious weakness in traditional, architectural mapping is the subjective nature of localizing borders between cortical areas by means of a purely visual, microscopical examination of histological…

  19. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-07-01

The identification of landscapes with similar hydrological behaviour is useful for runoff and flood predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP-mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexities of automatic DRP-mapping approaches affect hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison, and a deviation map were derived. The automatically derived DRP maps were used in synthetic runoff simulations with an adapted version of the PREVAH hydrological model, and simulation results compared with those from simulations using the reference maps. The DRP maps derived with the automatic approach with the highest complexity and data requirements were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking.
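The "measures of agreement" used to compare an automatic DRP map against a manual reference map can be illustrated with Cohen's kappa, which corrects raw per-cell agreement for chance. The DRP class codes and cell values below are invented for illustration.

```python
from collections import Counter

# Hypothetical DRP classes per grid cell from an automatic map and a
# manually derived reference map (class codes are illustrative).
auto = ["HOF", "SOF", "SSF", "DP", "SOF", "HOF", "SSF", "DP", "SOF", "SOF"]
ref  = ["HOF", "SOF", "SOF", "DP", "SOF", "HOF", "SSF", "DP", "SSF", "SOF"]

def cohens_kappa(a, b):
    """Chance-corrected agreement between two classified maps:
    kappa = (p_obs - p_exp) / (1 - p_exp)."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

print(round(cohens_kappa(auto, ref), 3))  # → 0.722
```

A per-cell deviation map, as derived in the paper, is simply the Boolean disagreement mask behind p_obs, displayed spatially.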
Furthermore, we argue not to use

  20. Mapping dominant runoff processes: an evaluation of different approaches using similarity measures and synthetic runoff simulations

    NASA Astrophysics Data System (ADS)

    Antonetti, M.; Buss, R.; Scherrer, S.; Margreth, M.; Zappa, M.

    2015-12-01

    The identification of landscapes with similar hydrological behaviour is useful for runoff predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexity of automatic DRP mapping approaches affects hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison and a deviation map were derived. The automatically derived DRP-maps were used in synthetic runoff simulations with an adapted version of the hydrological model PREVAH, and simulation results compared with those from simulations using the reference maps. The DRP-maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP-maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
We therefore recommend not only using expert

  1. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    PubMed

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets. PMID:26336114
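The topological core of BMS — keeping only "surrounded" regions, i.e. connected components that do not touch the image border — can be sketched with a breadth-first flood fill. This is only the single-map step: the full model averages such activation maps over many random thresholds of whitened feature maps. The example Boolean map below is invented.

```python
import numpy as np
from collections import deque

def surrounded_regions(b):
    """Mark foreground (True) components of Boolean map b that do not
    touch the image border -- the surrounded regions kept by BMS."""
    h, w = b.shape
    seen = np.zeros_like(b, dtype=bool)
    out = np.zeros_like(b, dtype=bool)
    for i in range(h):
        for j in range(w):
            if b[i, j] and not seen[i, j]:
                comp, touches_border = [], False
                q = deque([(i, j)])
                seen[i, j] = True
                while q:                       # BFS over the 4-connected component
                    y, x = q.popleft()
                    comp.append((y, x))
                    if y in (0, h - 1) or x in (0, w - 1):
                        touches_border = True
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and b[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if not touches_border:         # surrounded: keep it
                    for y, x in comp:
                        out[y, x] = True
    return out

B = np.array([[0, 0, 0, 0, 0],
              [0, 1, 1, 0, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 0, 0, 1],
              [0, 0, 0, 0, 0]], dtype=bool)
print(surrounded_regions(B).astype(int))  # only the interior 2x2 block survives
```

The component touching the right border is discarded, which is exactly the figure-ground cue the abstract describes.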

  2. Structure-mapping approach to analogy and metaphor

    SciTech Connect

    Gentner, D.

    1982-01-01

    The structure-mapping theory of analogy describes a set of principles by which the interpretation of an analogy is derived from the meanings of its parts. These principles are characterized as implicit rules for mapping knowledge about a base domain into a target domain. Two important features of the theory are that the rules depend only on syntactic properties of the knowledge representation, and not on the specific content of the domains; and the theoretical framework allows analogies to be distinguished cleanly from literal similarity statements, applications of general laws, and other kinds of comparisons. Two mapping principles are described: relations between objects, rather than attributes of objects, are mapped from base to target; and the particular relations mapped are determined by systematicity, as defined by the existence of higher-order relations. 4 references.
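The two mapping principles can be made concrete with a toy knowledge representation: relations between objects are projected through the object correspondence, while object attributes are deliberately discarded. The solar-system/atom analogy is the classic illustration; the encoding below is an assumption for this sketch, not Gentner's own notation.

```python
# Toy structure-mapping sketch: map relations, not attributes,
# from a base domain (solar system) to a target domain (atom).
base = {
    "objects": {"sun", "planet"},
    "attributes": {("yellow", "sun"), ("massive", "sun")},
    "relations": {("attracts", "sun", "planet"),
                  ("revolves_around", "planet", "sun")},
}
correspondence = {"sun": "nucleus", "planet": "electron"}

def map_analogy(base, corr):
    """Project only the relations through the object correspondence;
    attributes such as ('yellow', 'sun') are dropped, per the theory."""
    return {(r, corr[a], corr[b]) for (r, a, b) in base["relations"]}

print(sorted(map_analogy(base, correspondence)))
```

Systematicity would further prefer relations bound together by higher-order relations (e.g. "attracts CAUSES revolves_around"), which this minimal sketch omits.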

  3. Space Borne Swath Mapping Laser Altimeters - Comparison of Measurement Approaches

    NASA Astrophysics Data System (ADS)

    Sun, X.; Abshire, J. B.; Harding, D. J.

    2007-12-01

Laser altimetry is an important technique for studying the surface topography of the planets and the Earth from orbit. Present orbital laser altimeters profile surface height along a single ground track, such as the Geoscience Laser Altimeter System (GLAS) on the Ice, Cloud, and land Elevation Satellite (ICESat). NASA is developing new technologies for an orbiting swath-mapping laser altimeter with a faster pulse rate and smaller footprint size to provide an instantaneous 3-dimensional measurement of ice sheets, land topography and vegetation structure. The goal is to provide a greater than 200 m wide swath with 5 to 10 m diameter laser footprints from a 400 km altitude orbit. To achieve these goals, we have to use more efficient laser transmitters and more sensitive detectors to allow simultaneous multi-channel measurements with a reasonable instrument size and electrical power requirement. The measurement efficiency, in terms of electrical energy needed per laser ranging measurement, needs to be improved by more than an order of magnitude. Several different approaches were considered, including the use of fiber lasers, shorter laser pulse widths, lower-noise analog detectors and photon-counting detectors. The receiver sensitivity was further improved by averaging the results from a number of laser pulse measurements. Different laser pulse modulation formats, such as the pseudo-random noise code modulation used in the Global Positioning System (GPS), were investigated to give more flexibility in laser selection and to further improve the ranging performance. We have analyzed and compared measurement performance for several different approaches using receiver models that were validated with GLAS in-orbit measurement data. We compared the measurement performance of traditional high-power, low-pulse-rate laser transmitters with that of low-energy, high-pulse-rate laser transmitters.
For this work we considered laser characteristics representative of Microchip lasers

  4. Study of hole pair condensation based on the SU(2) Slave-Boson approach to the t-J Hamiltonian: Temperature, momentum and doping dependences of spectral functions

    SciTech Connect

    Salk, S.H.S.; Lee, S.S.

    1999-11-01

Based on the U(1) and SU(2) slave-boson approaches to the t-J Hamiltonian, the authors evaluate the one-electron spectral functions for the hole-doped high-Tc cuprates for comparison with angle-resolved photoemission spectroscopy (ARPES) data. They find that the observed quasiparticle peak in the superconducting state is correlated with the hump which exists in the normal state. They find that the spectral weight of the quasiparticle peak increases as the doping rate increases, which is consistent with observation. As a consequence of the phase fluctuation effects of the spinon and holon pairing order parameters, the spectral weight of the peak predicted by the SU(2) theory is found to be smaller than that predicted by the U(1) mean-field theory.

  5. Transboundary aquifer mapping and management in Africa: a harmonised approach

    NASA Astrophysics Data System (ADS)

    Altchenko, Yvan; Villholth, Karen G.

    2013-11-01

Recent attention to transboundary aquifers (TBAs) in Africa reflects the growing importance of these resources for development in the continent. However, relatively little research on these aquifers and their best management strategies has been published. This report recapitulates progress on mapping and management frameworks for TBAs in Africa. The world map on transboundary aquifers presented at the 6th World Water Forum in 2012 identified 71 TBA systems in Africa. This report presents an updated African TBA map including 80 shared aquifers and aquifer systems superimposed on 63 international river basins. Furthermore, it proposes a new nomenclature for the mapping based on three sub-regions, reflecting the leading regional development communities. The map shows that TBAs represent approximately 42 % of the continental area and 30 % of the population. Finally, a brief review of current international law, specific bi- or multilateral treaties, and TBA management practice in Africa reveals few documented international conflicts over TBAs. The existing or upcoming international river and lake basin organisations offer a harmonised institutional base for TBA management, while alternative or supportive models involving the regional development communities are also required. The proposed map and geographical classification scheme for TBAs facilitates identification of options for joint institutional setups.

  6. Prioritising coastal zone management issues through fuzzy cognitive mapping approach.

    PubMed

    Meliadou, Aleka; Santoro, Francesca; Nader, Manal R; Dagher, Manale Abou; Al Indary, Shadi; Salloum, Bachir Abi

    2012-04-30

Effective public participation is an essential component of Integrated Coastal Zone Management implementation. To promote such participation, a shared understanding of stakeholders' objectives has to be built to ultimately result in common coastal management strategies. The application of quantitative and semi-quantitative methods involving tools such as Fuzzy Cognitive Mapping is proposed here for reaching such understanding. In this paper we apply the Fuzzy Cognitive Mapping tool to elucidate the objectives and priorities of North Lebanon's coastal productive sectors, and to formalize their coastal zone perceptions and knowledge. We then investigate the potential of Fuzzy Cognitive Mapping as a tool to support coastal zone management. Five round-table discussions were organized: one for the municipalities of the area and one for each of the main coastal productive sectors (tourism, industry, fisheries, agriculture), where the participants drew cognitive maps depicting their views. The analysis of the cognitive maps showed a large number of factors perceived as affecting the current situation of the North Lebanon coastal zone, which were classified into five major categories: governance, infrastructure, environment, intersectoral interactions and sectoral initiatives. Furthermore, common problems, expectations and management objectives for all sectors were exposed. Within this context, Fuzzy Cognitive Mapping proved to be an essential tool for revealing stakeholder knowledge and perceptions and for understanding complex relationships. PMID:22325583
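Computationally, a fuzzy cognitive map is a weighted digraph iterated with a squashing function until concept activations settle. The three concepts and weights below are invented for illustration (not the Lebanese stakeholders' maps), and the update rule shown is the common sigmoid variant.

```python
import math

# Hypothetical 3-concept fuzzy cognitive map: tourism pressure (T),
# pollution (P), fisheries yield (F). W[i][j] is the causal weight of
# concept i on concept j; values are invented.
W = [[0.0, 0.6,  0.0],   # T increases P
     [0.0, 0.0, -0.7],   # P decreases F
     [0.3, 0.0,  0.0]]   # F increases T

def step(state, W):
    """One synchronous FCM update: each activation becomes the sigmoid
    of the weighted sum of its causes."""
    n = len(state)
    return [1.0 / (1.0 + math.exp(-sum(state[i] * W[i][j] for i in range(n))))
            for j in range(n)]

state = [0.8, 0.2, 0.5]   # initial activations (illustrative)
for _ in range(30):        # iterate toward a fixed point
    state = step(state, W)
print([round(x, 2) for x in state])
```

Scenario analysis then consists of clamping a concept (e.g. a management intervention) and observing where the remaining activations converge.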

  7. Decoherence of spin-deformed bosonic model

    SciTech Connect

    Dehdashti, Sh.; Mahdifar, A.; Bagheri Harouni, M.; Roknizadeh, R.

    2013-07-15

The decoherence rate and some parameters affecting it are investigated for the generalized spin-boson model. We consider the spin-boson model when the bosonic environment is modeled by deformed harmonic oscillators. We show that the state of the environment approaches a non-linear coherent state. Then, we obtain the decoherence rate of a two-level system which is in contact with a deformed bosonic environment that is either in thermal equilibrium or in the ground state. By using some recent realizations of f-deformed oscillators, we show that some physical parameters strongly affect the decoherence rate of a two-level system.
Highlights: • Decoherence of the generalized spin-boson model is considered. • In this model the environment consists of f-oscillators. • Via the interaction, the state of the environment approaches non-linear coherent states. • Effective parameters on decoherence are considered.

  8. How Albot0 finds its way home: a novel approach to cognitive mapping using robots.

    PubMed

    Yeap, Wai K

    2011-10-01

Much of what we know about cognitive mapping comes from observing how biological agents behave in their physical environments, and several of these ideas have been implemented on robots, imitating such a process. In this paper a novel approach to cognitive mapping is presented whereby robots are treated as a species of their own and their cognitive mapping is investigated. Such robots are referred to as Albots. The design of the first Albot, Albot0, is presented. Albot0 computes an imprecise map and employs a novel method to find its way home. Both the map and the return-home algorithm exhibit characteristics commonly found in biological agents. What we have learned from Albot0's cognitive mapping is discussed. One major lesson is that the spatiality in a cognitive map affords rich and useful information, which argues against recent suggestions that the notion of a cognitive map is not a useful one. PMID:25164506

  9. Wormholes and Goldstone bosons

    SciTech Connect

    Lee, K.

    1988-07-18

    The quantum theory of a complex scalar field coupled to gravity is considered. A formalism for the semiclassical approach in Euclidean time is developed and used to study wormhole physics. The conserved global charge plays an essential role. Wormhole physics turns on only after the symmetry is spontaneously broken. An effective self-interaction for Goldstone bosons due to wormholes and child universes is shown to be a cosine potential, whose vacuum energy will be reduced by the cosmic expansion. Some implications and questions are discussed.

  10. Eulerian Mapping Closure Approach for Probability Density Function of Concentration in Shear Flows

    NASA Technical Reports Server (NTRS)

    He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Eulerian mapping closure approach is developed for uncertainty propagation in computational fluid mechanics. The approach is used to study the probability density function (PDF) for the concentration of species advected by a random shear flow. An analytical argument shows that fluctuations of the concentration field at one point in space are non-Gaussian and exhibit a stretched-exponential form. An Eulerian mapping approach provides an appropriate approximation to both the convection and diffusion terms and leads to a closed mapping equation. The results obtained describe the evolution of the initial Gaussian field and are in agreement with direct numerical simulations.
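    The central idea of a mapping closure, representing the non-Gaussian scalar as a nonlinear map of a Gaussian reference field, can be illustrated with a minimal sketch. The map below is hypothetical, chosen only to produce heavier-than-Gaussian tails; it is not the closure derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
g = rng.standard_normal(1_000_000)   # samples of a Gaussian reference field

# Hypothetical surrogate map X(g): stretches the tails so the mapped scalar's
# one-point PDF decays more slowly than a Gaussian.
theta = np.sign(g) * np.abs(g) ** 1.5

# Compare tail probabilities P(|.| > t): the mapped field is heavier-tailed.
t = 4.0
p_gauss = np.mean(np.abs(g) > t)
p_mapped = np.mean(np.abs(theta) > t)
```

    In the actual closure the map evolves in time according to the closed mapping equation; this static example only shows how a nonlinear map converts Gaussian statistics into non-Gaussian one-point statistics.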

  11. Partnering with Youth to Map Their Neighborhood Environments: A Multi-Layered GIS Approach

    PubMed Central

    Topmiller, Michael; Jacquez, Farrah; Vissman, Aaron T.; Raleigh, Kevin; Miller-Francis, Jenni

    2014-01-01

    Mapping approaches offer great potential for community-based participatory researchers interested in displaying youth perceptions and advocating for change. We describe a multi-layered approach for gaining local knowledge of neighborhood environments that engages youth as co-researchers and active knowledge producers. By integrating geographic information systems (GIS) with environmental audits, an interactive focus group, and sketch mapping, the approach provides a place-based understanding of physical activity resources from the situated experience of youth. Youth report safety concerns and a lack of recreational resources as factors inhibiting physical activity. Maps reflecting youth perceptions aid policy-makers in making place-based improvements for youth neighborhood environments. PMID:25423245

  12. Raman mapping of oral buccal mucosa: a spectral histopathology approach

    NASA Astrophysics Data System (ADS)

    Behl, Isha; Kukreja, Lekha; Deshmukh, Atul; Singh, S. P.; Mamgain, Hitesh; Hole, Arti R.; Krishna, C. Murali

    2014-12-01

    Oral cancer is one of the most common cancers worldwide. One-fifth of the world's oral cancer subjects are from India and other South Asian countries. The present Raman mapping study was carried out to understand biochemical variations in normal and malignant oral buccal mucosa. Data were acquired using a WITec alpha 300R instrument from 10 normal and 10 tumor unstained tissue sections. Raman maps of normal sections could resolve the layers of epithelium, i.e. basal, intermediate, and superficial. Inflammatory, tumor, and stromal regions are distinctly depicted on Raman maps of tumor sections. Mean and difference spectra of basal and inflammatory cells suggest an abundance of DNA and carotenoid features. Strong cytochrome bands are observed in the intermediate layers of normal sections and the stromal regions of tumor sections. Epithelium and stromal regions of normal cells are classified by principal component analysis. Classification among cellular components of normal and tumor sections is also observed. Thus, the findings of the study further support the applicability of Raman mapping for providing molecular-level insights into normal and malignant conditions.
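    The principal component analysis step can be sketched on synthetic spectra. Everything below is illustrative (band positions, widths, and noise levels are invented, not taken from the study); it only shows how mean-centered PCA separates two spectral classes:

```python
import numpy as np

rng = np.random.default_rng(1)
wavenumbers = np.linspace(600, 1800, 300)  # cm^-1 axis (illustrative)

def spectra(peak_center, n):
    """Synthetic class spectra: one Gaussian band plus noise (hypothetical)."""
    band = np.exp(-((wavenumbers - peak_center) / 30.0) ** 2)
    return band + 0.05 * rng.standard_normal((n, wavenumbers.size))

normal = spectra(1004.0, 20)  # hypothetical 'normal' band position
tumor = spectra(1090.0, 20)   # hypothetical 'tumor' band position

X = np.vstack([normal, tumor])
Xc = X - X.mean(axis=0)                  # mean-center before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # project onto the first two PCs

pc1_normal = scores[:20, 0]              # the two classes fall on opposite
pc1_tumor = scores[20:, 0]               # sides of the first component
```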

  13. Computer-Assisted Argument Mapping: A "Rationale" Approach

    ERIC Educational Resources Information Center

    Davies, W. Martin

    2009-01-01

    Computer-Assisted Argument Mapping (CAAM) is a new way of understanding arguments. While still embryonic in its development and application, CAAM is being used increasingly as a training and development tool in the professions and government. Inroads are also being made in its application within education. CAAM claims to be helpful in an…

  14. Job Seekers' Perceptions of Teleworking: A Cognitive Mapping Approach.

    ERIC Educational Resources Information Center

    Kerrin, Maire; Hone, Kate

    2001-01-01

    College students (n=40) and nonstudent job seekers (n=20) rated four dimensions of telework. Results were plotted in cognitive maps. Students preferred office work to telework, citing lack of social interaction. Nonstudents, slightly older and more likely to be parents, slightly preferred telework. Targeting recruitment to account for these…

  15. Cognitions of Expert Supervisors in Academe: A Concept Mapping Approach

    ERIC Educational Resources Information Center

    Kemer, Gülsah; Borders, L. DiAnne; Willse, John

    2014-01-01

    Eighteen expert supervisors reported their thoughts while preparing for, conducting, and evaluating their supervision sessions. Concept mapping (Kane & Trochim, [Kane, M., 2007]) yielded 195 cognitions classified into 25 cognitive categories organized into 5 supervision areas: conceptualization of supervision, supervisee assessment,…

  16. Mapping Sustainability Initiatives across a Region: An Innovative Survey Approach

    ERIC Educational Resources Information Center

    Somerville, Margaret; Green, Monica

    2012-01-01

    The project of mapping sustainability initiatives across a region is part of a larger program of research about place and sustainability education for the Anthropocene, the new geological age of human-induced planetary changes (Zalasiewicz, Williams, Steffen, & Crutzen, 2010). The study investigated the location, nature and type of sustainability…

  17. The Facebook influence model: a concept mapping approach.

    PubMed

    Moreno, Megan A; Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M

    2013-07-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  18. Trans-ethnic study design approaches for fine-mapping.

    PubMed

    Asimit, Jennifer L; Hatzikotoulas, Konstantinos; McCarthy, Mark; Morris, Andrew P; Zeggini, Eleftheria

    2016-08-01

    Studies that traverse ancestrally diverse populations may increase power to detect novel loci and improve fine-mapping resolution of causal variants by leveraging linkage disequilibrium differences between ethnic groups. The inclusion of African ancestry samples may yield further improvements because of low linkage disequilibrium and high genetic heterogeneity. We investigate the fine-mapping resolution of trans-ethnic fixed-effects meta-analysis for five type II diabetes loci, under various settings of ancestral composition (European, East Asian, African), allelic heterogeneity, and causal variant minor allele frequency. In particular, three settings of ancestral composition were compared: (1) single ancestry (European), (2) moderate ancestral diversity (European and East Asian), and (3) high ancestral diversity (European, East Asian, and African). Our simulations suggest that the European/Asian and European ancestry-only meta-analyses consistently attain similar fine-mapping resolution. The inclusion of African ancestry samples in the meta-analysis leads to a marked improvement in fine-mapping resolution. PMID:26839038
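    The fixed-effects meta-analysis underlying these simulations combines per-ancestry association estimates by inverse-variance weighting. A minimal sketch (the effect sizes and standard errors below are illustrative, not values from the study):

```python
import numpy as np

def fixed_effects_meta(betas, ses):
    """Inverse-variance-weighted fixed-effects meta-analysis.
    Returns the combined effect estimate and its standard error."""
    betas = np.asarray(betas, dtype=float)
    w = 1.0 / np.asarray(ses, dtype=float) ** 2   # weights = 1 / SE^2
    beta = np.sum(w * betas) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return beta, se

# Hypothetical per-ancestry results at one variant
betas = [0.12, 0.10, 0.15]   # e.g. European, East Asian, African (assumed)
ses = [0.03, 0.04, 0.05]
beta, se = fixed_effects_meta(betas, ses)
```

    The combined standard error is always smaller than the smallest per-study standard error, which is why adding diverse samples tightens the estimate.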

  19. A New Approach for Constructing the Concept Map

    ERIC Educational Resources Information Center

    Tseng, Shian-Shyong; Sue, Pei-Chi; Su, Jun-Ming; Weng, Jui-Feng; Tsai, Wen-Nung

    2007-01-01

    In recent years, e-learning system has become more and more popular and many adaptive learning environments have been proposed to offer learners customized courses in accordance with their aptitudes and learning results. For achieving the adaptive learning, a predefined concept map of a course is often used to provide adaptive learning guidance…

  20. The Facebook Influence Model: A Concept Mapping Approach

    PubMed Central

    Kota, Rajitha; Schoohs, Shari; Whitehill, Jennifer M.

    2013-01-01

    Facebook is a popular social media Web site that has been hypothesized to exert potential influence over users' attitudes, intentions, or behaviors. The purpose of this study was to develop a conceptual framework to explain influential aspects of Facebook. This mixed methods study applied concept mapping methodology, a validated five-step method to visually represent complex topics. The five steps comprise preparation, brainstorming, sort and rank, analysis, and interpretation. College student participants were identified using purposeful sampling. The 80 participants had a mean age of 20.5 years, and included 36% males. A total of 169 statements were generated during brainstorming, and sorted into between 6 and 22 groups. The final concept map included 13 clusters. Interpretation data led to grouping of clusters into four final domains, including connection, comparison, identification, and Facebook as an experience. The Facebook Influence Concept Map illustrates key constructs that contribute to influence, incorporating perspectives of older adolescent Facebook users. While Facebook provides a novel lens through which to consider behavioral influence, it can best be considered in the context of existing behavioral theory. The concept map may be used toward development of potential future intervention efforts. PMID:23621717

  1. Higgs boson at LHC: a diffractive opportunity

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2009-03-23

    An alternative process is presented for diffractive Higgs boson production in peripheral pp collisions, where the particles interact through the Double Pomeron Exchange. The event rate is computed as a central-rapidity distribution for Tevatron and LHC energies leading to a result around 0.6 pb, higher than the predictions from previous approaches. Therefore, this result arises as an enhanced signal for the detection of the Higgs boson in hadron colliders. The predictions for the Higgs boson photoproduction are compared to the ones obtained from a similar approach proposed by the Durham group, enabling an analysis of the future developments of its application to pp and AA collisions.

  2. Topographic Mapping of Mars: Approaching the Human Scale

    NASA Astrophysics Data System (ADS)

    Kirk, R. L.; Howington-Kraus, E.; Soderblom, L. A.; Archinal, B. A.

    2002-12-01

    In only three decades, topographic mapping of Mars has progressed from the planetary to the personal scale. The first crude contour maps of the early 1970s, based on Earth-based radar and atmospheric occultation and sounding data, revealed such continental-scale features as the Tharsis bulge. Stereoanalysis of Mariner 9 and Viking Orbiter images filled in some of the details, yielding by the late 1980s a global digital elevation model (DEM) interpolated from 1-km contours and containing systematic errors of many km. This DEM was superseded in the 1990s by data from the Mars Orbiter Laser Altimeter (MOLA), with an accuracy <10 m vertically and ~ 100 m horizontally. MOLA has provided the definitive global map of Mars for the foreseeable future; its most significant weakness is its sample spacing (300 m along-track, with many gaps >1 km and a few up to 10 km between orbit tracks). Stereoanalysis of images from the narrow-angle Mars Orbiter Camera (MOC-NA) can be used to produce local DEMs with a vertical precision similar to MOLA (e.g., ~ 3 m for 3 m/pixel images with ~ 10° convergence), horizontal resolution of 3 pixels (~ 10 m for 3 m images), and control to MOLA for absolute accuracy comparable to the latter. Over 150 MOC-NA stereopairs have been identified, and more continue to be obtained. We will describe our use of the USGS cartographic system ISIS with commercial photogrammetric software SOCET SET (© BAE Systems) to produce DEMs from such pairs. This and similar work by other groups brings topographic mapping close to the scale of features seen from the ground and processes active at the present day. We are also using high-resolution stereo DEMs (and, in some cases, altimetry) as the starting point for calibration of photoclinometry, which yields DEMs with a horizontal resolution of one pixel and a local vertical precision of a small fraction of a pixel. The techniques we describe are directly applicable to other Mars imagers both present (THEMIS) and

  3. The Higgs Boson.

    ERIC Educational Resources Information Center

    Veltman, Martinus J. G.

    1986-01-01

    Reports recent findings related to the particle Higgs boson and examines its possible contribution to the standard mode of elementary processes. Critically explores the strengths and uncertainties of the Higgs boson and proposed Higgs field. (ML)

  4. Concept Map Engineering: Methods and Tools Based on the Semantic Relation Approach

    ERIC Educational Resources Information Center

    Kim, Minkyu

    2013-01-01

    The purpose of this study is to develop a better understanding of technologies that use natural language as the basis for concept map construction. In particular, this study focuses on the semantic relation (SR) approach to drawing rich and authentic concept maps that reflect students' internal representations of a problem situation. The…

  5. A Time Sequence-Oriented Concept Map Approach to Developing Educational Computer Games for History Courses

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Yang, Kai-Hsiang; Chen, Jing-Hong

    2015-01-01

    Concept maps have been recognized as an effective tool for students to organize their knowledge; however, in history courses, it is important for students to learn and organize historical events according to the time of their occurrence. Therefore, in this study, a time sequence-oriented concept map approach is proposed for developing a game-based…

  6. Using a Linkage Mapping Approach to Identify QTL for Day-Neutrality in the Octoploid Strawberry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A linkage mapping approach was used to identify quantitative trait loci (QTL) associated with day-neutrality in the commercial strawberry, Fragaria ×ananassa (Duch ex Rozier). Amplified Fragment Length Polymorphic (AFLP) markers were used to build a genetic map with a population of 127 lines develo...

  7. Transfer map approach to the beam-beam interaction

    NASA Astrophysics Data System (ADS)

    Dragt, Alex J.

    1980-01-01

    A study is made of a model for the beam-beam interaction in ISABELLE using numerical methods and the recently developed method of Transfer Maps. It is found that analytical transfer map calculations account qualitatively for all the features of the model observed numerically, and show promise of giving quantitative agreement as well. They may also provide a kind of ''magnifying glass'' for examining numerical results in fine detail to ascertain the presence of small-scale stochastic motion that might lead to eventual particle loss. Preliminary evidence is presented to the effect that, within the model employed, the beam-beam interaction at its contemplated strengths should not lead to particle loss in ISABELLE.
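    A transfer-map treatment of this kind composes a linear one-turn betatron rotation with a thin-lens beam-beam kick. A minimal 1-D sketch (the tune, strength, and round-Gaussian kick form are illustrative assumptions, not ISABELLE parameters):

```python
import math

MU = 2 * math.pi * 0.31   # betatron phase advance per turn (assumed tune 0.31)
XI = 0.005                # beam-beam strength parameter (assumed)

def beam_beam_kick(x):
    """1-D round-Gaussian beam-beam kick; the limit at x = 0 is 0."""
    if x == 0.0:
        return 0.0
    return 4 * math.pi * XI * (1 - math.exp(-x * x / 2)) / x

def one_turn(x, p):
    c, s = math.cos(MU), math.sin(MU)
    x, p = c * x + s * p, -s * x + c * p   # linear rotation in phase space
    return x, p - beam_beam_kick(x)        # thin-lens nonlinear kick

# Track one particle and record its phase-space amplitude each turn
x, p = 0.5, 0.0
amps = []
for _ in range(2000):
    x, p = one_turn(x, p)
    amps.append(math.hypot(x, p))
```

    For a weak kick at a non-resonant tune the amplitude stays bounded, which is the qualitative behavior the transfer-map analysis is meant to predict and the numerics to confirm.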

  8. A Knowledge Intensive Approach to Mapping Clinical Narrative to LOINC

    PubMed Central

    Fiszman, Marcelo; Shin, Dongwook; Sneiderman, Charles A.; Jin, Honglan; Rindflesch, Thomas C.

    2010-01-01

    Many natural language processing systems are being applied to clinical text, yet clinically useful results are obtained only by honing a system to a particular context. We suggest that concentration on the information needed for this processing is crucial and present a knowledge intensive methodology for mapping clinical text to LOINC. The system takes published case reports as input and maps vital signs and body measurements and reports of diagnostic procedures to fully specified LOINC codes. Three kinds of knowledge are exploited: textual, ontological, and pragmatic (including information about physiology and the clinical process). Evaluation on 4809 sentences yielded precision of 89% and recall of 93% (F-score 0.91). Our method could form the basis for a system to provide semi-automated help to human coders. PMID:21346974

  9. About measurements of Higgs boson parity

    NASA Astrophysics Data System (ADS)

    Ginzburg, I. F.

    2016-02-01

    Recently CMS and ATLAS announced that they had measured the Higgs boson parity. In this note we show that their approach can determine this parity only under the additional assumption that an extension of Standard Model of some special type is realized in Nature. We show that the used approach gives no information about the Higgs boson parity when assuming most other extensions of the Standard Model.

  10. Comparison of Sub-pixel Classification Approaches for Crop-specific Mapping

    EPA Science Inventory

    The Moderate Resolution Imaging Spectroradiometer (MODIS) data has been increasingly used for crop mapping and other agricultural applications. Phenology-based classification approaches using the NDVI (Normalized Difference Vegetation Index) 16-day composite (250 m) data product...

  11. An approach to reduce mapping errors in the production of landslide inventory maps

    NASA Astrophysics Data System (ADS)

    Santangelo, M.; Marchesini, I.; Bucci, F.; Cardinali, M.; Fiorucci, F.; Guzzetti, F.

    2015-07-01

    Landslide inventory maps (LIMs) show where landslides have occurred in an area, and provide information useful to different types of landslide studies, including susceptibility and hazard modelling and validation, risk assessment, erosion analyses, and the evaluation of relationships between landslides and geological settings. Despite recent technological advancements, visual interpretation of aerial photographs (API) remains the most common method to prepare LIMs. In this work, we present a new semi-automatic procedure that exploits GIS technology for the digitization of landslide data obtained through API. To test the procedure, and to compare it to a consolidated landslide mapping method, we prepared two LIMs starting from the same set of landslide API data, which were digitized (a) manually, adopting a consolidated visual transfer method, and (b) adopting our new semi-automatic procedure. Results indicate that the new semi-automatic procedure is more efficient and results in a more accurate LIM. With the new procedure, the landslide positional error decreases with increasing landslide size, following a power law. We expect that our work will help adopt standards for transferring landslide information from the aerial photographs to a digital landslide map, contributing to the production of accurate landslide maps.

  12. An approach to reduce mapping errors in the production of landslide inventory maps

    NASA Astrophysics Data System (ADS)

    Santangelo, M.; Marchesini, I.; Bucci, F.; Cardinali, M.; Fiorucci, F.; Guzzetti, F.

    2015-09-01

    Landslide inventory maps (LIMs) show where landslides have occurred in an area, and provide information useful to different types of landslide studies, including susceptibility and hazard modelling and validation, risk assessment, erosion analyses, and to evaluate relationships between landslides and geological settings. Despite recent technological advancements, visual interpretation of aerial photographs (API) remains the most common method to prepare LIMs. In this work, we present a new semi-automatic procedure that makes use of GIS technology for the digitization of landslide data obtained through API. To test the procedure, and to compare it to a consolidated landslide mapping method, we prepared two LIMs starting from the same set of landslide API data, which were digitized (a) manually adopting a consolidated visual transfer method, and (b) adopting our new semi-automatic procedure. Results indicate that the new semi-automatic procedure (a) increases the interpreter's overall efficiency by a factor of 2, (b) reduces significantly the subjectivity introduced by the visual (manual) transfer of the landslide information to the digital database, resulting in more accurate LIMs. With the new procedure, the landslide positional error decreases with increasing landslide size, following a power-law. We expect that our work will help adopt standards for transferring landslide information from the aerial photographs to a digital landslide map, contributing to the production of accurate landslide maps.
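    A power-law relation of the kind reported here, positional error decreasing with landslide size, can be fitted by least squares in log-log coordinates. A sketch on hypothetical (size, error) pairs, not data from the study:

```python
import numpy as np

# Hypothetical landslide areas (m^2) and positional errors (m)
size = np.array([100.0, 500.0, 1_000.0, 5_000.0, 20_000.0])
error = np.array([8.0, 5.1, 4.0, 2.6, 1.7])

# Fit error = a * size**b  <=>  log(error) = b*log(size) + log(a)
b, log_a = np.polyfit(np.log(size), np.log(error), 1)
a = np.exp(log_a)
```

    A negative exponent b confirms that the error shrinks as landslide size grows; the fitted a and b summarize the relation in two numbers.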

  13. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food and the growing threat of food insecurity, which is rapidly engulfing poor and under-privileged sections of the global population. Therefore, it is of paramount importance to be able to accurately estimate crop area and spatial distribution. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on a comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998 to December 2008, whilst the landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters to successfully generalize hyper-temporal datasets. Clusters were then characterized with crop cycle information and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A landscape-ecological map was generated using a combination of digitized homogeneous map units in the Landsat-7 ETM+ imagery, a land use map (2005) of the Mekong delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use areas estimated from the maps and those reported in district crop statistics.
The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two rice crop and three

  14. Ray mapping approach for the efficient design of continuous freeform surfaces.

    PubMed

    Bösel, Christoph; Gross, Herbert

    2016-06-27

    The efficient design of continuous freeform surfaces, which map a given light source to an arbitrary target illumination pattern, remains a challenging problem and is considered here for collimated input beams. A common approach is the use of ray-mapping methods, where first a ray mapping between the source and the irradiance distribution on the target plane is calculated, and in a subsequent step the surface is constructed. The challenging aspect of this approach is to find an integrable mapping ensuring a continuous surface. Based on the law of reflection/refraction and an integrability condition, we derive a general condition for the surface and ray mapping for a collimated input beam. It is shown that in a small-angle approximation a proper mapping can be calculated via optimal mass transport - a mathematical framework for the calculation of a mapping between two positive density functions. We show that the surface can be constructed by solving a linear advection equation with appropriate boundary conditions. The results imply that the optimal mass transport mapping is approximately integrable over a wide range of distances between the freeform surface and the target plane, and offer an efficient way to construct the surface by solving standard integrals. The efficiency is demonstrated by applying it to two challenging design examples, which shows the ability of the presented approach to handle target illumination patterns with steep irradiance gradients and numerous gray levels. PMID:27410583
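    In one dimension, the optimal mass transport map between two positive densities reduces to matching cumulative distribution functions, T = G^{-1} o F. A sketch on a grid (the source and target densities below are illustrative, not an actual beam/irradiance pair from the paper):

```python
import numpy as np

x = np.linspace(-3, 3, 601)
source = np.exp(-x**2 / 2)              # Gaussian-like source intensity
target = 1.0 + 0.5 * np.tanh(2 * x)     # asymmetric target irradiance (positive)

def cdf(density):
    """Discrete CDF of a sampled positive density, normalized to 1."""
    c = np.cumsum(density)
    return c / c[-1]

F, G = cdf(source), cdf(target)

# T(x) = G^{-1}(F(x)), evaluated by interpolating the inverse target CDF.
# Monotonicity of T is what makes the 1-D transport map integrable.
T = np.interp(F, G, x)
```

    In two dimensions the mapping is no longer a simple CDF match, which is where the integrability condition and the advection-equation construction of the paper come in; this sketch only conveys the density-matching idea.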

  15. Approaches to digital snow mapping with LANDSAT-1 data

    NASA Technical Reports Server (NTRS)

    Itten, K. I.

    1975-01-01

    Applying the same LANDSAT-1 data to three substantially different image processing systems, a snow mapping task was performed. LARSYS Ver.3, STANSORT-2, and General Electric Image-100 all performed the tasks of detecting the snowline in forested mountainous terrain and determining the snow-covered area. While the control and accuracy achieved with LARSYS are remarkable, the time and effort required for processing favor the STANSORT and Image-100 systems. The experiences and results demonstrate the need for a fast interactive system for operational snow mapping with multispectral satellite data.

  16. Aerial Terrain Mapping Using Unmanned Aerial Vehicle Approach

    NASA Astrophysics Data System (ADS)

    Tahar, K. N.

    2012-08-01

    This paper looks into the latest achievements in low-cost Unmanned Aerial Vehicle (UAV) technology and its capacity to map semi-developed areas. The objectives of this study are to establish a new methodology or algorithm for image registration during the interior orientation process and to determine the accuracy of photogrammetric products derived from UAV images. Recently, UAV technology has been used in several applications such as mapping, agriculture and surveillance. The aim of this study is to scrutinize the usage of UAVs to map semi-developed areas. The performance of the low-cost UAV mapping study was established on a study area with two image processing methods so that the results could be compared. A non-metric camera was attached at the bottom of the UAV and was used to capture images at both sites after going through several calibration steps. Calibration was carried out to determine focal length, principal distance, radial lens distortion, tangential lens distortion and affinity. A new method of image registration for a non-metric camera is discussed in this paper as part of the new methodology of this study. This method uses the UAV's onboard Global Positioning System (GPS) to register the UAV image for the interior orientation process. Check points were established randomly at both sites using rapid-static GPS. Ground control points are used for the exterior orientation process, and check points are used for accuracy assessment of the photogrammetric products. All acquired images were processed in photogrammetric software. Two methods of image registration were applied in this study, namely GPS onboard registration and ground control point registration. Both registrations were processed using photogrammetric software and the results are discussed. Two products were produced in this study: the digital orthophoto and the digital terrain model. These results were analyzed by using the root mean square

  17. Stationkeeping Approach for the Microwave Anisotropy Probe (MAP)

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, Dave; Schiff, Conrad

    2002-01-01

    The Microwave Anisotropy Probe was successfully launched on June 30, 2001 and placed into a Lissajous orbit about the L2 Sun-Earth-Moon libration point. However, the L2 libration point is unstable, which necessitates occasional stationkeeping maneuvers in order to maintain the spacecraft's Lissajous orbit. Analyses were performed in order to develop a feasible L2 stationkeeping strategy for the MAP mission. The resulting strategy meets the allotted fuel budget, allowing for enough fuel to handle additional fuel taxes, while meeting the attitude requirements for the maneuvers. Results from the first two stationkeeping maneuvers are included.

  18. An integrated approach for automated cover-type mapping of large inaccessible areas in Alaska

    USGS Publications Warehouse

    Fleming, Michael D.

    1988-01-01

    The lack of any detailed cover type maps in the state necessitated that a rapid and accurate approach be employed to develop maps for 329 million acres of Alaska within a seven-year period. This goal has been addressed by using an integrated approach to computer-aided analysis which combines efficient use of field data with the only consistent statewide spatial data sets available: Landsat multispectral scanner data, digital elevation data derived from 1:250 000-scale maps, and 1:60 000-scale color-infrared aerial photographs.

  19. High-resolution habitat mapping on mud fields: new approach to quantitative mapping of Ocean quahog.

    PubMed

    Isachenko, Artem; Gubanova, Yana; Tzetlin, Alexander; Mokievsky, Vadim

    2014-12-01

    During 2009-2012, stocks of the bivalve Arctica islandica (Linnaeus, 1767) (Ocean quahog) in Kandalaksha Bay (the White Sea) were assessed using a side-scan sonar, grab sampling and underwater photo imaging. Structurally uniform localities were delineated on the basis of the side-scan signal. Each type of signal reflects a combination of sediment type, microtopography and structural characteristics of the benthic community. The distribution of A. islandica was the predominant factor in determining community structure. The seabed attributes considered most significant were defined for each substrate type. Relations between sonar signal and sediment type were used for landscape mapping based on sonar data. Community characteristics at known localities were reliably interpolated to the area of survey using statistical processing of geophysical data. A method of integrated sonar and sampling data interpretation for high-resolution mapping of A. islandica by biomass groups, benthic faunal groups and associated habitats was developed. PMID:24954748

  20. A FISH approach for mapping the human genome using Bacterial Artificial Chromosomes (BACs)

    SciTech Connect

    Hubert, R.S.; Chen, X.N.; Mitchell, S.

    1994-09-01

    As the Human Genome Project progresses, large insert cloning vectors such as BACs, P1, and P1 Artificial Chromosomes (PACs) will be required to complement the YAC mapping efforts. The value of the BAC vector for physical mapping lies in the stability of the inserts, the lack of chimerism, the length of inserts (up to 300 kb), the ability to obtain large amounts of pure clone DNA and the ease of BAC manipulation. These features helped us design two approaches for generating physical mapping reagents for human genetic studies. The first approach is a whole genome strategy in which randomly selected BACs are mapped, using FISH, to specific chromosomal bands. To date, 700 BACs have been mapped to single chromosome bands at a resolution of 2-5 Mb in addition to BACs mapped to 14 different centromeres. These BACs represent more than 90 Mb of the genome and include >70% of all human chromosome bands at the 350-band level. These data revealed that >97% of the BACs were non-chimeric and have a genomic distribution covering most gaps in the existing YAC map with excellent coverage of gene-rich regions. In the second approach, we used YACs to identify BACs on chromosome 21. A 1.5 Mb contig between D21S339 and D21S220 nears completion within the Down syndrome congenital heart disease (DS-CHD) region. Seventeen BACs ranging in size from 80 kb to 240 kb were ordered using 14 STSs with FISH confirmation. We have also used 40 YACs spanning 21q to identify, on average, >1 BAC/Mb to provide molecular cytogenetic reagents and anchor points for further mapping. The contig generated on chromosome 21 will be helpful in isolating the genes for DS-CHD. The physical mapping reagents generated using the whole genome approach will provide cytogenetic markers and mapped genomic fragments that will facilitate positional cloning efforts and the identification of genes within most chromosomal bands.

  1. Complementarity between nonstandard Higgs boson searches and precision Higgs boson measurements in the MSSM

    SciTech Connect

    Carena, Marcela; Haber, Howard E.; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E. M.

    2015-02-03

Precision measurements of the Higgs boson properties at the LHC provide relevant constraints on possible weak-scale extensions of the Standard Model (SM). In the context of the minimal supersymmetric Standard Model (MSSM) these constraints seem to suggest that all the additional, non-SM-like Higgs bosons should be heavy, with masses larger than about 400 GeV. This article shows that such results do not hold when the theory approaches the conditions for “alignment independent of decoupling,” where the lightest CP-even Higgs boson has SM-like tree-level couplings to fermions and gauge bosons, independently of the nonstandard Higgs boson masses. In addition, the combination of current bounds from direct Higgs boson searches at the LHC, along with the alignment conditions, has a significant impact on the allowed MSSM parameter space yielding light additional Higgs bosons. In particular, after ensuring the correct mass for the lightest CP-even Higgs boson, we find that precision measurements and direct searches are complementary and may soon be able to probe the region of non-SM-like Higgs bosons with masses below the top quark pair mass threshold of 350 GeV and low to moderate values of tanβ.

  3. Bicomponent Trend Maps: A Multivariate Approach to Visualizing Geographic Time Series

    PubMed Central

    Schroeder, Jonathan P.

    2012-01-01

    The most straightforward approaches to temporal mapping cannot effectively illustrate all potentially significant aspects of spatio-temporal patterns across many regions and times. This paper introduces an alternative approach, bicomponent trend mapping, which employs a combination of principal component analysis and bivariate choropleth mapping to illustrate two distinct dimensions of long-term trend variations. The approach also employs a bicomponent trend matrix, a graphic that illustrates an array of typical trend types corresponding to different combinations of scores on two principal components. This matrix is useful not only as a legend for bicomponent trend maps but also as a general means of visualizing principal components. To demonstrate and assess the new approach, the paper focuses on the task of illustrating population trends from 1950 to 2000 in census tracts throughout major U.S. urban cores. In a single static display, bicomponent trend mapping is not able to depict as wide a variety of trend properties as some other multivariate mapping approaches, but it can make relationships among trend classes easier to interpret, and it offers some unique flexibility in classification that could be particularly useful in an interactive data exploration environment. PMID:23504193

  4. Engineering a robotic approach to mapping exposed volcanic fissures

    NASA Astrophysics Data System (ADS)

    Parcheta, C. E.; Parness, A.; Mitchell, K. L.

    2014-12-01

Field geology provides a framework for advanced computer models and theoretical calculations of volcanic systems. Some field terrains, though, are poorly preserved or accessible, making documentation, quantification, and investigation impossible. Over 200 volcanologists at the 2012 Kona Chapman Conference on volcanology agreed that an important step forward in the field over the next 100 years should address the realistic size and shape of volcanic conduits. The 1969 Mauna Ulu eruption of Kīlauea provides a unique opportunity to document volcanic fissure conduits; thus, we have an ideal location to begin addressing this topic and provide data on these geometries. Exposed fissures can be mapped with robotics using machine vision. In order to test the hypothesis that fissures have irregularities with depth that will influence their fluid dynamical behavior, we must first map the fissure vents and shallow conduit to deci- or centimeter scale. We have designed, constructed, and field-tested the first version of a robotic device that will image an exposed volcanic fissure in three dimensions. The design phase included three steps: 1) create the payload harness and protective shell to prevent damage to the electronics and robot, 2) construct a circuit board so the electronics can communicate with a surface-based computer, and 3) prototype wheel shapes that can handle a variety of volcanic rock textures. The robot's mechanical parts were built using 3D printing, milling, casting and laser cutting techniques, and the electronics were assembled from off-the-shelf components. The testing phase took place at Mauna Ulu, Kīlauea, Hawai'i, from May 5 - 9, 2014. Many valuable design lessons were learned during the week, and the first-ever 3D map from inside a volcanic fissure was successfully collected. Three vents had between 25% and 95% of their internal surfaces imaged. A fourth location, a non-eruptive crack (possibly a fault line) had two transects imaging the textures

  5. Algorithms for SU(n) boson realizations and D-functions

    NASA Astrophysics Data System (ADS)

    Dhand, Ish; Sanders, Barry C.; de Guise, Hubert

    2015-11-01

Boson realizations map operators and states of groups to transformations and states of bosonic systems. We devise a graph-theoretic algorithm to construct the boson realizations of the canonical SU(n) basis states, which reduce the canonical subgroup chain, for arbitrary n. The boson realizations are employed to construct D-functions, which are the matrix elements of arbitrary irreducible representations of SU(n) in the canonical basis. We demonstrate that our D-function algorithm offers a significant advantage over the two competing procedures, namely, factorization and exponentiation.

  6. Endoscopic fluorescence mapping of the left atrium: A novel experimental approach for high resolution endocardial mapping in the intact heart

    PubMed Central

    Kalifa, Jérôme; Klos, Matthew; Zlochiver, Sharon; Mironov, Sergey; Tanaka, Kazuhiko; Ulahannan, Netha; Yamazaki, Masatoshi; Jalife, José; Berenfeld, Omer

    2007-01-01

    Background Despite availability of several mapping technologies to investigate the electrophysiological mechanisms of atrial fibrillation (AF), an experimental tool enabling high resolution mapping of electrical impulse on the endocardial surface of the left atrium is still lacking. Objective To present a new optical mapping approach implementing a steerable cardio-endoscope in isolated hearts. Methods The system consists of a direct or side-view endoscope coupled to a 532 nm excitation Laser for illumination, and to a CCD camera for imaging of potentiometric dye fluorescence (DI-4-ANEPPS, 80×80 pixels, 200–800 frames/sec). The cardio-endoscope was aimed successively at diverse posterior left atrial (PLA) locations to obtain high resolution movies of electrical wave propagation, as well as detailed endocardial anatomical features, in the presence and the absence of atrial stretch. Results We present several examples of high resolution endoscopic PLA recordings of wave propagation patterns during both sinus rhythm and AF with signal-to-noise ratio similar to conventional optical mapping systems. We demonstrate the endoscope’s ability to visualize highly organized AF sources (rotors) at specific locations on the PLA and PLA-pulmonary vein junctions, and present video images of waves emanating from such sources as they propagate into pectinate muscles in the LA appendage. In particular, we demonstrate this approach to be ideally suited for studying the effects of atrial stretch on AF dynamics. Conclusions In isolated hearts, cardio-endoscopic optical mapping of electrical activity should enable comprehensive evaluation of atrial fibrillatory activity in the PLA, of the role of the local anatomy on AF dynamics and of the efficacy of pharmacological and ablative interventions. PMID:17599678

  7. Slave boson theories of correlated electron systems

    SciTech Connect

    Woelfle, P.

    1995-05-01

    Slave boson theories of various models of correlated fermions are critically reviewed and several new results are presented. In the example of the Anderson impurity model the limitations of slave boson mean field theory are discussed. Self-consistent conserving approximations are compared with results obtained from the numerical renormalization group. The gauge field theory of the t-J-model is considered in the quasistatic approximation. It is shown that weak localization effects can give valuable information on the existence of gauge fields. Applications of the slave-boson approach due to Kotliar and Ruckenstein to the Hubbard model are also discussed.

  8. Conjecture Mapping: An Approach to Systematic Educational Design Research

    ERIC Educational Resources Information Center

    Sandoval, William

    2014-01-01

    Design research is strongly associated with the learning sciences community, and in the 2 decades since its conception it has become broadly accepted. Yet within and without the learning sciences there remains confusion about how to do design research, with most scholarship on the approach describing what it is rather than how to do it. This…

  9. Mind Map Marketing: A Creative Approach in Developing Marketing Skills

    ERIC Educational Resources Information Center

    Eriksson, Lars Torsten; Hauer, Amie M.

    2004-01-01

    In this conceptual article, the authors describe an alternative course structure that joins learning key marketing concepts to creative problem solving. The authors describe an approach using a convergent-divergent-convergent (CDC) process: key concepts are first derived from case material to be organized in a marketing matrix, which is then used…

  10. Constructivist-Visual Mind Map Teaching Approach and the Quality of Students' Cognitive Structures

    ERIC Educational Resources Information Center

    Dhindsa, Harkirat S.; Makarimi-Kasim; Anderson, O. Roger

    2011-01-01

    This study compared the effects of a constructivist-visual mind map teaching approach (CMA) and of a traditional teaching approach (TTA) on (a) the quality and richness of students' knowledge structures and (b) TTA and CMA students' perceptions of the extent that a constructivist learning environment (CLE) was created in their classes. The sample…

  11. Mapping.

    ERIC Educational Resources Information Center

    Kinney, Douglas M.; McIntosh, Willard L.

    1979-01-01

    The area of geological mapping in the United States in 1978 increased greatly over that reported in 1977; state geological maps were added for California, Idaho, Nevada, and Alaska last year. (Author/BB)

  12. Thermofield-based chain-mapping approach for open quantum systems

    NASA Astrophysics Data System (ADS)

    de Vega, Inés; Bañuls, Mari-Carmen

    2015-11-01

    We consider a thermofield approach to analyze the evolution of an open quantum system coupled to an environment at finite temperature. In this approach, the finite-temperature environment is exactly mapped onto two virtual environments at zero temperature. These two environments are then unitarily transformed into two different chains of oscillators, leading to a one-dimensional structure that can be numerically studied using tensor network techniques. Compared to previous approaches using a single chain mapping, our strategy offers the advantage of an exact description of the initial state at arbitrary temperatures, which results in a gain in computational efficiency and a reduced truncation error.
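
The chain construction described above can be illustrated numerically. The sketch below is an illustration, not the authors' code: it tridiagonalizes a toy discretized bath Hamiltonian with Lanczos iterations, started from the normalized system-bath coupling vector, so that the diagonal and off-diagonal coefficients become the chain's on-site frequencies and nearest-neighbor hoppings. The bath discretization and ohmic-like couplings are assumptions made for the example.

```python
import numpy as np

def chain_map(omega, g, n_chain):
    """Map a discretized bath (mode frequencies omega, system couplings g)
    onto a 1D chain by Lanczos tridiagonalization of the diagonal bath
    Hamiltonian, starting from the normalized coupling vector."""
    H = np.diag(omega)
    v = g / np.linalg.norm(g)            # first chain site
    V = [v]
    alphas, betas = [], []
    w = H @ v
    a = v @ w
    alphas.append(a)
    w = w - a * v
    for _ in range(n_chain - 1):
        b = np.linalg.norm(w)
        betas.append(b)
        v_new = w / b
        # full reorthogonalization for numerical stability
        for u in V:
            v_new -= (u @ v_new) * u
        v_new /= np.linalg.norm(v_new)
        V.append(v_new)
        w = H @ v_new
        a = v_new @ w
        alphas.append(a)
        w = w - a * v_new - b * V[-2]
    return np.array(alphas), np.array(betas)

# toy bath: 200 modes with linear dispersion and ohmic-like couplings
N = 200
omega = np.linspace(0.01, 1.0, N)
g = np.sqrt(omega)
freqs, hops = chain_map(omega, g, 20)    # 20-site chain coefficients
```

The chain frequencies are Rayleigh quotients of the bath Hamiltonian, so they stay inside the bath's spectral range; in the thermofield setting this construction would be applied once per virtual zero-temperature environment.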

  13. Three-site Bose-Hubbard model subject to atom losses: Boson-pair dissipation channel and failure of the mean-field approach

    SciTech Connect

    Shchesnovich, V. S.; Mogilevtsev, D. S.

    2010-10-15

We employ the perturbation series expansion for derivation of the reduced master equations for the three-site Bose-Hubbard model subject to strong atom losses from the central site. The model describes a condensate trapped in a triple-well potential subject to externally controlled removal of atoms. We find that the π-phase state of the coherent superposition between the side wells decays via two dissipation channels, the single-boson channel (similar to the externally applied dissipation) and the boson-pair channel. The quantum derivation is compared to the classical adiabatic elimination within the mean-field approximation. We find that the boson-pair dissipation channel is not captured by the mean-field model, whereas the single-boson channel is described by it. Moreover, there is a matching condition between the zero-point energy bias of the side wells and the nonlinear interaction parameter which separates the regions where either the single-boson or the boson-pair dissipation channel dominates. Our results indicate that the M-site Bose-Hubbard models, for M>2, subject to atom losses may require an analysis which goes beyond the usual mean-field approximation for correct description of their dissipative features. This is an important result in view of the recent experimental works on the single-site addressability of condensates trapped in optical lattices.
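
The mean-field side of the comparison can be reproduced in a toy calculation. The sketch below uses illustrative parameters (J, U, γ chosen arbitrarily; zero-point energy bias set to zero), not those of the paper: it integrates the discrete Gross-Pitaevskii equations for three wells with a non-Hermitian loss term on the central site. The antisymmetric π-phase state leaves the central site unpopulated and so does not decay at this level of description, consistent with the statement that the boson-pair channel is invisible to the mean field, while a symmetric state decays through the single-boson channel.

```python
import numpy as np

def rhs(psi, J=1.0, U=0.5, gamma=2.0):
    """Mean-field (discrete Gross-Pitaevskii) equations for a triple well
    with hopping J, interaction U and loss rate gamma on the central site."""
    p1, p2, p3 = psi
    d1 = -1j * (U * abs(p1)**2 * p1 - J * p2)
    d2 = -1j * (U * abs(p2)**2 * p2 - J * (p1 + p3)) - 0.5 * gamma * p2
    d3 = -1j * (U * abs(p3)**2 * p3 - J * p2)
    return np.array([d1, d2, d3])

def evolve(psi0, t=5.0, dt=1e-3):
    """Fixed-step RK4 integration of the lossy mean-field equations."""
    psi = np.array(psi0, dtype=complex)
    for _ in range(int(t / dt)):
        k1 = rhs(psi)
        k2 = rhs(psi + 0.5 * dt * k1)
        k3 = rhs(psi + 0.5 * dt * k2)
        k4 = rhs(psi + dt * k3)
        psi = psi + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return psi

norm = lambda p: float(np.sum(np.abs(p)**2))
pi_state  = evolve([2**-0.5, 0, -2**-0.5])  # antisymmetric: dark at mean-field level
sym_state = evolve([2**-0.5, 0,  2**-0.5])  # symmetric: drains through the central site
```

In the π-phase state the source term J(ψ1+ψ3) driving the central site vanishes identically, so the mean-field norm is conserved; any decay of this state, as the abstract notes, must come from the boson-pair channel absent here.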

  14. Comparison of four Vulnerability Approaches to Mapping of Shallow Aquifers of Eastern Dahomey Basin of Nigeria

    NASA Astrophysics Data System (ADS)

    Oke, Saheed; Vermeulen, Danie

    2016-04-01

This study presents the outcome of vulnerability mapping of the shallow aquifers of the eastern Dahomey Basin of southwestern Nigeria. The basin is a coastal transboundary aquifer extending from eastern Ghana to southwestern Nigeria. The study aimed to identify the most suitable method for mapping the basin's shallow aquifers by comparing the results of four different vulnerability approaches. This comparison matters because vulnerability methods differ in their assessment parameters and approaches and can therefore yield different results for the same aquifer. The methodology involves vulnerability techniques that assess the intrinsic properties of the aquifer: two methods from the travel-time approach (AVI and RTt) and two from the index approach (DRASTIC and PI) were employed in mapping the basin. The results show that the AVI method has the fewest mapping parameters, with 75% of the basin classified as very high vulnerability and 25% as high vulnerability. The DRASTIC mapping shows 18% as low vulnerability, 61% as moderate vulnerability and 21% as high vulnerability. Mapping with the PI method, which has the most parameters, shows 66% of the aquifer as low vulnerability and 34% as moderate vulnerability. The RTt method shows 18% as very high vulnerability, 8% as high vulnerability, 64% as moderate vulnerability and 10% as very low vulnerability. Further analysis involving correlation plots shows the highest correlation, 62%, between the RTt and DRASTIC methods, higher than between any other pair of methods. The analysis shows that the PI method is the mildest of all the vulnerability methods, while the AVI method is the strictest of those considered in this mapping. Using four different approaches to map the shallow aquifers of the eastern Dahomey Basin will guide the recommendation of the best vulnerability method for subsequent assessments of this and other shallow aquifers.
Keywords: Aquifer vulnerability, Dahomey Basin

  15. Making of 3D extinction maps from population synthesis approach

    NASA Astrophysics Data System (ADS)

    Robin, A. C.; Marshall, D.; Reylé, C.; Montillaud, J.

Interstellar extinction is critical when studying stellar populations and Galactic structure. By taking into account all the information on stellar populations along a given line of sight, the population synthesis approach is an efficient tool for deriving the distribution of extinction. This approach has been shown to give reliable estimates in regions where the stars are numerous enough and well distributed in distance, although it has some limits due to its dependence on model hypotheses. As with other methods, biases can appear close to the limiting magnitude and to the maximum distance of detection, because the detection limits of the stars depend on the extinction itself. We present the successes of this method as well as its limitations and compare with the results of other methods.

  16. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

For many years the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises more challenges to be overcome. In this work, a tool within the scope of InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.
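
The MapReduce pattern underlying such a tool can be sketched in a few lines. This is a conceptual illustration in plain Python, not the ICP or Hadoop API: mappers apply a trained classifier to independent shards of the data, and a reducer merges the labelled outputs. A toy nearest-centroid classifier stands in for a WEKA model.

```python
from functools import reduce

# toy nearest-centroid "model" standing in for a trained WEKA classifier
centroids = {"A": (0.0, 0.0), "B": (5.0, 5.0)}

def classify(point):
    dist = lambda c: sum((p - q) ** 2 for p, q in zip(point, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

def mapper(partition):
    # each mapper labels its own shard of the data independently
    return [(point, classify(point)) for point in partition]

def reducer(a, b):
    # the reducer merges mapper outputs into one labelled set
    return a + b

data = [(0.1, 0.2), (4.9, 5.1), (0.3, -0.1), (5.2, 4.8)]
shards = [data[0::2], data[1::2]]              # simulate distribution over nodes
labelled = reduce(reducer, map(mapper, shards))
```

In a real Hadoop deployment the shards would be HDFS blocks and the map/reduce functions would run on separate nodes; the data flow, however, is the same.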

  17. Geomatics Approach for Assessment of respiratory disease Mapping

    NASA Astrophysics Data System (ADS)

    Pandey, M.; Singh, V.; Vaishya, R. C.

    2014-11-01

Air quality is an important contemporary subject because air is a prime resource for the sustenance of life, especially human health. Vast amounts of ambient air quality data are therefore generated, using current technology, to characterize how good or bad the air is. This study presents a reliable method for assessing the Air Quality Index (AQI) using fuzzy logic. The fuzzy logic model is designed to predict a monthly AQI. With the aid of the AQI we can evaluate the suitability of an area's environment with respect to human health. For the appraisal of human health status in an industrial area, information from a health survey questionnaire is used to obtain a respiratory risk map by applying IDW interpolation and Getis statistical techniques. The Getis statistic identifies spatial clustering patterns, such as hot spots (high risk) and cold spots, over the entire study area with statistical significance.
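
The fuzzy-logic step can be illustrated with a minimal sketch. The membership functions, breakpoints, and single-pollutant (PM10) input below are invented for illustration; the study's actual fuzzy sets and pollutant inputs are not reproduced here. Triangular memberships are evaluated, each rule contributes a representative AQI value, and a weighted centroid defuzzifies the result.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_aqi(pm10):
    # illustrative fuzzy sets for PM10 (µg/m³) -- assumed breakpoints
    mu = {"good":     tri(pm10, -1, 0, 60),
          "moderate": tri(pm10, 40, 100, 160),
          "poor":     tri(pm10, 120, 250, 400)}
    # rule consequents: representative AQI value for each category
    aqi = {"good": 50, "moderate": 150, "poor": 300}
    # weighted-centroid defuzzification
    num = sum(mu[k] * aqi[k] for k in mu)
    den = sum(mu.values())
    return num / den if den else float("nan")
```

Inputs falling between two fuzzy sets (e.g. PM10 = 130) receive an AQI interpolated between the neighbouring categories, which is the practical benefit of the fuzzy formulation over hard breakpoints.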

  18. A Digital Soil Mapping approach using neural networks for peat depth mapping in Scotland

    NASA Astrophysics Data System (ADS)

    Aitkenhead, Matt; Saunders, Matt; Yeluripati, Jagadeesh

    2014-05-01

Spatially explicit and accurate peat depth estimates are required for carbon stock assessment, carbon management strategies, hydrological modelling, ecosystem service assessment and land management (e.g. wind farms). In Scotland, a number of surveys have taken place over the years that have produced data on peat depth, and while many of these surveys have focussed on specific locations or peat bogs, a substantial proportion of the data produced is relatively old and has not been digitised, thus limiting its visibility and utility in new research activities, policy development and land management decision making. Here we describe ongoing work where the key objective is to integrate multiple peat survey datasets with existing spatial datasets of climate, vegetation, topography and geology. The dataset produced is generated from a small number of isolated surveys and while it is not representative of all of Scotland's soils, it is sufficient to demonstrate the conceptual basis for model development. It has been used to develop a neural network model of peat depth that has been applied across Scotland's peat bogs at 100m resolution. The resulting map gives an early indication of the variation of peat depth across the country, and allows us to produce an estimate of mean peat bog depth across the country. This estimate will improve with additional data and will contribute to improving our ability to undertake activities that depend on this kind of information. We have identified data gaps that need to be addressed in order to improve this model, in particular peat depth survey data from a wider range of peat types across the country, especially blanket bog and upland peat areas. Ongoing work to identify and integrate additional peat bog depth data is described. We also identify potential uses for the existing maps of peat depth, and areas of future model development.

  19. Exploring teacher's perceptions of concept mapping as a teaching strategy in science: An action research approach

    NASA Astrophysics Data System (ADS)

    Marks Krpan, Catherine Anne

    In order to promote science literacy in the classroom, students need opportunities in which they can personalize their understanding of the concepts they are learning. Current literature supports the use of concept maps in enabling students to make personal connections in their learning of science. Because they involve creating explicit connections between concepts, concept maps can assist students in developing metacognitive strategies and assist educators in identifying misconceptions in students' thinking. The literature also notes that concept maps can improve student achievement and recall. Much of the current literature focuses primarily on concept mapping at the secondary and university levels, with limited focus on the elementary panel. The research rarely considers teachers' thoughts and ideas about the concept mapping process. In order to effectively explore concept mapping from the perspective of elementary teachers, I felt that an action research approach would be appropriate. Action research enabled educators to debate issues about concept mapping and test out ideas in their classrooms. It also afforded the participants opportunities to explore their own thinking, reflect on their personal journeys as educators and play an active role in their professional development. In an effort to explore concept mapping from the perspective of elementary educators, an action research group of 5 educators and myself was established and met regularly from September 1999 until June 2000. All of the educators taught in the Toronto area. These teachers were interested in exploring how concept mapping could be used as a learning tool in their science classrooms. In summary, this study explores the journey of five educators and myself as we engaged in collaborative action research. This study sets out to: (1) Explore how educators believe concept mapping can facilitate teaching and student learning in the science classroom. (2) Explore how educators implement concept

  20. Anomalous gauge boson couplings

    SciTech Connect

    Barklow, T.; Rizzo, T.; Baur, U.

    1997-01-13

The measurement of anomalous gauge boson self-couplings is reviewed for a variety of present and planned accelerators. Sensitivities are compared for these accelerators using models based on the effective Lagrangian approach. The sensitivities described here are for measurement of "generic" parameters κ_V, λ_V, etc., defined in the text. Pre-LHC measurements will not probe these coupling parameters to precision better than O(10⁻¹). The LHC should be sensitive to better than O(10⁻²), while a future NLC should achieve sensitivity of O(10⁻³) to O(10⁻⁴) for center-of-mass energies ranging from 0.5 to 1.5 TeV.

  1. Multidata remote sensing approach to regional geologic mapping in Venezuela

    SciTech Connect

    Baker, R.N.

    1996-08-01

Remote sensing played an important role in evaluating the exploration potential of selected lease blocks in Venezuela. Data sets used ranged from regional Landsat and airborne radar (SLAR) surveys to high-quality cloud-free air photos for local but largely inaccessible terrains. The resulting data base provided a framework for the conventional analyses of surface and subsurface information available to the project team. (1) Regional surface geology and major structural elements were interpreted from Landsat MSS imagery supplemented by TM and a regional 1:250,000 airborne radar (SLAR) survey. Evidence of dextral offset, en echelon folds and major thoroughgoing faults suggest a regional transpressional system modified by local extension and readjustment between small-scale crustal blocks. Surface expression of the major structural elements diminishes to the east, but can often be extended beneath the coastal plain by drainage anomalies and subtle geomorphic trends. (2) Environmental conditions were mapped using the high resolution airborne radar images, which were used to relate vegetation types to surface texture and elevation, and wetlands, outcrop and cultural features to image brightness. Additional work using multispectral TM or SPOT imagery is planned to more accurately define environmental conditions and provide a baseline for monitoring future trends. (3) Offshore oil seeps were detected using ERS-1 satellite radar (SAR) and known seeps in the Gulf of Paria as analogs. While partially successful, natural surfactants, wind shadow and a surprising variety of other phenomena created "false alarms" which required other supporting data and field sampling to verify the results. Key elements of the remote sensing analyses will be incorporated into a comprehensive geographic information system (GIS) which will eventually include all of Venezuela.

  2. Flood Hazard Mapping over Large Regions using Geomorphic Approaches

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Troy, Tara J.; Manfreda, Salvatore

    2016-04-01

    Historically, man has always preferred to settle and live near the water. This tendency has not changed throughout time, and today nineteen of the twenty most populated agglomerations of the world (Demographia World Urban Areas, 2015) are located along watercourses or at the mouth of a river. On one hand, these locations are advantageous from many points of view. On the other hand, they expose significant populations and economic assets to a certain degree of flood hazard. Knowing the location and the extent of the areas exposed to flood hazards is essential to any strategy for minimizing the risk. Unfortunately, in data-scarce regions the use of traditional floodplain mapping techniques is prevented by the lack of the extensive data required, and this scarcity is generally most pronounced in developing countries. The present work aims to overcome this limitation by defining an alternative simplified procedure for a preliminary, but efficient, floodplain delineation. To validate the method in a data-rich environment, eleven flood-related morphological descriptors derived from DEMs have been used as linear binary classifiers over the Ohio River basin and its sub-catchments, measuring their performances in identifying the floodplains at the change of the topography and the size of the calibration area. The best performing classifiers among those analysed have been applied and validated across the continental U.S. The results suggest that the classifier based on the index ln(hr/H), named the Geomorphic Flood Index (GFI), is the most suitable to detect the flood-prone areas in data-scarce environments and for large-scale applications, providing good accuracy with low requirements in terms of data and computational costs. Keywords: flood hazard, data-scarce regions, large-scale studies, binary classifiers, DEM, USA.
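
The use of a single morphological descriptor as a linear binary classifier can be sketched as follows. The descriptor values below are synthetic, not the Ohio River basin data, and the calibration objective (maximizing the true skill statistic) is one plausible choice rather than necessarily the one used in the study: a threshold on a GFI-like index is tuned against cells of known flood status.

```python
import numpy as np

def calibrate_threshold(index, flooded):
    """Pick the threshold on a geomorphic descriptor that maximizes the
    true skill statistic (sensitivity + specificity - 1)."""
    best_tau, best_tss = None, -1.0
    for tau in np.unique(index):
        pred = index >= tau                      # flood-prone if index exceeds tau
        tp = np.sum(pred & flooded)
        tn = np.sum(~pred & ~flooded)
        sens = tp / np.sum(flooded)
        spec = tn / np.sum(~flooded)
        tss = sens + spec - 1
        if tss > best_tss:
            best_tau, best_tss = tau, tss
    return best_tau, best_tss

# synthetic GFI-like values: flood-prone cells skew toward higher index
rng = np.random.default_rng(0)
gfi = np.concatenate([rng.normal(-2, 1, 500),    # non-flooded cells
                      rng.normal(1, 1, 200)])    # flooded cells
truth = np.concatenate([np.zeros(500, bool), np.ones(200, bool)])
tau, tss = calibrate_threshold(gfi, truth)
```

Once calibrated on a data-rich basin, the same threshold can be exported to data-scarce regions, which is the low-cost, large-scale application the abstract emphasizes.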

  3. Evaluation of current statistical approaches for predictive geomorphological mapping

    NASA Astrophysics Data System (ADS)

    Miska, Luoto; Jan, Hjort

    2005-04-01

Predictive models are increasingly used in geomorphology, but systematic evaluations of novel statistical techniques are still limited. The aim of this study was to compare the accuracy of generalized linear models (GLM), generalized additive models (GAM), classification tree analysis (CTA), neural networks (ANN) and multiple adaptive regression splines (MARS) in predictive geomorphological modelling. Five different distribution models both for non-sorted and sorted patterned ground were constructed on the basis of four terrain parameters and four soil variables. To evaluate the models, the original data set of 9997 squares of 1 ha in size was randomly divided into model training (70%, n=6998) and model evaluation sets (30%, n=2999). In general, active sorted patterned ground is clearly defined in upper fell areas with high slope angle and till soils. Active non-sorted patterned ground is more common in valleys with higher soil moisture and fine-scale concave topography. The predictive performance of each model was evaluated using the area under the receiver operating characteristic curve (AUC) and the Kappa value. The relatively high discrimination capacity of all models, AUC = 0.85–0.88 and Kappa = 0.49–0.56, implies that the models' predictions provide an acceptable index of sorted and non-sorted patterned ground occurrence. The best performance for model calibration data for both data sets was achieved by the CTA. However, when the predictive mapping ability was explored through the evaluation data set, the model accuracies of CTA decreased clearly compared to the other modelling techniques. For model evaluation data MARS performed marginally best. Our results show that the digital elevation model and soil data can be used to predict relatively robustly the activity of patterned ground at a fine scale in a subarctic landscape. This indicates that predictive geomorphological modelling has the advantage of providing relevant and useful information on earth surface
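
The evaluation protocol described above (70/30 split, AUC and Kappa) can be sketched on synthetic data. The snippet below is an illustration, not the study's models or data: AUC is computed via the Mann-Whitney rank statistic, and Cohen's Kappa from observed versus chance agreement.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank statistic."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels.astype(bool)
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

def kappa(pred, labels):
    """Cohen's Kappa: observed agreement corrected for chance agreement."""
    po = np.mean(pred == labels)
    pe = (np.mean(pred) * np.mean(labels)
          + (1 - np.mean(pred)) * (1 - np.mean(labels)))
    return (po - pe) / (1 - pe)

rng = np.random.default_rng(1)
n = 9997                                   # sample size as in the study
x = rng.normal(size=n)                     # a single synthetic predictor
y = (x + rng.normal(size=n) > 0.5).astype(int)   # synthetic presence/absence
split = int(0.7 * n)                       # 70/30 train/evaluation split
score = x[split:]                          # stand-in "model" score on evaluation set
a = auc(score, y[split:])
k = kappa((score > 0.5).astype(int), y[split:])
```

Evaluating on the held-out 30% rather than the calibration data is exactly what exposed the optimistic CTA results reported in the abstract.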

  4. Boson representations of fermion systems: Proton-neutron systems

    NASA Astrophysics Data System (ADS)

    Sambataro, M.

    1988-05-01

    Applications of a procedure recently proposed to construct boson images of fermion Hamiltonians are shown for proton-neutron systems. First, the mapping from SD fermion onto sd boson spaces is discussed and a Qπ·Qν interaction is investigated. A Hermitian one-body Q boson operator is derived and analytical expressions for its coefficients are obtained. A (Qπ+Qν)·(Qπ+Qν) interaction is then studied for particle-hole systems and the connections with the SU*(3) dynamical symmetry of the neutron-proton interacting boson model are discussed. Finally, an example of mapping from SDG onto sdg spaces is analyzed. Fermion spectra and E2 matrix elements are well reproduced in the boson spaces.

  5. Inverse field-based approach for simultaneous B₁ mapping at high fields - a phantom based study.

    PubMed

    Jin, Jin; Liu, Feng; Zuo, Zhentao; Xue, Rong; Li, Mingyan; Li, Yu; Weber, Ewald; Crozier, Stuart

    2012-04-01

    Based on computational electromagnetics and multi-level optimization, an inverse approach to attaining accurate mapping of both the transmit and receive sensitivity of radiofrequency coils is presented. This paper extends our previous study of inverse methods for receptivity mapping at low fields to allow accurate mapping of RF magnetic fields (B₁) for high-field applications. Accurate receive sensitivity mapping is essential for image-domain parallel imaging methods, such as sensitivity encoding (SENSE), to reconstruct high-quality images. Accurate transmit sensitivity mapping will facilitate RF shimming and parallel transmission techniques that directly address the RF inhomogeneity issue, arguably the most challenging issue in high-field magnetic resonance imaging (MRI). The inverse field-based approach proposed herein is based on computational electromagnetics and iterative optimization. It fits an experimental image to the numerically calculated signal intensity by iteratively optimizing the coil-subject geometry to better resemble the experiments. Accurate transmit and receive sensitivities are derived as intermediate results of the optimization process. The method is validated by imaging studies using a homogeneous saline phantom at 7 T. A simulation study at 300 MHz demonstrates that the proposed method is able to obtain receptivity mapping with errors an order of magnitude less than those of the conventional method. The more accurate receptivity mapping and simultaneously obtained transmit sensitivity mapping could enable artefact-reduced and intensity-corrected image reconstructions. It is hoped that, by providing an approach to the accurate mapping of both transmit and receive sensitivity, the proposed method will facilitate a range of applications in high-field MRI and parallel imaging. PMID:22391489

  6. Novel approaches to map small molecule-target interactions.

    PubMed

    Kapoor, Shobhna; Waldmann, Herbert; Ziegler, Slava

    2016-08-01

    The quest for small molecule perturbators of protein function or a given cellular process lies at the heart of chemical biology and pharmaceutical research. Bioactive compounds need to be extensively characterized in the context of the modulated protein(s) or process(es) in living systems to unravel and confirm their mode of action. A crucial step in this workflow is the identification of the molecular targets for these small molecules, for which a generic methodology is lacking. Herein we summarize recently developed approaches for target identification spurred by advances in omics techniques and chemo- and bioinformatics analysis. PMID:27240466

  7. Mapping Transcription Factors on Extended DNA: A Single Molecule Approach

    NASA Astrophysics Data System (ADS)

    Ebenstein, Yuval; Gassman, Natalie; Weiss, Shimon

    The ability to determine the precise loci and distribution of nucleic acid binding proteins is instrumental to our detailed understanding of cellular processes such as transcription, replication, and chromatin reorganization. Traditional molecular biology approaches, and above all chromatin immunoprecipitation (ChIP)-based methods, have provided a wealth of information regarding protein-DNA interactions. Nevertheless, existing techniques can only provide average properties of these interactions, since they are based on the accumulation of data from numerous protein-DNA complexes analyzed at the ensemble level. We propose a single-molecule approach for direct visualization of DNA binding proteins bound specifically to their recognition sites along a long stretch of DNA such as genomic DNA. Fluorescent quantum dots are used to tag proteins bound to DNA, and the complex is deposited on a glass substrate by extending the DNA to a linear form. The sample is then imaged optically to determine the precise location of the protein binding site. The method is demonstrated by detecting individual quantum-dot-tagged T7 RNA polymerase enzymes on the bacteriophage T7 genomic DNA and assessing the relative occupancy of the different promoters.

  8. A taxonomy of behaviour change methods: an Intervention Mapping approach

    PubMed Central

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2016-01-01

    ABSTRACT In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it

  9. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    PubMed

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. 
Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  10. Semi-automatic classification of glaciovolcanic landforms: An object-based mapping approach based on geomorphometry

    NASA Astrophysics Data System (ADS)

    Pedersen, G. B. M.

    2016-02-01

    A new object-oriented approach is developed to classify glaciovolcanic landforms (Procedure A) and their landform elements boundaries (Procedure B). It utilizes the principle that glaciovolcanic edifices are geomorphometrically distinct from lava shields and plains (Pedersen and Grosse, 2014), and the approach is tested on data from Reykjanes Peninsula, Iceland. The outlined procedures utilize slope and profile curvature attribute maps (20 m/pixel) and the classified results are evaluated quantitatively through error matrix maps (Procedure A) and visual inspection (Procedure B). In procedure A, the highest obtained accuracy is 94.1%, but even simple mapping procedures provide good results (> 90% accuracy). Successful classification of glaciovolcanic landform element boundaries (Procedure B) is also achieved and this technique has the potential to delineate the transition from intraglacial to subaerial volcanic activity in orthographic view. This object-oriented approach based on geomorphometry overcomes issues with vegetation cover, which has been typically problematic for classification schemes utilizing spectral data. Furthermore, it handles complex edifice outlines well and is easily incorporated into a GIS environment, where results can be edited or fused with other mapping results. The approach outlined here is designed to map glaciovolcanic edifices within the Icelandic neovolcanic zone but may also be applied to similar subaerial or submarine volcanic settings, where steep volcanic edifices are surrounded by flat plains.
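
The error-matrix evaluation used in Procedure A reduces to cross-tabulating predicted against reference classes and reading off the overall accuracy; a minimal sketch (class labels below are hypothetical, not from the paper):

```python
from collections import Counter

def error_matrix(pred, ref):
    # Cross-tabulate predicted vs. reference classes: the "error matrix"
    # (confusion matrix) keyed by (predicted, reference) pairs.
    return Counter(zip(pred, ref))

def overall_accuracy(pred, ref):
    # Overall accuracy = diagonal of the error matrix / total samples.
    m = error_matrix(pred, ref)
    correct = sum(n for (p, r), n in m.items() if p == r)
    return correct / sum(m.values())
```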

  11. Differential Analysis of 2-D Maps by Pixel-Based Approaches.

    PubMed

    Marengo, Emilio; Robotti, Elisa; Quasso, Fabio

    2016-01-01

    Two approaches to the analysis of 2-D maps are available: the first involves a step of spot detection on each gel image; the second is based instead on the direct differential analysis of 2-D map images, following a pixel-based procedure. Both approaches strongly depend on the proper alignment of the gel images, but the pixel-based approach makes it possible to overcome important drawbacks of the spot-volume procedure, i.e., the problems of missing data and of overlapping spots. However, this approach is quite computationally intensive and requires the use of algorithms able to separate the information (i.e., spot-related information) from the background. Here, the most recent pixel-based approaches are described. PMID:26611422

  12. Mapping paths: new approaches to dissect eukaryotic signaling circuitry

    PubMed Central

    Mutlu, Nebibe; Kumar, Anuj

    2016-01-01

    Eukaryotic cells are precisely “wired” to coordinate changes in external and intracellular signals with corresponding adjustments in the output of complex and often interconnected signaling pathways. These pathways are critical in understanding cellular growth and function, and several experimental trends are emerging with applicability toward more fully describing the composition and topology of eukaryotic signaling networks. In particular, recent studies have implemented CRISPR/Cas-based screens in mouse and human cell lines for genes involved in various cell growth and disease phenotypes. Proteomic methods using mass spectrometry have enabled quantitative and dynamic profiling of protein interactions, revealing previously undiscovered complexes and allele-specific protein interactions. Methods for the single-cell study of protein localization and gene expression have been integrated with computational analyses to provide insight into cell signaling in yeast and metazoans. In this review, we present an overview of exemplary studies using the above approaches, relevant for the analysis of cell signaling and indeed, more broadly, for many modern biological applications. PMID:27540473

  13. Mapping water quality and substrate cover in optically complex coastal and reef waters: an integrated approach.

    PubMed

    Phinn, S R; Dekker, A G; Brando, V E; Roelfsema, C M

    2005-01-01

    Sustainable management of coastal and coral reef environments requires regular collection of accurate information on recognized ecosystem health indicators. Satellite image data and derived maps of water column and substrate biophysical properties provide an opportunity to develop baseline mapping and monitoring programs for coastal and coral reef ecosystem health indicators. A significant challenge for satellite image data in coastal and coral reef water bodies is the mixture of both clear and turbid waters. A new approach is presented in this paper to enable production of water quality and substrate cover type maps, linked to a field based coastal ecosystem health indicator monitoring program, for use in turbid to clear coastal and coral reef waters. An optimized optical domain method was applied to map selected water quality (Secchi depth, Kd PAR, tripton, CDOM) and substrate cover type (seagrass, algae, sand) parameters. The approach is demonstrated using commercially available Landsat 7 Enhanced Thematic Mapper image data over a coastal embayment exhibiting the range of substrate cover types and water quality conditions commonly found in sub-tropical and tropical coastal environments. Spatially extensive and quantitative maps of selected water quality and substrate cover parameters were produced for the study site. These map products were refined by interactions with management agencies to suit the information requirements of their monitoring and management programs. PMID:15757744

  14. Force scanning: A rapid, high-resolution approach for spatial mechanical property mapping

    PubMed Central

    Darling, E M

    2011-01-01

    Atomic force microscopy (AFM) can be used to co-localize mechanical properties and topographical features through property mapping techniques. The most common approach for testing biological materials at the micro- and nano-scales is force mapping, which involves taking individual force curves at discrete sites across a region of interest. Limitations of force mapping include long testing times and low resolution. While newer AFM methodologies, like modulated scanning and torsional oscillation, circumvent this problem, their adoption for biological materials has been limited. This could be due to their need for specialized software algorithms and/or hardware. The objective of this study is to develop a novel force scanning technique using AFM to rapidly capture high-resolution topographical images of soft biological materials while simultaneously quantifying their mechanical properties. Force scanning is a straightforward methodology applicable to a wide range of materials and testing environments, requiring no special modification to standard AFMs. Essentially, if a contact mode image can be acquired, then force scanning can be used to produce a spatial modulus map. The current study first validates this technique using agarose gels, comparing results to the standard force mapping approach. Biologically relevant demonstrations are then presented for high-resolution modulus mapping of individual cells, cell-cell interfaces, and articular cartilage tissue. PMID:21411911
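
Force-curve data of the kind described here are commonly converted to an elastic modulus with a contact model; a sketch inverting the Hertzian spherical-tip relation F = (4/3)·(E/(1−ν²))·√R·δ^(3/2) at a single point (the choice of the Hertz model and all parameter values are illustrative assumptions, not taken from the paper):

```python
def hertz_modulus(force, indentation, tip_radius, poisson=0.5):
    # Solve the Hertz spherical-contact relation for the elastic modulus E
    # given one (force, indentation) point, tip radius R, and Poisson ratio.
    #   F = (4/3) * (E / (1 - v^2)) * sqrt(R) * d^(3/2)
    return force * (1 - poisson**2) * 0.75 / (tip_radius**0.5 * indentation**1.5)
```

In practice a modulus map is built by applying such a fit (usually over the whole curve, not one point) at every pixel of the scanned region.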

  15. Benthic habitat mapping in a Portuguese Marine Protected Area using EUNIS: An integrated approach

    NASA Astrophysics Data System (ADS)

    Henriques, Victor; Guerra, Miriam Tuaty; Mendes, Beatriz; Gaudêncio, Maria José; Fonseca, Paulo

    2015-06-01

    Demand for seabed and habitat mapping has grown over the past years to support maritime integrated policies at the EU and national levels aimed at the sustainable use of sea resources. This study presents the results of applying the hierarchical European Nature Information System (EUNIS) to classify and map the benthic habitats of the Luiz Saldanha Marine Park, a marine protected area (MPA) located on the mainland Portuguese southwest coast, in the Iberian Peninsula. The habitat map was modelled by applying a methodology based on EUNIS to merge biotic and abiotic key habitat drivers. The modelling in this approach focused on predicting the association of different data types: substrate, bathymetry, light intensity, wave and current energy, sediment grain size, and benthic macrofauna in a common framework. The resulting seamless medium-scale habitat map discriminates twenty-six distinct sublittoral habitats, including eight with no match in the current classification, which may be regarded as new potential habitat classes and therefore will be submitted to EUNIS. A discussion is provided examining the suitability of the current EUNIS scheme as a standardized approach to classify marine benthic habitats and map their spatial distribution at medium scales on the Portuguese coast. In addition, the factors that most affected the results available in the predictive habitat map and the role of environmental factors in macrofaunal assemblage composition and distribution are outlined.

  16. Atom-atom correlations in time-of-flight imaging of ultracold bosons in optical lattices

    SciTech Connect

    Zaleski, T. A.; Kopec, T. K.

    2011-11-15

    We study the spatial correlations of strongly interacting bosons in the ground state, confined in a two-dimensional square and a three-dimensional cubic lattice. Using the combined Bogoliubov method and the quantum rotor approach, we map the Hamiltonian of strongly interacting bosons onto a U(1) phase action in order to calculate the decay of atom-atom correlations along the principal axis and a diagonal of the lattice-plane direction as a function of distance. Lower tunneling rates lead to quicker decays of the correlations, whose character becomes exponential. Finally, the correlation functions allow us to calculate quantities that are directly tied to experimental outcomes, namely time-of-flight absorption images and the resulting visibility. Our results contain all the characteristic features present in experimental data (transition from a Mott insulating blob to superfluid peaks, etc.), emphasizing the usability of the proposed approach.

  17. Quantum criticality in disordered bosonic optical lattices

    SciTech Connect

    Cai Xiaoming; Chen Shu; Wang Yupeng

    2011-04-15

    Using the exact Bose-Fermi mapping, we study universal properties of ground-state density distributions and finite-temperature quantum critical behavior of one-dimensional hard-core bosons in trapped incommensurate optical lattices. Through the analysis of universal scaling relations in the quantum critical regime, we demonstrate that the superfluid-to-Bose-glass transition and the general phase diagram of disordered hard-core bosons can be uniquely determined from finite-temperature density distributions of the trapped disordered system.

  18. A new approach to mapping permafrost and change incorporating uncertainties in ground conditions and climate projections

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Olthof, I.; Fraser, R.; Wolfe, S. A.

    2014-11-01

    Spatially detailed information on permafrost distribution and change with climate is important for land use planning, infrastructure development, and environmental assessments. However, the required soil and surficial geology maps in the North are coarse, and projected climate scenarios vary widely. Considering these uncertainties, we propose a new approach to mapping permafrost distribution and change by integrating remote sensing data, field measurements, and a process-based model. Land cover types from satellite imagery are used to capture the general land conditions and to improve the resolution of existing permafrost maps. For each land cover type, field observations are used to estimate the probabilities of different ground conditions. A process-based model is used to quantify the evolution of permafrost for each ground condition under three representative climate scenarios (low, medium, and high warming). From the model results, the probability of permafrost occurrence and the most likely permafrost conditions are determined. We apply this approach at 20 m resolution to a large area in Northwest Territories, Canada. Mapped permafrost conditions are in agreement with field observations and other studies. The data requirements, model robustness, and computation time are reasonable, and this approach may serve as a practical means to mapping permafrost and changes at high resolution in other regions.

  19. A New Approach to Mapping Permafrost and Change Incorporating Uncertainties in Ground Conditions and Climate Projections

    NASA Astrophysics Data System (ADS)

    Zhang, Y.

    2014-12-01

    Spatially detailed information on permafrost distribution and change with climate is important for land-use planning, infrastructure development and environmental assessments. However, the required soil and surficial geology maps in the North are coarse, and projected climate scenarios vary widely. Considering these uncertainties, we propose a new approach to mapping permafrost distribution and change by integrating remote sensing data, field measurements, and a process-based model. Land-cover types from satellite imagery are used to capture the general land conditions and to improve the resolution of existing permafrost maps. For each land-cover type, field observations are used to estimate the probability of different ground conditions. A process-based model is used to quantify the evolution of permafrost for each ground condition under three representative climate scenarios (low, medium and high warming). From the model results, the probability of permafrost occurrence and the most likely permafrost conditions are determined (Fig. 1). We apply this approach at 20 m resolution to a large area in Northwest Territories, Canada. Mapped permafrost conditions are in agreement with field observations and other studies. The data requirements, model robustness and computation time are reasonable, and this approach may serve as a practical means to mapping permafrost and changes at high resolution in other regions.

  20. A new approach to mapping permafrost and change incorporating uncertainties in ground conditions and climate projections

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Olthof, I.; Fraser, R.; Wolfe, S. A.

    2014-04-01

    Spatially detailed information on permafrost distribution and change with climate is important for land-use planning and for environmental and ecological assessments. However, the required soil and surficial geology maps in the north are coarse, and projected climate scenarios vary widely. Considering these uncertainties, we propose a new approach to mapping permafrost distribution and change by integrating remote sensing data, field measurements, and a process-based model. Land-cover types from satellite imagery are used to capture the general land conditions and to improve the resolution of existing permafrost maps. For each land-cover type, field observations are used to estimate the probability of different ground conditions. A process-based model is used to quantify the evolution of permafrost for each ground condition under three representative climate scenarios (low, medium and high warming). From the model results, the probability of permafrost occurrence and the most likely permafrost conditions are determined. We apply this approach at 20 m resolution to a large area in Northwest Territories, Canada. Mapped permafrost conditions are in agreement with field observations and other studies. The data requirements, model robustness and computation time are reasonable, and this approach may serve as a practical means to mapping permafrost and changes at high resolution in other regions.
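
The probability-of-occurrence step described in these records, weighting each ground condition's modelled outcome by the estimated probability of that condition within a land-cover type, can be sketched as a weighted sum (condition names and values below are hypothetical):

```python
def permafrost_probability(condition_probs, model_predicts_permafrost):
    # condition_probs: probability of each ground condition within a
    #   land-cover type (estimated from field observations; sums to 1).
    # model_predicts_permafrost: True/False outcome of the process-based
    #   model for each condition under a given climate scenario.
    # The cell's permafrost probability is the total probability of the
    # conditions for which the model predicts permafrost.
    return sum(p for cond, p in condition_probs.items()
               if model_predicts_permafrost[cond])
```

The most likely permafrost condition for a cell would then be the individual condition with the highest probability.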

  1. An effective trace-guided wavefront navigation and map-building approach for autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Luo, Chaomin; Krishnan, Mohan; Paulik, Mark; Jan, Gene Eu

    2013-12-01

    This paper addresses a trace-guided real-time navigation and map-building approach for an autonomous mobile robot. A wave-front-based global path planner is developed to generate a global trajectory for the robot. A modified Vector Field Histogram (M-VFH) method, driven by LIDAR sensor information, guides the robot locally so that it traverses autonomously with obstacle avoidance by following the traces provided by the global path planner. A local map composed of square grids is created by the local navigator while the robot traverses with limited LIDAR sensory information; from these measurements, a map of the robot's immediate surroundings is dynamically built for navigation. The real-time wave-front-based navigation and map-building methodology has been successfully demonstrated in a Player/Stage simulation environment. With the wave-front-based global path planner and the M-VFH local navigator, a safe, short, and reasonable trajectory is planned in a majority of situations without any templates, without explicitly optimizing any global cost functions, and without any learning procedures. The effectiveness, feasibility, efficiency, and simplicity of the proposed real-time navigation and map-building approach have been validated by simulation and comparison studies. Comparisons with other path-planning approaches demonstrate that the proposed method is capable of planning more reasonable and shorter collision-free trajectories autonomously.
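
The wave-front global planner referred to above can be illustrated by a breadth-first "wave" expanded from the goal over an occupancy grid, with the trace recovered by descending the resulting distance field; a minimal sketch under those assumptions, not the authors' implementation:

```python
from collections import deque

def wavefront(grid, goal):
    # Expand a breadth-first wave from the goal cell: each free cell (0)
    # receives its step distance to the goal; obstacle cells (1) stay None.
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    dist[goal[0]][goal[1]] = 0
    q = deque([goal])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and dist[nr][nc] is None):
                dist[nr][nc] = dist[r][c] + 1
                q.append((nr, nc))
    return dist

def trace(dist, start):
    # Follow strictly decreasing distance values from start down to the goal.
    rows, cols = len(dist), len(dist[0])
    path = [start]
    while dist[path[-1][0]][path[-1][1]] != 0:
        r, c = path[-1]
        nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= r + dr < rows and 0 <= c + dc < cols
                and dist[r + dr][c + dc] is not None]
        path.append(min(nbrs, key=lambda p: dist[p[0]][p[1]]))
    return path
```

A local navigator such as M-VFH would then follow this trace while reacting to obstacles sensed in real time.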

  2. A new computer approach to map mixed forest features and postprocess multispectral data

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1976-01-01

    A computer technique for mapping mixed softwood and hardwood stands in multispectral satellite imagery of forest regions is described. The purpose of the technique is to obtain smoother resource maps useful in timber harvesting operations. The computer program relies on an algorithm which assesses the size and similarity of adjacent sections on satellite imagery (Landsat-1 data is used) and constructs, through an iteration of the basic algorithm, a more general map of timber mixtures, eliminating the mottled appearance of the raw imagery. Despite difficulties in the experimental analysis of a Texas forest, apparently due to relatively low resolution of the Landsat data, the computer classification approach outlined is suggested as a generally applicable method of creating serviceable maps from multispectral imagery.

  3. Integrated environmental mapping and monitoring, a methodological approach to optimise knowledge gathering and sampling strategy.

    PubMed

    Nilssen, Ingunn; Ødegård, Øyvind; Sørensen, Asgeir J; Johnsen, Geir; Moline, Mark A; Berge, Jørgen

    2015-07-15

    New technology has led to new opportunities for a holistic environmental monitoring approach adjusted to the purpose and object of interest. The proposed integrated environmental mapping and monitoring (IEMM) concept, presented in this paper, describes the different steps in such a system, from mission of survey to selection of parameters, sensors, sensor platforms, data collection, data storage, analysis, and data interpretation for reliable decision making. The system is generic; it can be used by authorities, industry, and academia, and is useful in both planning and operational phases. In the planning process, the systematic approach is also ideal for identifying areas with knowledge gaps. The critical stages of the concept are discussed and exemplified by two case studies, one on environmental mapping and one on monitoring. As an operational system, the IEMM concept can contribute to optimised integrated environmental mapping and monitoring for knowledge generation as a basis for decision making. PMID:25956441

  4. Two-dimensional thermofield bosonization

    SciTech Connect

    Amaral, R.L.P.G.

    2005-12-15

    The main objective of this paper was to obtain an operator realization for the bosonization of fermions in 1 + 1 dimensions, at finite, non-zero temperature T. This is achieved in the framework of the real-time formalism of Thermofield Dynamics. Formally, the results parallel those of the T = 0 case. The well-known two-dimensional Fermion-Boson correspondences at zero temperature are shown to hold also at finite temperature. To emphasize the usefulness of the operator realization for handling a large class of two-dimensional quantum field-theoretic problems, we contrast this global approach with the cumbersome calculation of the fermion-current two-point function in the imaginary-time formalism and real-time formalisms. The calculations also illustrate the very different ways in which the transmutation from Fermi-Dirac to Bose-Einstein statistics is realized.

  5. Does Constructivist Approach Applicable through Concept Maps to Achieve Meaningful Learning in Science?

    ERIC Educational Resources Information Center

    Jena, Ananta Kumar

    2012-01-01

    This study deals with the application of constructivist approach through individual and cooperative modes of spider and hierarchical concept maps to achieve meaningful learning on science concepts (e.g. acids, bases & salts, physical and chemical changes). The main research questions were: Q (1): is there any difference in individual and…

  6. A Soft OR Approach to Fostering Systems Thinking: SODA Maps plus Joint Analytical Process

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2016-01-01

    Higher order thinking skills are important for managers. Systems thinking is an important type of higher order thinking in business education. This article investigates a soft Operations Research approach to teaching and learning systems thinking. It outlines the integrative use of Strategic Options Development and Analysis maps for visualizing…

  7. Shape-to-String Mapping: A Novel Approach to Clustering Time-Index Biomics Data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Herein we describe a qualitative approach for clustering time-index biomics data. The data are transformed into angles from the intensity-ratios between adjacent time-points. A code is used to map a qualitative representation of the numerical time-index data which captures the features in the data ...
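
The ratio-to-angle-to-symbol encoding described in this record can be sketched as follows; the angle thresholds and the three-letter alphabet are illustrative assumptions, not the authors' actual code:

```python
import math

def shape_string(series):
    # Encode each adjacent-time-point transition as an angle, then as a
    # qualitative symbol: 'U' rising, 'D' falling, 'F' flat.
    # The +/-15 degree thresholds are arbitrary illustrative choices.
    out = []
    for a, b in zip(series, series[1:]):
        angle = math.degrees(math.atan2(b - a, 1.0))
        if angle > 15:
            out.append("U")
        elif angle < -15:
            out.append("D")
        else:
            out.append("F")
    return "".join(out)
```

Series that share the same qualitative shape map to the same string, so clustering reduces to grouping identical (or similar) strings.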

  8. Concept Maps in the Classroom: A New Approach to Reveal Students' Conceptual Change

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Liefländer, Anne K.; Bogner, Franz X.

    2015-01-01

    When entering the classroom, adolescents already hold various conceptions on science topics. Concept maps may function as useful tools to reveal such conceptions although labor-intensive analysis often prevents application in typical classroom situations. The authors aimed to provide teachers with an appropriate approach to analyze students'…

  9. The Criterion-Related Validity of a Computer-Based Approach for Scoring Concept Maps

    ERIC Educational Resources Information Center

    Clariana, Roy B.; Koul, Ravinder; Salehi, Roya

    2006-01-01

    This investigation seeks to confirm a computer-based approach that can be used to score concept maps (Poindexter & Clariana, 2004) and then describes the concurrent criterion-related validity of these scores. Participants enrolled in two graduate courses (n=24) were asked to read about and research online the structure and function of the heart…

  10. Determination of contact maps in proteins: A combination of structural and chemical approaches

    NASA Astrophysics Data System (ADS)

    Wołek, Karol; Gómez-Sicilia, Àngel; Cieplak, Marek

    2015-12-01

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.
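    The rCSU validity rule quoted above (more attractive than repulsive atomic contacts) reduces to a simple count, and the OV+rCSU combination can be read as merging the two contact sets. A Python sketch, where the +1/-1 labelling of atom-pair contacts and the union-style combination are assumptions for illustration:

    ```python
    def contact_is_valid(atomic_contacts):
        """rCSU-style validity test: a residue-residue contact counts only
        if attractive atomic contacts outnumber repulsive ones.
        `atomic_contacts` lists +1 (attractive) / -1 (repulsive) labels for
        the atom pairs shared by the two residues (an assumed encoding)."""
        attractive = sum(1 for c in atomic_contacts if c > 0)
        repulsive = sum(1 for c in atomic_contacts if c < 0)
        return attractive > repulsive

    def combine_ov_rcsu(ov_contacts, rcsu_contacts):
        """OV+rCSU combined map, taken here as the union of the two sets of
        residue-index pairs (one plausible reading of 'using rCSU together
        with the OV map')."""
        return ov_contacts | rcsu_contacts
    ```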

  11. Mapping the genomic architecture of adaptive traits with interspecific introgressive origin: a coalescent-based approach.

    PubMed

    Hejase, Hussein A; Liu, Kevin J

    2016-01-01

    Recent studies of eukaryotes including humans and Neandertals, mice, and butterflies have highlighted the major role that interspecific introgression has played in adaptive trait evolution. A common question arises in each case: what is the genomic architecture of the introgressed traits? One common approach that can be used to address this question is association mapping, which looks for genotypic markers that have significant statistical association with a trait. It is well understood that sample relatedness can be a confounding factor in association mapping studies if not properly accounted for. Introgression and other evolutionary processes (e.g., incomplete lineage sorting) typically introduce variation among local genealogies, which can also differ from global sample structure measured across all genomic loci. In contrast, state-of-the-art association mapping methods assume fixed sample relatedness across the genome, which can lead to spurious inference. We therefore propose a new association mapping method called Coal-Map, which uses coalescent-based models to capture local genealogical variation alongside global sample structure. Using simulated and empirical data reflecting a range of evolutionary scenarios, we compare the performance of Coal-Map against EIGENSTRAT, a widely used association mapping method, in terms of statistical power and type I error control. Our empirical data make use of hundreds of mouse genomes for which adaptive interspecific introgression has recently been described. We found that Coal-Map's performance is comparable to or better than that of EIGENSTRAT in terms of statistical power and false positive rate. Coal-Map's performance advantage was greatest on model conditions that most closely resembled empirically observed scenarios of adaptive introgression. These conditions had: (1) causal SNPs contained in one or a few introgressed genomic loci and (2) varying rates of gene flow - from high rates to very low rates where incomplete lineage

  12. Determination of contact maps in proteins: A combination of structural and chemical approaches

    SciTech Connect

    Wołek, Karol; Cieplak, Marek

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones — a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles.

  13. Determination of contact maps in proteins: A combination of structural and chemical approaches.

    PubMed

    Wołek, Karol; Gómez-Sicilia, Àngel; Cieplak, Marek

    2015-12-28

    Contact map selection is a crucial step in structure-based molecular dynamics modelling of proteins. The map can be determined in many different ways. We focus on the methods in which residues are represented as clusters of effective spheres. One contact map, denoted as overlap (OV), is based on the overlap of such spheres. Another contact map, named Contacts of Structural Units (CSU), involves the geometry in a different way and, in addition, brings chemical considerations into account. We develop a variant of the CSU approach in which we also incorporate Coulombic effects such as formation of the ionic bridges and destabilization of possible links through repulsion. In this way, the most essential and well defined contacts are identified. The resulting residue-residue contact map, dubbed repulsive CSU (rCSU), is more sound in its physico-chemical justification than CSU. It also provides a clear prescription for validity of an inter-residual contact: the number of attractive atomic contacts should be larger than the number of repulsive ones - a feature that is not present in CSU. However, both of these maps do not correlate well with the experimental data on protein stretching. Thus, we propose to use rCSU together with the OV map. We find that the combined map, denoted as OV+rCSU, performs better than OV. In most situations, OV and OV+rCSU yield comparable folding properties but for some proteins rCSU provides contacts which improve folding in a substantial way. We discuss the likely residue-specificity of the rCSU contacts. Finally, we make comparisons to the recently proposed shadow contact map, which is derived from different principles. PMID:26723590

  14. Supersymmetric Higgs Bosons in Weak Boson Fusion

    SciTech Connect

    Hollik, Wolfgang; Plehn, Tilman; Rauch, Michael; Rzehak, Heidi

    2009-03-06

    We compute the complete supersymmetric next-to-leading-order corrections to the production of a light Higgs boson in weak-boson fusion. The size of the electroweak corrections is of similar order as the next-to-leading-order corrections in the standard model. The supersymmetric QCD corrections turn out to be significantly smaller than expected and than their electroweak counterparts. These corrections are an important ingredient to a precision analysis of the (supersymmetric) Higgs sector at the LHC, either as a known correction factor or as a contribution to the theory error.

  15. A hierarchical Bayesian-MAP approach to inverse problems in imaging

    NASA Astrophysics Data System (ADS)

    Raj, Raghu G.

    2016-07-01

    We present a novel approach to inverse problems in imaging based on a hierarchical Bayesian-MAP (HB-MAP) formulation. In this paper we specifically focus on the difficult and basic inverse problem of multi-sensor (tomographic) imaging, wherein the source object of interest is viewed from multiple directions by independent sensors. Given the measurements recorded by these sensors, the problem is to reconstruct the image (of the object) with a high degree of fidelity. We employ a probabilistic graphical modeling extension of the compound Gaussian distribution as a global image prior within a hierarchical Bayesian inference procedure. Since the prior employed by our HB-MAP algorithm is general enough to subsume a wide class of priors, including those typically employed in compressive sensing (CS) algorithms, the HB-MAP algorithm offers a vehicle to extend the capabilities of current CS algorithms to include truly global priors. After rigorously deriving the regression algorithm for solving our inverse problem from first principles, we demonstrate the performance of the HB-MAP algorithm on Monte Carlo trials and on real empirical data (natural scenes). In all cases we find that our algorithm outperforms previous approaches in the literature, including filtered back-projection and a variety of state-of-the-art CS algorithms. We conclude with directions of future research emanating from this work.
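    The Bayesian-MAP objective underlying approaches like this can be illustrated with the simplest possible prior. The sketch below solves argmin ||y - Ax||^2 + lam*||x||^2 by gradient descent, i.e. MAP reconstruction under a plain Gaussian prior (ridge regression); the paper's compound-Gaussian hierarchical prior is far richer, so this only shows the shape of the optimization, not the HB-MAP algorithm itself:

    ```python
    def map_estimate(A, y, lam=0.1, steps=500, lr=0.01):
        """MAP reconstruction sketch for y ~ A x with a Gaussian prior on x,
        minimizing ||y - Ax||^2 + lam * ||x||^2 by gradient descent.
        A is a list of rows; y is the measurement vector."""
        n = len(A[0])
        m = len(y)
        x = [0.0] * n
        for _ in range(steps):
            # residual r = A x - y
            r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
            # gradient of the objective: 2 A^T r + 2 lam x
            g = [2 * sum(A[i][j] * r[i] for i in range(m)) + 2 * lam * x[j]
                 for j in range(n)]
            x = [xj - lr * gj for xj, gj in zip(x, g)]
        return x
    ```

    For an identity forward operator the closed-form MAP solution is y / (1 + lam), which the iteration converges to.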

  16. Toward real-time three-dimensional mapping of surficial aquifers using a hybrid modeling approach

    NASA Astrophysics Data System (ADS)

    Friedel, Michael J.; Esfahani, Akbar; Iwashita, Fabio

    2016-02-01

    A hybrid modeling approach is proposed for near real-time three-dimensional (3D) mapping of surficial aquifers. First, airborne frequency-domain electromagnetic (FDEM) measurements are numerically inverted to obtain subsurface resistivities. Second, a machine-learning (ML) algorithm is trained using the FDEM measurements and inverted resistivity profiles, and borehole geophysical and hydrogeologic data. Third, the trained ML algorithm is used together with independent FDEM measurements to map the spatial distribution of the aquifer system. Efficacy of the hybrid approach is demonstrated for mapping a heterogeneous surficial aquifer and confining unit in northwestern Nebraska, USA. For this case, independent performance testing reveals that aquifer mapping is unbiased, with a strong correlation (0.94) between numerically inverted and ML-estimated binary (clay-silt or sand-gravel) layer resistivities (5-20 ohm-m or 21-5,000 ohm-m), and an intermediate correlation (0.74) for heterogeneous (clay, silt, sand, gravel) layer resistivities (5-5,000 ohm-m). The reduced correlation for the heterogeneous model is attributed to over-estimating the under-sampled high-resistivity gravels (about 0.5 % of the training data); when these are removed, the correlation increases (0.87). Independent analysis of the numerically inverted and ML-estimated resistivities finds that the hybrid procedure preserves both univariate and spatial statistics for each layer. Following training, the algorithms can map 3D surficial aquifers as fast as leveled FDEM measurements are presented to the ML network.
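    The second and third steps of the hybrid workflow (train on co-located FDEM/borehole data, then predict lithology from new FDEM soundings) can be sketched with a toy nearest-neighbour learner. The abstract does not name the ML algorithm, so the 1-NN classifier, the feature layout, and the labels below are all assumptions for illustration:

    ```python
    def train_knn(features, labels):
        """Store FDEM-derived feature vectors (e.g. inverted resistivities)
        with their lithology labels from borehole logs. A pure-Python 1-NN
        model stands in for the unspecified ML algorithm."""
        return list(zip(features, labels))

    def predict(model, x):
        """Label a new FDEM sounding by its nearest training sounding."""
        def dist2(u, v):
            return sum((a - b) ** 2 for a, b in zip(u, v))
        return min(model, key=lambda fv: dist2(fv[0], x))[1]
    ```

    A low-resistivity sounding would map to the clay-silt class and a high-resistivity one to sand-gravel, mirroring the binary layer classes in the abstract.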

  17. Engineering geological mapping in Wallonia (Belgium) : present state and recent computerized approach

    NASA Astrophysics Data System (ADS)

    Delvoie, S.; Radu, J.-P.; Ruthy, I.; Charlier, R.

    2012-04-01

    An engineering geological map can be defined as a geological map with a generalized representation of all the components of a geological environment that are required for spatial planning, design, construction and maintenance of civil engineering works. In Wallonia (Belgium), 24 engineering geological maps were developed between the 1970s and the 1990s at 1/5,000 or 1/10,000 scale, covering some areas of the most industrialized and urbanized cities (Liège, Charleroi and Mons). They were based on soil and subsoil data points (borings, drillings, penetration tests, geophysical tests, outcrops…). Some displayed data present the depth (with isoheights) or the thickness (with isopachs) of the different subsoil layers up to about 50 m depth. Information about the geomechanical properties of each subsoil layer, useful for engineers and urban planners, is also synthesized. However, these maps existed only on paper and progressively needed to be updated with new soil and subsoil data. The Public Service of Wallonia and the University of Liège have recently initiated a study to evaluate the feasibility of developing engineering geological mapping with a computerized approach. Numerous and various data (about soil and subsoil) are stored in a georelational database (the geotechnical database - using Access, Microsoft®). All the data are geographically referenced. The database is linked to a GIS project (using ArcGIS, ESRI®). Together, the database and the GIS project constitute a powerful tool for spatial data management and analysis. This approach involves a methodology using interpolation methods to update the previous maps and to extend the coverage to new areas. The location (x, y, z) of each subsoil layer is then computed from the data points. The geomechanical data of these layers are synthesized in an explanatory booklet accompanying the maps.

  18. Improved spatial accuracy of functional maps in the rat olfactory bulb using supervised machine learning approach.

    PubMed

    Murphy, Matthew C; Poplawsky, Alexander J; Vazquez, Alberto L; Chan, Kevin C; Kim, Seong-Gi; Fukuda, Mitsuhiro

    2016-08-15

    Functional MRI (fMRI) is a popular and important tool for noninvasive mapping of neural activity. As fMRI measures the hemodynamic response, the resulting activation maps do not perfectly reflect the underlying neural activity. The purpose of this work was to design a data-driven model to improve the spatial accuracy of fMRI maps in the rat olfactory bulb. This system is an ideal choice for this investigation since the bulb circuit is well characterized, allowing for an accurate definition of activity patterns in order to train the model. We generated models for both cerebral blood volume weighted (CBVw) and blood oxygen level dependent (BOLD) fMRI data. The results indicate that the spatial accuracy of the activation maps is either significantly improved or at worst not significantly different when using the learned models compared to a conventional general linear model approach, particularly for BOLD images and activity patterns involving deep layers of the bulb. Furthermore, the activation maps computed by CBVw and BOLD data show increased agreement when using the learned models, lending more confidence to their accuracy. The models presented here could have an immediate impact on studies of the olfactory bulb, but perhaps more importantly, demonstrate the potential for similar flexible, data-driven models to improve the quality of activation maps calculated using fMRI data. PMID:27236085

  19. Mapping susceptibility of rainfall-triggered shallow landslides using a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Liu, Chia-Nan; Wu, Chia-Chen

    2008-08-01

    Preparing a landslide susceptibility map is essential to identify hazardous regions, construct appropriate mitigation facilities, and plan emergency measures for a region prone to landslides triggered by rainfall. Conventional mapping methods require extensive information about past landslide records and the contributing terrain and rainfall. They also rely heavily on the quantity and quality of accessible information and on the subjectivity of the map builder. This paper contributes a systematic and quantitative assessment for mapping landslide hazards over a region. A Geographical Information System is used to retrieve relevant parameters from data layers, including the spatial distribution of transient fluid pressures, which is estimated using the TRIGRS program. The factor of safety of each pixel in the study region is calculated analytically. Monte Carlo simulation of the random variables is conducted to estimate the fluid pressure and factor of safety many times. The failure probability of each pixel is thus estimated. These procedures for mapping landslide potential are demonstrated in a case history. The analysis results reveal a positive correlation between landslide probability and accumulated rainfall. The simulation results compare well with field records: the locations and sizes of actual landslides are well predicted. An explanation for some of the inconsistencies is also provided, emphasizing the importance of site information for the accuracy of the mapping results.
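    The per-pixel Monte Carlo step can be sketched with the standard infinite-slope factor of safety, FS = (c' + (gamma*z*cos^2(beta) - u)*tan(phi)) / (gamma*z*sin(beta)*cos(beta)), failure being FS < 1. The parameter distributions below are invented for illustration; the study draws its pore pressures u from TRIGRS output rather than a uniform distribution:

    ```python
    import math
    import random

    def failure_probability(beta_deg, depth, n=20000, seed=1):
        """Monte Carlo failure probability for one pixel using the
        infinite-slope factor of safety. beta_deg: slope angle (degrees),
        depth: slip-surface depth z (m). Distributions are illustrative."""
        rng = random.Random(seed)
        b = math.radians(beta_deg)
        gamma = 19.0                                   # unit weight, kN/m^3
        failures = 0
        for _ in range(n):
            c = rng.gauss(5.0, 1.0)                    # cohesion c', kPa
            phi = math.radians(rng.gauss(30.0, 3.0))   # friction angle
            u = rng.uniform(0.0, 9.81 * depth * 0.5)   # pore pressure, kPa
            fs = (c + (gamma * depth * math.cos(b) ** 2 - u) * math.tan(phi)) \
                 / (gamma * depth * math.sin(b) * math.cos(b))
            failures += fs < 1.0
        return failures / n
    ```

    Running this over every pixel, with pore pressures updated per rainfall scenario, yields the probabilistic susceptibility map; steeper slopes give markedly higher failure probabilities, as expected.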

  20. Single-molecule approach to bacterial genomic comparisons via optical mapping.

    SciTech Connect

    Zhou, Shiguo; Kile, A.; Bechner, M.; Kvikstad, E.; Deng, W.; Wei, J.; Severin, J.; Runnheim, R.; Churas, C.; Forrest, D.; Dimalanta, E.; Lamers, C.; Burland, V.; Blattner, F. R.; Schwartz, David C.

    2004-01-01

    Modern comparative genomics has been established, in part, by the sequencing and annotation of a broad range of microbial species. To gain further insights, new sequencing efforts are now dealing with the variety of strains or isolates that gives a species its definition and range; however, their number vastly outstrips our ability to sequence them. Given the large number of available microbial species, new whole-genome approaches that maximize discovery must be developed to fully leverage this information at the level of strain diversity. Here, we describe how optical mapping, a single-molecule system, was used to identify and annotate chromosomal alterations between bacterial strains represented by several species. Since whole-genome optical maps are ordered restriction maps, sequenced strains of Shigella flexneri serotype 2a (2457T and 301), Yersinia pestis (CO 92 and KIM), and Escherichia coli were aligned as maps to identify regions of homology and to further characterize them as possible insertions, deletions, inversions, or translocations. Importantly, an unsequenced Shigella flexneri strain (serotype Y strain AMC[328Y]) was optically mapped and aligned with two sequenced ones to reveal one novel locus implicated in serotype conversion and several other loci containing insertion sequence elements or phage-related gene insertions. Our results suggest that genomic rearrangements and chromosomal breakpoints are readily identified and annotated against a prototypic sequenced strain by using the tools of optical mapping.

  1. Using Concept Mapping in Community-Based Participatory Research: A Mixed Methods Approach

    PubMed Central

    Windsor, Liliane Cambraia

    2015-01-01

    Community-based participatory research (CBPR) has been identified as a useful approach to increasing community involvement in research. Developing rigorous methods in conducting CBPR is an important step in gaining more support for this approach. The current article argues that concept mapping, a structured mixed methods approach, is useful in the initial development of a rigorous CBPR program of research aiming to develop culturally tailored and community-based health interventions for vulnerable populations. A research project examining social dynamics and consequences of alcohol and substance use in Newark, New Jersey, is described to illustrate the use of concept mapping methodology in CBPR. A total of 75 individuals participated in the study. PMID:26561484

  2. Large-extent digital soil mapping approaches for total soil depth

    NASA Astrophysics Data System (ADS)

    Mulder, Titia; Lacoste, Marine; Saby, Nicolas P. A.; Arrouays, Dominique

    2015-04-01

    Total soil depth (SDt) plays a key role in supporting various ecosystem services and properties, including plant growth, water availability and carbon stocks. Therefore, predictive mapping of SDt has been included as one of the deliverables within the GlobalSoilMap project. In this work SDt was predicted for France following the directions of GlobalSoilMap, which requires modelling at 90 m resolution. The first method, further referred to as DM, consisted of modelling the deterministic trend in SDt using data mining, followed by a bias correction and ordinary kriging of the residuals. Considering the total surface area of France, about 540,000 km2, the methods employed must be able to deal with large data sets. Therefore, a second method, multi-resolution kriging (MrK) for large datasets, was implemented. This method consisted of modelling the deterministic trend with a linear model, followed by interpolation of the residuals. For the two methods, the general trend was assumed to be explained by the biotic and abiotic environmental conditions, as described by the Soil-Landscape paradigm. The mapping accuracy was evaluated by an internal validation and its concordance with previous soil maps. In addition, the prediction interval for DM and the confidence interval for MrK were determined. Finally, the opportunities and limitations of both approaches were evaluated. The results showed consistency in mapped spatial patterns and a good prediction of the mean values. DM was better at predicting extreme values thanks to the bias correction, and was more powerful in capturing the deterministic trend than the linear model of the MrK approach. However, MrK was found to be more straightforward and flexible in delivering spatially explicit uncertainty measures. The validation indicated that DM was more accurate than MrK. Improvements for DM may be expected by predicting soil depth classes. MrK shows potential for modelling beyond the country level, at high

  3. Flood inundation mapping uncertainty introduced by topographic data accuracy, geometric configuration and modeling approach

    NASA Astrophysics Data System (ADS)

    Papaioannou, G.; Loukas, Athanasios

    2010-05-01

    Floodplain modeling is a relatively new method in the river engineering discipline and is essential for the prediction of flood hazards. The issue of flood inundation of upland environments with topographically complex floodplains is an understudied subject. In most areas of the U.S.A., the use of topographic information derived from Light Detection and Ranging (LIDAR) has improved the quality of river flood inundation predictions. However, such high-quality topographical data are not available in most countries, and the necessary information is obtained by topographical survey and/or topographical maps. Furthermore, the optimum dimensionality of hydraulic models, cross-section configuration in one-dimensional (1D) models, mesh resolution in two-dimensional (2D) models, and the overall modeling approach are not well studied or documented. All these factors introduce significant uncertainty into the evaluation of floodplain zoning. This study addresses some of these issues by comparing flood inundation maps developed using different topography, geometric descriptions and modeling approaches. The methodology involves the use of topographic datasets with different horizontal resolutions, vertical accuracies and bathymetry details. Each topographic dataset is used to create a flood inundation map for different cross-section configurations using the 1D HEC-RAS model, and for different mesh resolutions using 2D models, for steady-state and unsteady-state conditions. Comparison of the resulting maps indicates the uncertainty introduced into floodplain modeling by the horizontal resolution and vertical accuracy of topographic data and by the different modeling approaches.

  4. DREAM--a novel approach for robust, ultrafast, multislice B₁ mapping.

    PubMed

    Nehrke, Kay; Börnert, Peter

    2012-11-01

    A novel multislice B₁-mapping method dubbed dual refocusing echo acquisition mode (DREAM) is proposed, able to cover the whole transmit coil volume in only one second, which is more than an order of magnitude faster than established approaches. The DREAM technique employs a stimulated echo acquisition mode (STEAM) preparation sequence followed by a tailored single-shot gradient echo sequence, simultaneously measuring the stimulated echo and the free induction decay (FID) as gradient-recalled echoes, and determining the actual flip angle of the STEAM preparation radiofrequency pulses from the ratio of the two measured signals. Owing to an elaborate timing scheme, the method is insensitive to susceptibility/chemical shift effects and can deliver a B₀ phase map and a transceive phase map for free. The approach has only a weak T₁ and T₂ dependence and, moreover, imposes only a low specific absorption rate (SAR) burden. The accuracy of the method with respect to systematic and statistical errors is investigated both theoretically and in phantom experiments. In addition, the performance of the approach is demonstrated in vivo in B₁-mapping and radiofrequency shimming experiments on the abdomen, the legs, and the head on an eight-channel parallel transmit 3 T MRI system. PMID:22252850
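    The flip-angle calculation from the ratio of the two measured signals reduces, in the idealized relaxation-free DREAM signal model, to alpha = arctan(sqrt(2 * S_STE / S_FID)). A one-line Python sketch (the idealization, i.e. neglecting relaxation and slice-profile effects, is an assumption of this illustration):

    ```python
    import math

    def dream_flip_angle(ste, fid):
        """Flip angle (degrees) of the STEAM preparation pulses from the
        stimulated-echo and FID signal magnitudes, using the idealized
        DREAM relation alpha = atan(sqrt(2 * STE / FID))."""
        return math.degrees(math.atan(math.sqrt(2.0 * ste / fid)))
    ```

    For a 45° preparation pulse the ideal STE/FID ratio is tan²(45°)/2 = 0.5, so a measured ratio of 0.5 maps back to 45°.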

  5. Mutual Composite Fermion and Composite Boson approaches to balanced and imbalanced bi-layer quantum Hall system: An electronic analogy of the Helium 4 system

    SciTech Connect

    Ye Jinwu

    2008-03-15

    We use both the Mutual Composite Fermion (MCF) and Composite Boson (CB) approaches to study balanced and imbalanced bi-layer quantum Hall systems (BLQH) and make critical comparisons between the two approaches. We find the CB approach superior to the MCF approach for studying ground states with different kinds of broken symmetries. In the phase representation of the CB theory, we first study the excitonic superfluid (ESF) state. The theory puts the spin and charge degrees of freedom on the same footing, explicitly brings out the spin-charge connection, and classifies all the possible excitations in a systematic way. Then, in the dual density representation of the CB theory, we study possible intermediate phases as the layer distance increases. We propose that there are two critical distances d_c1 < d_c2 and three phases as the distance increases. When 0 < d < d_c1, the system is in the ESF state, which breaks the internal U(1) symmetry; when d_c1 < d < d_c2, the system is in a pseudo-spin density wave (PSDW) state, which breaks translational symmetry, and there is a first-order transition at d_c1 driven by the collapse of the magneto-roton minimum at a finite wavevector in the pseudo-spin channel. When d_c2 < d < ∞, the system becomes two weakly coupled ν = 1/2 Composite Fermion Fermi Liquid (FL) states; there is also a first-order transition at d = d_c2. We construct a quantum Ginzburg-Landau (QGL) action to describe the transition from the ESF to the PSDW, which break two completely different symmetries. Using the QGL action, we explicitly show that the PSDW takes a square lattice and analyze in detail the properties of the PSDW at zero and finite temperature. We also suggest that the correlated hopping of vacancies in the active and passive layers in the PSDW state leads to the very large and temperature-dependent drag consistent with the experimental data. Then we study the effects of imbalance on both the ESF and the PSDW. On the ESF side, the system supports

  6. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and each, as well as the joint estimate of flood risk, is affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure to estimate the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the risk the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps where the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
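    The bivariate idea (jointly sampling correlated flood peaks and volumes through a copula) can be sketched with a Gaussian copula and Gumbel marginals. The copula family, the correlation, and the marginal parameters below are invented for illustration; the study fits its copula to observed flood events:

    ```python
    import math
    import random

    def norm_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def gumbel_inv(u, mu, beta):
        """Inverse Gumbel CDF, a common marginal for flood peaks and volumes."""
        return mu - beta * math.log(-math.log(u))

    def sample_peak_volume(n, rho=0.8, seed=7):
        """Gaussian-copula sketch of jointly sampling flood peak discharge
        (m^3/s) and flood volume (10^6 m^3). All parameters are assumed."""
        rng = random.Random(seed)
        events = []
        for _ in range(n):
            z1 = rng.gauss(0, 1)
            z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
            u1, u2 = norm_cdf(z1), norm_cdf(z2)   # correlated uniforms
            peak = gumbel_inv(u1, mu=150.0, beta=40.0)
            volume = gumbel_inv(u2, mu=12.0, beta=3.0)
            events.append((peak, volume))
        return events
    ```

    Each sampled (peak, volume) pair defines one synthetic design hydrograph for the 2D hydraulic model, so the dependence between the two variables is preserved rather than fixed by a single return period.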

  7. An Integrative Network Approach to Map the Transcriptome to the Phenome

    PubMed Central

    Mehan, Michael R.; Nunez-Iglesias, Juan; Kalakrishnan, Mrinal; Waterman, Michael S.

    2009-01-01

    Although many studies have been successful in the discovery of cooperating groups of genes, mapping these groups to phenotypes has proved a much more challenging task. In this article, we present the first genome-wide mapping of gene coexpression modules onto the phenome. We annotated coexpression networks from 136 microarray datasets with phenotypes from the Unified Medical Language System (UMLS). We then designed an efficient graph-based simulated annealing approach to identify coexpression modules frequently and specifically occurring in datasets related to individual phenotypes. By requiring phenotype-specific recurrence, we ensure the robustness of our findings. We discovered 118,772 modules specific to 42 phenotypes, and developed validation tests combining Gene Ontology, GeneRIF and UMLS. Our method is generally applicable to any kind of abundant network data with defined phenotype association, and thus paves the way for genome-wide, gene network-phenotype maps. PMID:19630539

  8. Mapping quantitative trait loci in complex pedigrees: a two-step variance component approach.

    PubMed Central

    George, A W; Visscher, P M; Haley, C S

    2000-01-01

    There is a growing need for the development of statistical techniques capable of mapping quantitative trait loci (QTL) in general outbred animal populations. Presently used variance component methods, which correctly account for the complex relationships that may exist between individuals, are challenged by the difficulties incurred through unknown marker genotypes, inbred individuals, partially known or unknown marker phases, and multigenerational data. In this article, a two-step variance component approach that enables practitioners to routinely map QTL in populations with the aforementioned difficulties is explored. The performance of the QTL mapping methodology is assessed via its application to simulated data. The capacity of the technique to accurately estimate parameters is examined for a range of scenarios. PMID:11102397

  9. A practical and automated approach to large area forest disturbance mapping with remote sensing.

    PubMed

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of misclassification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover
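
    Step ii) of the procedure above, thresholding a SWIR difference image by standard deviations from the mean, can be sketched as follows. This is a simplified illustration: global image statistics stand in for the paper's local-window histograms, and the sign convention (disturbance raising the SWIR difference) is an assumption that depends on how the difference image is formed.

```python
import numpy as np

def training_pixels(swir_diff, k=2.0):
    """Label candidate training pixels from a SWIR difference image.

    Pixels more than k standard deviations above the mean are labelled
    candidate disturbance, pixels more than k below are labelled candidate
    other change, and everything else stays unlabelled.
    """
    mu, sigma = swir_diff.mean(), swir_diff.std()
    labels = np.zeros(swir_diff.shape, dtype=np.int8)  # 0 = unlabelled
    labels[swir_diff > mu + k * sigma] = 1             # candidate disturbance
    labels[swir_diff < mu - k * sigma] = 2             # candidate other change
    return labels
```

    The labelled pixels would then pass through the cross-validated filtering of step iii) before training the final supervised classifier.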

  10. Automated mapping of glacial overdeepenings beneath contemporary ice sheets: Approaches and potential applications

    NASA Astrophysics Data System (ADS)

    Patton, Henry; Swift, Darrel A.; Clark, Chris D.; Livingstone, Stephen J.; Cook, Simon J.; Hubbard, Alun

    2015-03-01

    Awareness is growing of the significance of overdeepenings in ice sheet systems. However, a complete understanding of overdeepening formation is lacking, meaning observations of overdeepening location and morphometry are urgently required to motivate process understanding. Subject to the development of appropriate mapping approaches, high-resolution subglacial topography data sets covering the whole of Antarctica and Greenland offer significant potential to acquire such observations and to relate overdeepening characteristics to ice sheet parameters. We explore a possible method for mapping overdeepenings beneath the Antarctic and Greenland ice sheets and illustrate a potential application of this approach by testing a possible relationship between overdeepening elongation ratio and ice sheet flow velocity. We find that hydrological and terrain filtering approaches are unsuited to mapping overdeepenings and develop a novel rule-based GIS methodology that delineates overdeepening perimeters by analysis of closed-contour properties. We then develop GIS procedures that provide information on overdeepening morphology and topographic context. Limitations in the accuracy and resolution of bed-topography data sets mean that application to glaciological problems requires consideration of quality-control criteria to (a) remove potentially spurious depressions and (b) reduce uncertainties that arise from the inclusion of depressions of nonglacial origin, or those in regions where empirical data are sparse. To address the problem of overdeepening elongation, potential quality-control criteria are introduced, and discussion of this example serves to highlight the limitations that mapping approaches - and applications of such approaches - must confront. We predict that improvements in bed-data quality will reduce the need for quality-control procedures and facilitate increasingly robust insights from empirical data.

  11. A Practical and Automated Approach to Large Area Forest Disturbance Mapping with Remote Sensing

    PubMed Central

    Ozdogan, Mutlu

    2014-01-01

    In this paper, I describe a set of procedures that automate forest disturbance mapping using a pair of Landsat images. The approach is built on the traditional pair-wise change detection method, but is designed to extract training data without user interaction and uses a robust classification algorithm capable of handling incorrectly labeled training data. The steps in this procedure include: i) creating masks for water, non-forested areas, clouds, and cloud shadows; ii) identifying training pixels whose value is above or below a threshold defined by the number of standard deviations from the mean value of the histograms generated from local windows in the short-wave infrared (SWIR) difference image; iii) filtering the original training data through a number of classification algorithms using an n-fold cross validation to eliminate mislabeled training samples; and finally, iv) mapping forest disturbance using a supervised classification algorithm. When applied to 17 Landsat footprints across the U.S. at five-year intervals between 1985 and 2010, the proposed approach produced forest disturbance maps with 80 to 95% overall accuracy, comparable to those obtained from traditional approaches to forest change detection. The primary sources of misclassification errors included inaccurate identification of forests (errors of commission), issues related to the land/water mask, and clouds and cloud shadows missed during image screening. The approach requires images from the peak growing season, at least for the deciduous forest sites, and cannot readily distinguish forest harvest from natural disturbances or other types of land cover change. The accuracy of detecting forest disturbance diminishes with the number of years between the images that make up the image pair. 
Nevertheless, the relatively high accuracies, little or no user input needed for processing, speed of map production, and simplicity of the approach make the new method especially practical for forest cover

  12. High-resolution geologic mapping of the inner continental shelf: Boston Harbor and approaches, Massachusetts

    USGS Publications Warehouse

    Ackerman, Seth D.; Butman, Bradford; Barnhardt, Walter A.; Danforth, William W.; Crocker, James M.

    2006-01-01

    This report presents the surficial geologic framework data and information for the sea floor of Boston Harbor and Approaches, Massachusetts (fig. 1.1). This mapping was conducted as part of a cooperative program between the U.S. Geological Survey (USGS), the Massachusetts Office of Coastal Zone Management (CZM), and the National Oceanic and Atmospheric Administration (NOAA). The primary objective of this project was to provide sea floor geologic information and maps of Boston Harbor to aid resource management, scientific research, industry and the public. A secondary objective was to test the feasibility of using NOAA hydrographic survey data, normally collected to update navigation charts, to create maps of the sea floor suitable for geologic and habitat interpretations. Defining sea-floor geology is the first step toward managing ocean resources and assessing environmental changes due to natural or human activity. The geophysical data for these maps were collected as part of hydrographic surveys carried out by NOAA in 2000 and 2001 (fig. 1.2). Bottom photographs, video, and samples of the sediments were collected in September 2004 to help in the interpretation of the geophysical data. Included in this report are high-resolution maps of the sea floor, at a scale of 1:25,000; the data used to create these maps in Geographic Information Systems (GIS) format; a GIS project; and a gallery of photographs of the sea floor. Companion maps of the sea floor to the north of Boston Harbor and Approaches are presented by Barnhardt and others (2006) and to the east by Butman and others (2003a,b,c). See Butman and others (2004) for a map of Massachusetts Bay at a scale of 1:125,000. The sections of this report are listed in the navigation bar along the left-hand margin of this page. Section 1 (this section) introduces the report. Section 2 presents the large-format map sheets. Section 3 describes data collection, processing, and analysis. 
Section 4 summarizes the geologic history of

  13. Global land cover mapping at 30 m resolution: A POK-based operational approach

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Chen, Jin; Liao, Anping; Cao, Xin; Chen, Lijun; Chen, Xuehong; He, Chaoying; Han, Gang; Peng, Shu; Lu, Miao; Zhang, Weiwei; Tong, Xiaohua; Mills, Jon

    2015-05-01

    Global Land Cover (GLC) information is fundamental for environmental change studies, land resource management, sustainable development, and many other societal benefits. Although GLC data exist at spatial resolutions of 300 m and 1000 m, a 30 m resolution mapping approach is now a feasible option for the next generation of GLC products. Since most significant human impacts on the land system can be captured at this scale, a number of researchers are focusing on such products. This paper reports the operational approach used in such a project, which aims to deliver reliable data products. Over 10,000 Landsat-like satellite images are required to cover the entire Earth at 30 m resolution. To derive a GLC map from such a large volume of data necessitates the development of effective, efficient, economic and operational approaches. Automated approaches usually provide higher efficiency and thus more economic solutions, yet existing automated classification has been deemed ineffective because of the low classification accuracy achievable (typically below 65%) at global scale at 30 m resolution. As a result, an approach based on the integration of pixel- and object-based methods with knowledge (POK-based) has been developed. To handle the classification process of 10 land cover types, a split-and-merge strategy was employed: first, each class is identified in a prioritized sequence, and then the results are merged together. For the identification of each class, a robust integration of pixel- and object-based classification was developed. To improve the quality of the classification results, a knowledge-based interactive verification procedure was developed with the support of web service technology. The performance of the POK-based approach was tested using eight selected areas with differing landscapes from five different continents. An overall classification accuracy of over 80% was achieved. 
This indicates that the developed POK-based approach is effective and feasible

  14. Elastographic mapping in optical coherence tomography using an unconventional approach based on correlation stability.

    PubMed

    Zaitsev, Vladimir Y; Matveev, Lev A; Matveyev, Alexandr L; Gelikonov, Grigory V; Gelikonov, Valentin M

    2014-02-01

    An approach to elastographic mapping in optical coherence tomography (OCT) based on comparing the correlation stability of sequentially obtained intensity OCT images of the studied strained tissue is discussed. The basic idea is that the OCT image is distorted to a smaller degree in stiffer regions. Consequently, cross-correlation maps obtained with compensation of trivial translational motion of the image parts, using a sliding correlation window, can represent the spatial distribution of the relative tissue stiffness. An important advantage of the proposed approach is that it avoids the stage of local-strain reconstruction via error-sensitive numerical differentiation of experimentally determined displacements. Another advantage is that the correlation stability (CS) approach intrinsically implies that cross-correlation is already strongly decreased for deformed softer tissue regions, in contrast to approaches based on initial reconstruction of displacements. This feature gives the proposed approach a much wider strain range of operability and is favorable for its free-hand implementation, using the OCT probe itself to deform the tissue. The CS approach can be implemented using either image elements reflecting the morphological structure of the tissue or speckle-level cross-correlation. Examples of numerical simulations and experimental demonstrations using both phantom samples and OCT images obtained in vivo are presented. PMID:24042446
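
    The core of the correlation-stability idea, a per-window normalized cross-correlation between two intensity images, can be sketched as below. This is a simplified illustration: it omits the compensation of bulk translational motion that the authors apply before correlating, and the window size is an arbitrary choice.

```python
import numpy as np

def correlation_stability_map(img_a, img_b, win=16):
    """Normalized cross-correlation per non-overlapping window.

    High values mark regions that deform little between acquisitions
    (stiffer tissue in the CS interpretation); low values mark regions
    where straining has decorrelated the image (softer tissue).
    """
    h, w = img_a.shape
    out = np.zeros((h // win, w // win))
    for i in range(h // win):
        for j in range(w // win):
            a = img_a[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
            b = img_b[i*win:(i+1)*win, j*win:(j+1)*win].ravel()
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a @ a) * (b @ b))
            out[i, j] = (a @ b) / denom if denom > 0 else 0.0
    return out
```

    The resulting map is a relative stiffness indicator only; no local-strain differentiation is needed, which is the robustness advantage the abstract emphasizes.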

  15. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce.

    PubMed

    Idris, Muhammad; Hussain, Shujaat; Siddiqi, Muhammad Hameed; Hassan, Waseem; Syed Muhammad Bilal, Hafiz; Lee, Sungyoung

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR support execution of only a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. MRPack uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis show significant performance improvement. PMID:26305223
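
    The multi-key idea, several related algorithms sharing one job by tagging intermediate keys with an algorithm identifier, can be illustrated with a toy in-process MapReduce runner. The two mapper functions and the composite-key scheme below are illustrative stand-ins, not MRPack's actual API.

```python
from collections import defaultdict

def word_count_map(line):
    # Emits composite keys (algorithm_id, key) so intermediate data from
    # different algorithms can share one shuffle without colliding.
    for w in line.split():
        yield ("wordcount", w), 1

def char_count_map(line):
    yield ("charcount", "total"), len(line)

def run_job(lines, mappers):
    groups = defaultdict(list)
    for line in lines:                 # map phase: one data pass, all mappers
        for mapper in mappers:
            for key, value in mapper(line):
                groups[key].append(value)
    # shuffle is implicit in the grouping; reduce phase sums per composite key
    return {key: sum(vals) for key, vals in groups.items()}

result = run_job(["a b a", "b c"], [word_count_map, char_count_map])
```

    The payoff mirrored here is that the input is scanned once for all algorithms, rather than once per algorithm as in separate MR jobs.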

  16. MRPack: Multi-Algorithm Execution Using Compute-Intensive Approach in MapReduce

    PubMed Central

    2015-01-01

    Large quantities of data have been generated from multiple sources at exponential rates in the last few years. These data are generated at high velocity as real-time and streaming data in a variety of formats. These characteristics give rise to challenges in their modeling, computation, and processing. Hadoop MapReduce (MR) is a well-known data-intensive distributed processing framework that uses the distributed file system (DFS) for Big Data. Current implementations of MR support execution of only a single algorithm in the entire Hadoop cluster. In this paper, we propose MapReducePack (MRPack), a variation of MR that supports execution of a set of related algorithms in a single MR job. We exploit the computational capability of a cluster by increasing the compute-intensiveness of MapReduce while maintaining its data-intensive approach. MRPack uses the available computing resources by dynamically managing the task assignment and intermediate data. Intermediate data from multiple algorithms are managed using multi-key and skew-mitigation strategies. The performance study of the proposed system shows that it is time, I/O, and memory efficient compared to the default MapReduce. The proposed approach reduces the execution time by 200% with an approximate 50% decrease in I/O cost. Complexity and qualitative results analysis show significant performance improvement. PMID:26305223

  17. Turkers in Africa: A Crowdsourcing Approach to Improving Agricultural Landcover Maps

    NASA Astrophysics Data System (ADS)

    Estes, L. D.; Caylor, K. K.; Choi, J.

    2012-12-01

    In the coming decades a substantial portion of Africa is expected to be transformed to agriculture. The scale of this conversion may match or exceed that which occurred in the Brazilian Cerrado and Argentinian Pampa in recent years. Tracking the rate and extent of this conversion will depend on having an accurate baseline of the current extent of croplands. Continent-wide baseline data do exist, but the accuracy of these relatively coarse resolution, remotely sensed assessments is suspect in many regions. To develop more accurate maps of the distribution and nature of African croplands, we develop a distributed "crowdsourcing" approach that harnesses human eyeballs and image interpretation capabilities. Our initial goal is to assess the accuracy of existing agricultural land cover maps, but ultimately we aim to generate "wall-to-wall" cropland maps that can be revisited and updated to track agricultural transformation. Our approach utilizes the freely available, high-resolution satellite imagery provided by Google Earth, combined with Amazon.com's Mechanical Turk platform, an online service that provides a large, global pool of workers (known as "Turkers") who perform "Human Intelligence Tasks" (HITs) for a fee. Using open-source R and Python software, we select a random sample of 1 km2 cells from a grid placed over our study area, stratified by field density classes drawn from one of the coarse-scale land cover maps, and send these in batches to Mechanical Turk for processing. Each Turker is required to complete an initial training session, on the basis of which they are assigned an accuracy score that determines whether the Turker is allowed to proceed with mapping tasks. Completed mapping tasks are automatically retrieved and processed on our server, and subject to two further quality control measures. The first of these is a measure of the spatial accuracy of Turker-mapped areas compared to "gold standard" maps from selected locations that are randomly
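
    The sampling step described above, drawing grid cells at random within each field-density stratum, can be sketched as follows. The grid size, the three density classes, and the per-class sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stratification layer: each 1 km^2 cell carries a field-density class
# taken from a coarse land cover map (0 = low, 1 = medium, 2 = high).
density_class = rng.integers(0, 3, size=(100, 100))

def stratified_sample(strata, n_per_class):
    """Draw n_per_class cells uniformly at random within each stratum."""
    picks = {}
    for c in np.unique(strata):
        rows, cols = np.nonzero(strata == c)
        idx = rng.choice(rows.size, size=n_per_class, replace=False)
        picks[int(c)] = list(zip(rows[idx].tolist(), cols[idx].tolist()))
    return picks

batches = stratified_sample(density_class, n_per_class=20)
```

    Each sampled cell would then be packaged as one Mechanical Turk HIT; stratifying by density class keeps rare, field-dense landscapes represented in the sample.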

  18. A quasi-classical mapping approach to vibrationally coupled electron transport in molecular junctions

    SciTech Connect

    Li, Bin; Miller, William H.; Wilner, Eli Y.; Thoss, Michael

    2014-03-14

    We develop a classical mapping approach suitable to describe vibrationally coupled charge transport in molecular junctions based on the Cartesian mapping for many-electron systems [B. Li and W. H. Miller, J. Chem. Phys. 137, 154107 (2012)]. To properly describe vibrational quantum effects in the transport characteristics, we introduce a simple transformation rewriting the Hamiltonian in terms of occupation numbers and use a binning function to facilitate quantization. The approach provides accurate results for the nonequilibrium Holstein model for a range of bias voltages, vibrational frequencies, and temperatures. It also captures the hallmarks of vibrational quantum effects apparent in step-like structure in the current-voltage characteristics at low temperatures as well as the phenomenon of Franck-Condon blockade.

  19. Conceptualizing Stakeholders' Perceptions of Ecosystem Services: A Participatory Systems Mapping Approach

    NASA Astrophysics Data System (ADS)

    Lopes, Rita; Videira, Nuno

    2015-12-01

    A participatory system dynamics modelling approach is advanced to support conceptualization of the feedback processes underlying ecosystem services and to foster a shared understanding of leverage intervention points. The process includes a systems mapping workshop and follow-up tasks aimed at the collaborative construction of causal loop diagrams. A case study developed in a natural area in Portugal illustrates how a stakeholder group was actively engaged in the development of a conceptual model depicting policies for sustaining the climate regulation ecosystem service.

  20. Putting people on the map through an approach that integrates social data in conservation planning.

    PubMed

    Stephanson, Sheri L; Mascia, Michael B

    2014-10-01

    Conservation planning is integral to strategic and effective operations of conservation organizations. Drawing upon the biological sciences, conservation planning has historically made limited use of social data. We offer an approach for integrating data on social well-being into conservation planning that captures and places into context the spatial patterns and trends in human needs and capacities. This hierarchical approach provides a nested framework for characterizing and mapping data on social well-being in 5 domains: economic well-being, health, political empowerment, education, and culture. These 5 domains each have multiple attributes; each attribute may be characterized by one or more indicators. Through existing or novel data that display spatial and temporal heterogeneity in social well-being, conservation scientists, planners, and decision makers may measure, benchmark, map, and integrate these data within conservation planning processes. Selecting indicators and integrating these data into conservation planning is an iterative, participatory process tailored to the local context and planning goals. Social well-being data complement biophysical and threat-oriented social data within conservation planning processes to inform decisions regarding where and how to conserve biodiversity, to provide a structure for exploring socioecological relationships, and to foster adaptive management. Building upon existing conservation planning methods and insights from multiple disciplines, this approach to putting people on the map can readily merge with current planning practices to facilitate more rigorous decision making. PMID:25102957

  1. An approach to mapping haplotype-specific recombination sites in human MHC class III

    SciTech Connect

    Levo, A.; Westman, P.; Partanen, J.

    1996-12-31

    Studies of the major histocompatibility complex (MHC) in mouse indicate that recombination sites are not randomly distributed and that their occurrence is haplotype-dependent. No data concerning haplotype-specific recombination sites in humans are available due to the low number of informative families. To investigate haplotype-specific recombination sites in the human MHC, we describe an approach based on identification of recombinant haplotypes derived from one conserved haplotype at the population level. The recombination sites were mapped by comparing polymorphic markers between the recombinant and assumed original haplotypes. We tested this approach on the extended haplotype HLA A3; B47; Bf*F; C4A*1; C4B*Q0; DR7, which is most suitable for this analysis. First, it carries a number of rare markers, and second, the haplotype, albeit rare in the general population, is frequent in patients with 21-hydroxylase (21OH) defect. We observed recombinants derived from this haplotype in patients with 21OH defect. All these haplotypes had the centromeric part (from Bf to DR) identical to the original haplotype, but they differed in HLA A and B. We therefore assumed that they underwent recombination in the segment that separates the Bf and HLA B genes. Polymorphic markers indicated that all break points mapped to two segments near the TNF locus. This approach makes possible the mapping of preferential recombination sites in different haplotypes. 20 refs., 1 fig., 1 tab.

  2. A simple approach to using an amorphous silicon EPID to verify IMRT planar dose maps.

    PubMed

    Lee, Christopher; Menk, Fred; Cadman, Patrick; Greer, Peter B

    2009-03-01

    A simplified method of verifying intensity modulated radiation therapy (IMRT) fields using a Varian aS500 amorphous silicon electronic portal imaging device (EPID) is demonstrated. Unlike previous approaches, it does not involve time-consuming or complicated analytical processing of the data. The central axis pixel response of the EPID, as well as the profile characteristics obtained from images acquired with a 6 MV photon beam, was examined as a function of field size. Ion chamber measurements at various depths in a water phantom were then collected, and it was found that at a specific depth d(ref), the dose response and profile characteristics closely matched the results of the EPID analysis. The only manipulation required to be performed on the EPID images was the multiplication by a matrix of off-axis ratio values to remove the effect of the flood field calibration. Similarly, d(ref) was found for 18 MV. Planar dose maps at d(ref) in a water phantom for a bar pattern, a strip pattern, and 14 clinical IMRT fields from two patient cases, each from a separate anatomical region (head and neck, and pelvis), for both energies were generated by the Pinnacle planning system (V7.4). EPID images of these fields were acquired, converted to planar dose maps, and compared directly with the Pinnacle planar dose maps. Radiographic film dosimetry and a MapCHECK dosimetry device (Sun Nuclear Corporation, Melbourne, FL) were used as an independent verification of the dose distribution. Gamma analysis of the EPID, film, and Pinnacle planar dose maps generated for the clinical IMRT fields showed that approximately 97% of all points passed using a 3% dose/3 mm DTA tolerance test. Based on the range of fields studied, the authors' results appear to justify using this approach as a method to verify dose distributions calculated on a treatment planning system, including complex intensity modulated fields. PMID:19378759

  3. An assessment of a collaborative mapping approach for exploring land use patterns for several European metropolises

    NASA Astrophysics Data System (ADS)

    Jokar Arsanjani, Jamal; Vaz, Eric

    2015-03-01

    Until recently, land surveys and digital interpretation of remotely sensed imagery have been used to generate land use inventories. These techniques, however, are often cumbersome and costly, incurring large technical and temporal costs. The technological advances of Web 2.0 have brought a wide array of achievements that stimulate the participatory role in collaborative and crowdsourced mapping products. This has been fostered by GPS-enabled devices and accessible tools that enable visual interpretation of high resolution satellite images/air photos provided in collaborative mapping projects. Such technologies offer an integrative approach to geography by means of promoting public participation and allowing accurate assessment and classification of land use as well as geographical features. OpenStreetMap (OSM) has supported the evolution of such techniques, contributing to the existence of a large inventory of spatial land use information. This paper explores the introduction of this novel participatory phenomenon for land use classification in Europe's metropolitan regions. We adopt a positivistic approach to comparatively assess the accuracy of OSM contributions to land use classification in seven large European metropolitan regions. Thematic accuracy and degree of completeness of OSM data were compared to available Global Monitoring for Environment and Security Urban Atlas (GMESUA) datasets for the chosen metropolises. We further extend our findings of land use within a novel framework for geography, arguing that volunteered geographic information (VGI) sources are of great benefit for land use mapping, depending on location and degree of VGI dynamism, and offer a strong alternative to traditional mapping techniques for metropolitan regions throughout Europe. Evaluation of several land use types at the local level suggests that a number of OSM classes (such as anthropogenic land use, agricultural and some natural environment

  4. Mapping Ds insertions in barley using a sequence-based approach.

    PubMed

    Cooper, L D; Marquez-Cedillo, L; Singh, J; Sturbaum, A K; Zhang, S; Edwards, V; Johnson, K; Kleinhofs, A; Rangel, S; Carollo, V; Bregitzer, P; Lemaux, P G; Hayes, P M

    2004-09-01

    A transposon tagging system, based upon maize Ac/Ds elements, was developed in barley (Hordeum vulgare subsp. vulgare). The long-term objective of this project is to identify a set of lines with Ds insertions dispersed throughout the genome as a comprehensive tool for gene discovery and reverse genetics. AcTPase and Ds-bar elements were introduced into immature embryos of Golden Promise by biolistic transformation. Subsequent transposition and segregation of Ds away from AcTPase and the original site of integration resulted in new lines, each containing a stabilized Ds element in a new location. The sequence of the genomic DNA flanking the Ds elements was obtained by inverse PCR and TAIL-PCR. Using a sequence-based mapping strategy, we determined the genome locations of the Ds insertions in 19 independent lines using primarily restriction digest-based assays of PCR-amplified single nucleotide polymorphisms and PCR-based assays of insertions or deletions. The principal strategy was to identify and map sequence polymorphisms in the regions corresponding to the flanking DNA using the Oregon Wolfe Barley mapping population. The mapping results obtained by the sequence-based approach were confirmed by RFLP analyses in four of the lines. In addition, cloned DNA sequences corresponding to the flanking DNA were used to assign map locations to Morex-derived genomic BAC library inserts, thus integrating genetic and physical maps of barley. BLAST search results indicate that the majority of the transposed Ds elements are found within predicted or known coding sequences. Transposon tagging in barley using Ac/Ds thus promises to provide a useful tool for studies on the functional genomics of the Triticeae. PMID:15449176

  5. A geomorphological mapping approach for the assessment of seabed geohazards and risk

    NASA Astrophysics Data System (ADS)

    Hough, Gayle; Green, Jennifer; Fish, Paul; Mills, Andy; Moore, Roger

    2011-03-01

    Exploration and development of offshore hydrocarbon resources has advanced into remote deepwater regions over the last decade and poses significant technical challenges for the design and installation of wells and facilities at extreme water depths. Seafloor and shallow subsurface processes and conditions in these areas are complex and generally poorly understood, and the geohazards to development are larger scale and fundamentally different to those encountered onshore; consequently the geohazard risk to deepwater development projects is potentially significant and requires careful evaluation and mitigation during the front-end planning and engineering design stages of projects. There are no established industry standards or methods for the assessment of geohazards and engineering-quality geophysical data at the scale of development. The paper describes an integrated and systematic map-based approach for the assessment and mitigation of seabed geohazards and risk to proposed deepwater development. The approach employs a multi-disciplinary team working with engineering-quality field calibrated data to accurately map and assess seafloor ground conditions and ensure that development proposals are not exposed to intolerable geohazard risk. The approach taken is very similar to the practice of establishing geological models for land-based engineering projects, in which the complete geological history of the site is used to characterise and predict the performance of the ground. Such an approach is routine for major projects on land but so far does not seem to be common practice in the offshore industry. The paper illustrates the seafloor geomorphological mapping approach developed. The products are being used to optimise development layouts to avoid geohazards where possible and to support site-specific engineering design of facilities based on a detailed understanding of the potential geohazard loadings and associated risk.

  6. Simulating spin-boson models with matrix product states

    NASA Astrophysics Data System (ADS)

    Wall, Michael; Safavi-Naini, Arghavan; Rey, Ana Maria

    2016-05-01

    The global coupling of few-level quantum systems ("spins") to a discrete set of bosonic modes is a key ingredient for many applications in quantum science, including large-scale entanglement generation, quantum simulation of the dynamics of long-range interacting spin models, and hybrid platforms for force and spin sensing. In many situations, the bosons are integrated out, leading to effective long-range interactions between the spins; however, strong spin-boson coupling invalidates this approach, and spin-boson entanglement degrades the fidelity of quantum simulation of spin models. We present a general numerical method for treating the out-of-equilibrium dynamics of spin-boson systems based on matrix product states. While most efficient for weak coupling or small numbers of boson modes, our method applies for any spatial and operator dependence of the spin-boson coupling. In addition, our approach allows straightforward computation of many quantities of interest, such as the full counting statistics of collective spin measurements and quantum simulation infidelity due to spin-boson entanglement. We apply our method to ongoing trapped ion quantum simulator experiments in analytically intractable regimes. This work is supported by JILA-NSF-PFC-1125844, NSF-PIF-1211914, ARO, AFOSR, AFOSR-MURI, and the NRC.
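
    As a brute-force point of comparison for the matrix-product-state method described above, the smallest spin-boson problem (one spin coupled to one truncated boson mode, the quantum Rabi model) can be solved by exact diagonalization. The sketch below is illustrative only; all parameters and the Fock-space truncation are arbitrary choices, not values from the paper.

```python
import numpy as np

# Smallest spin-boson problem: one spin coupled to one truncated boson mode
# (quantum Rabi model), solved by exact diagonalization. All parameters are
# illustrative; the truncation nb bounds the boson Hilbert space.
nb = 20
a = np.diag(np.sqrt(np.arange(1, nb)), 1)         # boson annihilation operator
sz = np.diag([1.0, -1.0])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
I2, Ib = np.eye(2), np.eye(nb)

w, delta, g = 1.0, 0.8, 0.2                       # mode, splitting, coupling
H = (w * np.kron(I2, a.T @ a)
     + 0.5 * delta * np.kron(sz, Ib)
     + g * np.kron(sx, a + a.T))

# Evolve |spin up, boson vacuum> via the eigenbasis of H
E, V = np.linalg.eigh(H)
psi0 = np.zeros(2 * nb); psi0[0] = 1.0
psit = V @ (np.exp(-1j * E * 5.0) * (V.conj().T @ psi0))
norm = abs(np.vdot(psit, psit))
print(norm)                                       # unitary evolution preserves the norm
```

    Such dense diagonalization scales exponentially with the number of modes, which is exactly the regime where the paper's matrix product state approach becomes necessary.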

  7. Progress in landslide susceptibility mapping over Europe using Tier-based approaches

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Hervás, Javier; Reichenbach, Paola; Malet, Jean-Philippe

    2010-05-01

    The European Thematic Strategy for Soil Protection aims, among other objectives, to ensure a sustainable use of soil. The legal instrument of the strategy, the proposed Framework Directive, suggests identifying priority areas of several soil threats including landslides using a coherent and compatible approach based on the use of common thematic data. In a first stage, this can be achieved through landslide susceptibility mapping using geographically nested, multi-step tiered approaches, where areas identified as of high susceptibility by a first, synoptic-scale Tier ("Tier 1") can then be further assessed and mapped at larger scale by successive Tiers. In order to identify areas prone to landslides at European scale ("Tier 1"), a number of thematic terrain and environmental data sets already available for the whole of Europe can be used as input for a continental scale susceptibility model. However, since no coherent landslide inventory data are available at the moment over the whole continent, qualitative heuristic zonation approaches are proposed. For "Tier 1" a preliminary, simplified model has been developed. It consists of an equally weighted combination of a reduced, continent-wide common dataset of landslide conditioning factors including soil parent material, slope angle and land cover, to derive a landslide susceptibility index using raster mapping units consisting of 1 x 1 km pixels. A preliminary European-wide susceptibility map has thus been produced at 1:1 Million scale, since this is compatible with that of the datasets used. The map has been validated by means of a ratio of effectiveness using samples from landslide inventories in Italy, Austria, Hungary and the United Kingdom. Although not differentiated for specific geomorphological environments or specific landslide types, the experimental model reveals a relatively good performance in many European regions at a 1:1 Million scale.
An additional "Tier 1" susceptibility map at the same scale and using
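
    The equally weighted combination step described above can be sketched as a cell-by-cell overlay of classified factor rasters. The class values, rescaling to [0, 1], and equal weights below are illustrative assumptions, not the coding actually used for the European map.

```python
import numpy as np

# Sketch of a "Tier 1"-style heuristic susceptibility index: three classified
# factor rasters (soil parent material, slope angle, land cover), each rescaled
# to [0, 1] and combined with equal weights. Class values are illustrative.
parent_material = np.array([[1, 3], [2, 3]], dtype=float)  # classes 1..3
slope_class     = np.array([[1, 4], [2, 4]], dtype=float)  # classes 1..4
land_cover      = np.array([[2, 2], [1, 2]], dtype=float)  # classes 1..2

def rescale(r):
    return (r - r.min()) / (r.max() - r.min())

susceptibility = (rescale(parent_material)
                  + rescale(slope_class)
                  + rescale(land_cover)) / 3.0
print(susceptibility)   # per-pixel index in [0, 1]
```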

  8. Quantum contextuality in N-boson systems

    SciTech Connect

    Benatti, Fabio; Floreanini, Roberto; Genovese, Marco; Olivares, Stefano

    2011-09-15

    Quantum contextuality in systems of identical bosonic particles is explicitly exhibited via the maximum violation of a suitable inequality of Clauser-Horne-Shimony-Holt type. Unlike the approaches considered so far, which make use of single-particle observables, our analysis involves collective observables constructed using multiboson operators. An exemplifying scheme to test this violation with a quantum optical setup is also discussed.
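
    For contrast with the collective multiboson observables used in the paper, the familiar single-particle CHSH test on a two-qubit singlet can be evaluated in a few lines. The sketch below reproduces the Tsirelson bound 2√2 at the standard optimal measurement angles; it is not the paper's multiboson scheme.

```python
import numpy as np

# Standard two-qubit CHSH value at the optimal angles. For the singlet state
# the correlation is E(a, b) = -cos(a - b), and |S| reaches 2*sqrt(2).
def obs(theta):
    # Spin measurement in the x-z plane at angle theta
    return np.array([[np.cos(theta), np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)   # singlet (|01>-|10>)/sqrt(2)

def corr(a, b):
    return psi @ np.kron(obs(a), obs(b)) @ psi

A0, A1 = 0.0, np.pi / 2
B0, B1 = np.pi / 4, -np.pi / 4
S = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
print(abs(S))   # approaches the Tsirelson bound 2*sqrt(2)
```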

  9. Higgs boson photoproduction at the LHC

    SciTech Connect

    Ducati, M. B. Gay; Silveira, G. G.

    2011-07-15

    We present the current development of the photoproduction approach for the Higgs boson with its application to pp and pA collisions at the LHC. We perform a different analysis for the Gap Survival Probability, where we consider a probability of 3% and also a more optimistic value of 10% based on the HERA data for dijet production. As a result, the cross section for the exclusive Higgs boson production is about 2 fb and 6 fb in pp collisions and 617 and 2056 fb for pPb collisions, considering the gap survival factor of 3% and 10%, respectively.

  10. Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches

    NASA Technical Reports Server (NTRS)

    Skinner, J. A., Jr.; Herkenhoff, K.

    2010-01-01

    The 1:500K-scale geologic map of MTM 85200 - the Olympia Cavi region of Mars - has been submitted for peer review [1]. Physiographically, the quadrangle includes portions of Olympia Rupes, a set of sinuous scarps which elevate Planum Boreum 800 meters above Olympia Planum. The region includes the high-standing, spiral troughs of Boreales Scopuli, the rugged and deep depressions of Olympia Cavi, and the vast dune fields of Olympia Undae. Geologically, the mapped units and landforms reflect the recent history of repeated accumulation and degradation. The widespread occurrence of both weakly and strongly stratified units implicates the drape-like accumulation of ice, dust, and sand through climatic variations. Similarly, the occurrence of layer truncations, particularly at unit boundaries, implicates punctuated periods of both localized and regional erosion and surface deflation whereby underlying units were exhumed and their material transported and re-deposited. Herein, we focus on the iterative mapping approaches that allowed not only the accommodation of the burgeoning variety and volume of data sets, but also facilitated the efficient presentation of map information. Unit characteristics and their geologic history are detailed in past abstracts [2-3].

  11. A branch-and-cut approach to physical mapping with end-probes

    SciTech Connect

    Christof, T.; Reinelt, G.; Juenger, M.

    1997-12-01

    A fundamental problem in computational biology is the construction of physical maps of chromosomes from hybridization experiments between unique probes and clones of chromosome fragments in the presence of error. Alizadeh, Karp, Weisser and Zweig [AKWZ94] first considered a maximum-likelihood model of the problem that is equivalent to finding an ordering of the probes that minimizes a weighted sum of errors, and developed several effective heuristics. We show that by exploiting information about the end-probes of clones, this model can be formulated as a weighted Betweenness Problem. This affords the significant advantage of allowing the well-developed tools of integer linear-programming and branch-and-cut algorithms to be brought to bear on physical mapping, enabling us for the first time to solve small mapping instances to optimality even in the presence of high error. We also show that by combining the optimal solution of many small overlapping Betweenness Problems, one can effectively screen errors from larger instances, and solve the edited instance to optimality as a Hamming-Distance Traveling Salesman Problem. This suggests a new combined approach to physical map construction. 18 refs., 13 figs.
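
    For intuition about the weighted Betweenness Problem itself (though not the branch-and-cut algorithm, which is what makes large instances tractable), a tiny instance can be solved by brute force: find the probe ordering that minimizes the weighted sum of violated betweenness constraints. The instance below is invented for illustration.

```python
from itertools import permutations

# Brute-force solver for a tiny weighted Betweenness Problem. Each constraint
# (a, b, c, w) says probe a should lie between probes b and c, at penalty w
# if violated. Exhaustive search is only feasible for very small n.
def best_ordering(n, constraints):
    best, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        pos = {p: i for i, p in enumerate(perm)}
        cost = sum(w for a, b, c, w in constraints
                   if not (min(pos[b], pos[c]) < pos[a] < max(pos[b], pos[c])))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best, best_cost

order, cost = best_ordering(4, [(1, 0, 2, 1.0), (2, 1, 3, 1.0)])
print(order, cost)   # a consistent ordering exists, so the cost is 0.0
```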

  12. Oxidative status interactome map: towards novel approaches in experiment planning, data analysis, diagnostics and therapy.

    PubMed

    Zolotukhin, Peter; Kozlova, Yulia; Dovzhik, Anastasiya; Kovalenko, Konstantin; Kutsyn, Kseniya; Aleksandrova, Anzhela; Shkurat, Tatyana

    2013-08-01

    Experimental evidence suggests an immense variety of processes associated with and aimed at producing reactive oxygen and/or nitrogen species. Clinical studies implicate an enormous range of pathologies associated with reactive oxygen/nitrogen species metabolism deregulation, particularly oxidative stress. Recent advances in biochemistry, proteomics and molecular biology/biophysics of cells suggest oxidative stress to be an endpoint of complex dysregulation events of conjugated pathways consolidated under the term, proposed here, "oxidative status". The oxidative status concept, in order to allow for novel diagnostic and therapeutic approaches, requires elaboration of a new logic system comprehending all the features, versatility and complexity of cellular pro- and antioxidative components of different nature. We have developed a curated and regularly updated interactive interactome map of human cellular-level oxidative status allowing for systematization of the related most up-to-date experimental data. A total of more than 600 papers were selected for the initial creation of the map. The map comprises more than 300 individual factors with respective interactions, all subdivided hierarchically for logical analysis purposes. The pilot application of the interactome map suggested several points for further development of oxidative status-based technologies. PMID:23698602

  13. A reciprocal space approach for locating symmetry elements in Patterson superposition maps

    SciTech Connect

    Hendrixson, T.

    1990-09-21

    A method for determining the location and possible existence of symmetry elements in Patterson superposition maps has been developed. A comparison of the original superposition map and a superposition map operated on by the symmetry element gives possible translations to the location of the symmetry element. A reciprocal space approach using structure factor-like quantities obtained from the Fourier transform of the superposition function is then used to determine the "best" location of the symmetry element. Constraints based upon the space group requirements are also used as a check on the locations. The locations of the symmetry elements are used to modify the Fourier transform coefficients of the superposition function to give an approximation of the structure factors, which are then refined using the EG relation. The analysis of several compounds using this method is presented. Reciprocal space techniques for locating multiple images in the superposition function are also presented, along with methods to remove the effect of multiple images in the Fourier transform coefficients of the superposition map. In addition, crystallographic studies of the extended chain structure of (NHC{sub 5}H{sub 5})SbI{sub 4} and of the twinning method of the orthorhombic form of the high-T{sub c} superconductor YBa{sub 2}Cu{sub 3}O{sub 7-x} are presented. 54 refs.

  14. An optimization approach for mapping and measuring the divergence and correspondence between paths.

    PubMed

    Mueller, Shane T; Perelman, Brandon S; Veinott, Elizabeth S

    2016-03-01

    Many domains of empirical research produce or analyze spatial paths as a measure of behavior. Previously, approaches for measuring the similarity or deviation between two paths have either required timing information or have used ad hoc or manual coding schemes. In this paper, we describe an optimization approach for robustly measuring the area-based deviation between two paths, which we call ALCAMP (Algorithm for finding the Least-Cost Areal Mapping between Paths). ALCAMP measures the deviation between two paths and produces a mapping between corresponding points on the two paths. The method is robust to a number of aspects in real path data, such as crossovers, self-intersections, differences in path segmentation, and partial or incomplete paths. Unlike similar algorithms that produce distance metrics between trajectories (i.e., paths that include timing information), this algorithm uses only the order of observed path segments to determine the mapping. We describe the algorithm and show its results on a number of sample problems and data sets, and demonstrate its effectiveness for assessing human memory for paths. We also describe available software code written in the R statistical computing language that implements the algorithm to enable data analysis. PMID:25737420
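
    The order-based mapping idea can be illustrated with a generic dynamic-programming alignment between two point sequences. Note this sketch accumulates point-to-point distances, whereas ALCAMP itself minimizes the mapped *area* between the paths; it is a simplified stand-in, not the published algorithm.

```python
import numpy as np

# Generic order-preserving alignment of two paths by dynamic programming.
# Each point of one path is matched to a point of the other, in order, and
# the summed distance of matched pairs is minimized.
def dtw_path_deviation(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    n, m = len(p), len(q)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(p[i - 1] - q[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

p = [(0, 0), (1, 0), (2, 0)]
q = [(0, 1), (1, 1), (2, 1)]
print(dtw_path_deviation(p, q))   # 3.0: three matched points, each offset by 1
```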

  15. Mapping Variable Width Riparian Zones Utilizing Open Source Data: A Robust New Approach

    NASA Astrophysics Data System (ADS)

    Abood, S. A.; Maclean, A.

    2013-12-01

    Riparian buffers are dynamic, transitional ecosystems between aquatic and terrestrial ecosystems with well-defined vegetation and soil characteristics. Previous approaches to riparian buffer delineation have primarily utilized fixed width buffers. However, these methodologies only take the watercourse into consideration and ignore critical geomorphology, associated vegetation and soil characteristics. Utilizing spatial data readily available from government agencies and geospatial clearinghouses, such as DEMs and the National Hydrography Dataset, the Riparian Buffer Delineation Model (RBDM) offers advantages by harnessing the geospatial modeling capabilities of ArcMap GIS, incorporating a statistically valid sampling technique along the watercourse to accurately map the critical 50-year floodplain, and delineating a variable width riparian buffer. Options within the model allow incorporation of National Wetlands Inventory (NWI), Soil Survey Data (SSURGO), National Land Cover Data (NLCD) and/or Cropland Data Layer (CDL) to improve the accuracy and utility of the riparian buffer attributes. This approach recognizes the dynamic and transitional nature of riparian buffers by accounting for hydrologic, geomorphic and vegetation data as inputs into the delineation process. By allowing the incorporation of land cover data, decision makers acquire a useful tool to assist in managing riparian buffers. The model is formatted as an ArcMap toolbox for easy installation, and requires a Spatial Analyst license. [Figure: variable width riparian buffer utilizing the 50-year flood height and a 10 m DEM; RBDM inputs.]

  16. A new multicriteria risk mapping approach based on a multiattribute frontier concept.

    PubMed

    Yemshanov, Denys; Koch, Frank H; Ben-Haim, Yakov; Downing, Marla; Sapio, Frank; Siltanen, Marty

    2013-09-01

    Invasive species risk maps provide broad guidance on where to allocate resources for pest monitoring and regulation, but they often present individual risk components (such as climatic suitability, host abundance, or introduction potential) as independent entities. These independent risk components are integrated using various multicriteria analysis techniques that typically require prior knowledge of the risk components' importance. Such information is often nonexistent for many invasive pests. This study proposes a new approach for building integrated risk maps using the principle of a multiattribute efficient frontier and analyzing the partial order of elements of a risk map as distributed in multidimensional criteria space. The integrated risks are estimated as successive multiattribute frontiers in the dimensions of the individual risk criteria. We demonstrate the approach with the example of Agrilus biguttatus Fabricius, a high-risk pest that may threaten North American oak forests in the near future. Drawing on U.S. and Canadian data, we compare the performance of the multiattribute ranking against a multicriteria linear weighted averaging technique in the presence of uncertainties, using the concept of robustness from info-gap decision theory. The results show major geographic hotspots where the consideration of tradeoffs between multiple risk components changes the integrated risk rankings. Both methods delineate similar geographical regions of high and low risks. Overall, aggregation based on a delineation of multiattribute efficient frontiers can be a useful tool to prioritize risks for anticipated invasive pests, which usually have an extremely poor prior knowledge base. PMID:23339716
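
    The successive-frontier idea can be sketched as repeated peeling of non-dominated points: frontier 1 holds the cells dominated by no other cell, frontier 2 the cells dominated only by frontier 1, and so on. The orientation (higher criterion values mean higher risk) and the toy points are assumptions for illustration.

```python
import numpy as np

# Rank points by successive Pareto (multiattribute efficient) frontiers.
# A point is dominated if some other point is >= in every criterion and
# strictly > in at least one.
def pareto_rank(points):
    pts = np.asarray(points, dtype=float)
    rank = np.zeros(len(pts), dtype=int)
    level, remaining = 1, np.arange(len(pts))
    while remaining.size:
        sub = pts[remaining]
        nondom = []
        for i in range(len(sub)):
            dominated = np.any(np.all(sub >= sub[i], axis=1)
                               & np.any(sub > sub[i], axis=1))
            if not dominated:
                nondom.append(i)
        idx = remaining[nondom]
        rank[idx] = level
        remaining = np.setdiff1d(remaining, idx)
        level += 1
    return rank

ranks = pareto_rank([[3, 3], [2, 1], [1, 2], [1, 1]])
print(ranks)   # [1 2 2 3]: (3,3) dominates all; (2,1) and (1,2) tie; (1,1) last
```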

  17. Improved effective vector boson approximation revisited

    NASA Astrophysics Data System (ADS)

    Bernreuther, Werner; Chen, Long

    2016-03-01

    We reexamine the improved effective vector boson approximation, which is based on two-vector-boson luminosities L_pol, for the computation of weak gauge-boson hard scattering subprocesses V1 V2 → W in high-energy hadron-hadron or e⁻e⁺ collisions. We calculate these luminosities for the nine combinations of the transverse and longitudinal polarizations of V1 and V2 in the unitary and axial gauge. For these two gauge choices the quality of this approach is investigated for the reactions e⁻e⁺ → W⁻W⁺ ν_e ν̄_e and e⁻e⁺ → t t̄ ν_e ν̄_e using appropriate phase-space cuts.

  18. Approximate gauge symmetry of composite vector bosons

    NASA Astrophysics Data System (ADS)

    Suzuki, Mahiko

    2010-08-01

    It can be shown in a solvable field theory model that the couplings of the composite vector bosons made of a fermion pair approach the gauge couplings in the limit of strong binding. Although this phenomenon may appear accidental and special to the vector bosons made of a fermion pair, we extend it to the case in which the constituents themselves are bosons and find that the same phenomenon occurs in a more intriguing way. The functional formalism not only facilitates computation but also provides us with a better insight into the generating mechanism of approximate gauge symmetry, in particular, how the strong binding and global current conservation conspire to generate such an approximate symmetry. Remarks are made on its possible relevance or irrelevance to electroweak and higher symmetries.

  19. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, though not when, or within what time interval, the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s).
The second edition map was completed in 1985 incorporating more faults, improving MCE's estimation method, and using new ground motion attenuation relationships from the latest published

  20. Scaling of noise correlations in one-dimensional lattice hard-core boson systems

    NASA Astrophysics Data System (ADS)

    He, Kai; Rigol, Marcos

    2011-03-01

    Noise correlations are studied for systems of hard-core bosons in one-dimensional lattices. We use an exact numerical approach based on the Bose-Fermi mapping and properties of Slater determinants. We focus on the scaling of the noise correlations with system size in superfluid and insulating phases, which are generated in the homogeneous lattice, with period-two superlattices, and with uniformly distributed random diagonal disorder. For the superfluid phases, the leading contribution is shown to exhibit a density independent scaling proportional to the system size, while the first subleading term exhibits a density dependent power-law exponent.
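
    The Bose-Fermi mapping underlying this exact approach can be checked at the smallest scale: the ground-state energy of hard-core bosons on a 1D open chain equals the sum of the N lowest single-particle energies of free spinless fermions. The sketch below assumes nearest-neighbor hopping t and open boundaries; it illustrates the mapping, not the paper's noise-correlation calculation.

```python
import numpy as np

# Via the Bose-Fermi (Jordan-Wigner) mapping, hard-core bosons on a 1D open
# chain share their energy spectrum with free spinless fermions: fill the N
# lowest single-particle levels of the hopping matrix.
def ground_energy(L, N, t=1.0):
    H = np.zeros((L, L))
    for i in range(L - 1):
        H[i, i + 1] = H[i + 1, i] = -t
    return np.sort(np.linalg.eigvalsh(H))[:N].sum()

L, N, t = 10, 4, 1.0
E = ground_energy(L, N, t)
# Open-chain single-particle energies are -2*t*cos(k*pi/(L+1)), k = 1..L
analytic = sum(sorted(-2 * t * np.cos(k * np.pi / (L + 1))
                      for k in range(1, L + 1))[:N])
print(E, analytic)   # the two agree
```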

  2. A direct approach to generalised multiple mapping conditioning for selected turbulent diffusion flame cases

    NASA Astrophysics Data System (ADS)

    Sundaram, Brruntha; Klimenko, Alexander Yuri; Cleary, Matthew John; Ge, Yipeng

    2016-07-01

    This work presents a direct and transparent interpretation of two concepts for modelling turbulent combustion: generalised Multiple Mapping Conditioning (MMC) and sparse-Lagrangian Large Eddy Simulation (LES). The MMC approach is presented as a hybrid between the Probability Density Function (PDF) method and approaches based on conditioning (e.g. Conditional Moment Closure, flamelet, etc.). The sparse-Lagrangian approach, which allows for a dramatic reduction of computational cost, is viewed as an alternative interpretation of the Filtered Density Function (FDF) methods. This work presents simulations of several turbulent diffusion flame cases and discusses the universality of the localness parameter between these cases and the universality of sparse-Lagrangian FDF methods with MMC.

  3. A sib-pair approach to interval mapping of quantitative trait loci

    SciTech Connect

    Fulker, K.W.; Cardon, L.R.

    1994-06-01

    An interval mapping procedure based on the sib-pair method of Haseman and Elston is developed, and simulation studies are carried out to explore its properties. The procedure is analogous to other interval mapping procedures used with experimental material, such as plants and animals, and yields very similar results in terms of the location and effect size of a quantitative trait locus (QTL). The procedure offers an advantage over the conventional Haseman and Elston approach in terms of power, and provides useful information concerning the location of a QTL. Because of its simplicity, the method readily lends itself to the analysis of selected samples for increased power and the evaluation of multilocus models of complex phenotypes. 26 refs., 4 figs., 5 tabs.
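
    The core of the underlying Haseman-Elston method can be sketched in a few lines: regress the squared sib-pair trait difference on the proportion of alleles shared identical-by-descent (IBD) at a marker; a significantly negative slope suggests a linked QTL. The numbers below are invented for illustration, not data from the study.

```python
import numpy as np

# Haseman-Elston regression sketch: squared trait difference vs. IBD sharing.
# Sib pairs sharing more alleles IBD at a linked marker should have more
# similar trait values, so the slope should be negative. Illustrative data.
ibd     = np.array([0.0, 0.0, 0.5, 0.5, 1.0, 1.0])   # IBD proportion per pair
sq_diff = np.array([4.0, 3.8, 2.5, 2.6, 1.0, 1.2])   # squared trait difference

slope, intercept = np.polyfit(ibd, sq_diff, 1)
print(slope)   # negative: trait differences shrink as IBD sharing rises
```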

  4. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process in flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D hydraulic-hydrodynamic models used were HECRAS, MIKE11, LISFLOOD and XPSTORM. The 2D hydraulic-hydrodynamic models used were MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were HECRAS (1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform) and XPSTORM (1D/2D). The validation of the simulated flood extent was achieved with the use of 2x2 contingency tables between simulated and observed flooded areas for an extreme historical flash flood event. The skill score Critical Success Index was used in the validation process. The modelling approaches were also evaluated for simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of applying sensitivity analysis with different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
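
    The Critical Success Index used in the validation step is computed from the 2x2 contingency table as hits / (hits + false alarms + misses). The pixel counts below are illustrative, not the study's data.

```python
# Critical Success Index (threat score) from a 2x2 contingency table between
# simulated and observed flood extent. "Hits" are pixels flooded in both,
# "false alarms" only in the simulation, "misses" only in the observation.
def csi(hits, false_alarms, misses):
    return hits / (hits + false_alarms + misses)

# Illustrative counts of flooded pixels:
print(csi(hits=80, false_alarms=15, misses=5))  # 0.8
```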

  5. A GIS based method for soil mapping in Sardinia, Italy: a geomatic approach.

    PubMed

    Vacca, A; Loddo, S; Melis, M T; Funedda, A; Puddu, R; Verona, M; Fanni, S; Fantola, F; Madrau, S; Marrone, V A; Serra, G; Tore, C; Manca, D; Pasci, S; Puddu, M R; Schirru, P

    2014-06-01

    A new project was recently initiated for the realization of the "Land Unit and Soil Capability Map of Sardinia" at a scale of 1:50,000 to support land use planning. In this study, we outline the general structure of the project and the methods used in the activities conducted thus far. A GIS approach was used. We used the soil-landscape paradigm for the prediction of soil classes and their spatial distribution, or the prediction of soil properties, based on landscape features. The work is divided into two main phases. In the first phase, the available digital data on land cover, geology and topography were processed and classified according to their influence on weathering processes and soil properties. The methods used in the interpretation are based on consolidated and generalized knowledge about the influence of geology, topography and land cover on soil properties. The existing soil data (areal and point data) were collected, reviewed, validated and standardized according to international and national guidelines. Point data considered to be usable were entered into a database created specifically for the project. Using expert interpretation, all digital data were merged to produce a first draft of the Land Unit Map. During the second phase, this map will be refined with the existing soil data and verified in the field, with new soil data collected where needed, and the final Land Unit Map will be produced. The Land Unit and Soil Capability Map will be produced by classifying the land units using a reference matching table of land capability classes created for this project. PMID:24315681
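
    The merge of classified layers into land units can be sketched as a cell-by-cell overlay of categorical rasters. The coding scheme below (geology*100 + slope*10 + cover) is an illustrative assumption, not the project's actual legend.

```python
import numpy as np

# Cell-by-cell overlay of three classified rasters into composite land-unit
# codes. Each cell's code concatenates its geology, slope, and cover classes.
geology = np.array([[1, 2], [1, 3]])
slope   = np.array([[1, 1], [2, 2]])
cover   = np.array([[4, 4], [5, 4]])

land_unit = geology * 100 + slope * 10 + cover
print(land_unit)   # e.g. 114 = geology 1, slope 1, cover 4
```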

  6. A Probabilistic Approach to Receptive Field Mapping in the Frontal Eye Fields

    PubMed Central

    Mayo, J. Patrick; Morrison, Robert M.; Smith, Matthew A.

    2016-01-01

    Studies of the neuronal mechanisms of perisaccadic vision often lack the resolution needed to determine important changes in receptive field (RF) structure. Such limited analytical power can lead to inaccurate descriptions of visuomotor processing. To address this issue, we developed a precise, probabilistic technique that uses a generalized linear model (GLM) for mapping the visual RFs of frontal eye field (FEF) neurons during stable fixation (Mayo et al., 2015). We previously found that full-field RF maps could be obtained using 1–8 dot stimuli presented at frame rates of 10–150 ms. FEF responses were generally robust to changes in the number of stimuli presented or the rate of presentation, which allowed us to visualize RFs over a range of spatial and temporal resolutions. Here, we compare the quality of RFs obtained over different stimulus and GLM parameters to facilitate future work on the detailed mapping of FEF RFs. We first evaluate the interactions between the number of stimuli presented per trial, the total number of trials, and the quality of RF mapping. Next, we vary the spatial resolution of our approach to illustrate the tradeoff between visualizing RF sub-structure and sampling at high resolutions. We then evaluate local smoothing as a possible correction for situations where under-sampling occurs. Finally, we provide a preliminary demonstration of the usefulness of a probabilistic approach for visualizing full-field perisaccadic RF shifts. Our results present a powerful, and perhaps necessary, framework for studying perisaccadic vision that is applicable to FEF and possibly other visuomotor regions of the brain. PMID:27047352
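
    The idea of mapping an RF from sparse random dot stimuli can be illustrated with a linear reverse-correlation fit, a deliberate simplification of the Poisson GLM used in the paper. The 5x5 grid, firing rates, and "true" RF location below are assumptions for the sketch.

```python
import numpy as np

# Linear reverse-correlation sketch of RF mapping from random dot stimuli on
# a 5x5 grid. Spikes are drawn from a Poisson rate driven by one pixel; an
# ordinary least-squares fit recovers which pixel drives the cell.
rng = np.random.default_rng(1)
true_rf = np.zeros(25); true_rf[12] = 2.0               # RF at center pixel 12
X = rng.integers(0, 2, size=(5000, 25)).astype(float)   # dot on/off per frame
rate = np.exp(-1.0 + X @ true_rf)                       # Poisson firing rate
y = rng.poisson(rate)                                   # observed spike counts

rf_est, *_ = np.linalg.lstsq(X, y, rcond=None)
print(int(np.argmax(rf_est)))                           # recovers pixel 12
```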

  7. Bosonization of Weyl Fermions

    NASA Astrophysics Data System (ADS)

    Marino, Eduardo

    The electron, discovered by Thomson at the end of the nineteenth century, was the first experimentally observed particle. The Weyl fermion, though theoretically predicted long ago, was observed in a condensed matter environment only in an experiment reported a few weeks ago. Is there any linking thread connecting the first and the last observed fermion (quasi)particles? The answer is positive. By generalizing the method known as bosonization, for the first time in its complete form, to a spacetime with 3+1 dimensions, we are able to show that both electrons and Weyl fermions can be expressed in terms of the same boson field, namely the Kalb-Ramond antisymmetric tensor gauge field. The bosonized form of the Weyl chiral currents leads to the angle-dependent magneto-conductance behavior observed in these systems.

  8. Coulomb problem for vector bosons

    SciTech Connect

    Kuchiev, M.Yu.; Flambaum, V.V.

    2006-05-01

    The Coulomb problem for vector bosons W{sup {+-}} incorporates a well-known difficulty: the charge of a boson localized in the close vicinity of an attractive Coulomb center proves to be infinite. The paradox is shown to be resolved by the QED vacuum polarization, which brings in a strong effective repulsion that eradicates the infinite charge of the boson on the Coulomb center. This property allows one to define the Coulomb problem for vector bosons properly.

  9. Functional Connectivity-Based Parcellation of Amygdala Using Self-Organized Mapping: A Data Driven Approach

    PubMed Central

    Mishra, Arabinda; Rogers, Baxter P.; Chen, Li Min; Gore, John C.

    2013-01-01

    The overall goal of this work is to demonstrate how resting state functional magnetic resonance imaging (fMRI) signals may be used to objectively parcellate functionally heterogeneous subregions of the human amygdala into structures characterized by similar patterns of functional connectivity. We hypothesize that similarity of functional connectivity of subregions with other parts of the brain can be a potential basis to segment and cluster voxels using data driven approaches. In this work, self-organizing map (SOM) was implemented to cluster the connectivity maps associated with each voxel of the human amygdala, thereby defining distinct subregions. The functional separation was optimized by evaluating the overall differences in functional connectivity between the subregions at group level. Analysis of 25 resting state fMRI data sets suggests that SOM can successfully identify functionally independent nuclei based on differences in their inter subregional functional connectivity, evaluated statistically at various confidence levels. Although amygdala contains several nuclei whose distinct roles are implicated in various functions, our objective approach discerns at least two functionally distinct volumes comparable to previous parcellation results obtained using probabilistic tractography and cytoarchitectonic analysis. Association of these nuclei with various known functions and a quantitative evaluation of their differences in overall functional connectivity with lateral orbital frontal cortex and temporal pole confirms the functional diversity of amygdala. The data driven approach adopted here may be used as a powerful indicator of structure–function relationships in the amygdala and other functionally heterogeneous structures as well. PMID:23418140
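
    The SOM clustering step can be sketched with a minimal one-dimensional map trained on synthetic stand-ins for voxel connectivity vectors. Map size, learning schedule, and data are all illustrative assumptions, not the study's configuration.

```python
import numpy as np

# Minimal 1D self-organizing map: each node's weight vector is pulled toward
# random samples, with a neighborhood function that shrinks over training.
def train_som(data, n_nodes=2, n_iter=300, lr0=0.5, sigma0=0.5, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_nodes, data.shape[1]))
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        winner = int(np.argmin(((W - x) ** 2).sum(axis=1)))  # best-matching unit
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        h = np.exp(-((np.arange(n_nodes) - winner) ** 2) / (2 * sigma ** 2))
        W += lr * h[:, None] * (x - W)
    return W

# Two well-separated synthetic "connectivity" clusters
rng = np.random.default_rng(1)
cluster_a = rng.normal(0.0, 0.5, size=(20, 5))
cluster_b = rng.normal(10.0, 0.5, size=(20, 5))
W = train_som(np.vstack([cluster_a, cluster_b]))

def bmu(x):
    return int(np.argmin(((W - x) ** 2).sum(axis=1)))

print(bmu(cluster_a[0]), bmu(cluster_b[0]))   # the two clusters map to different nodes
```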

  10. Toward a Materials Genome Approach for ionic liquids: synthesis guided by ab initio property maps.

    PubMed

    Yan, Fangyong; Lartey, Michael; Jariwala, Kuldeep; Bowser, Sage; Damodaran, Krishnan; Albenze, Erik; Luebke, David R; Nulwala, Hunaid B; Smit, Berend; Haranczyk, Maciej

    2014-11-26

    The Materials Genome Approach (MGA) aims to accelerate development of new materials by incorporating computational and data-driven approaches to reduce the cost of identification of optimal structures for a given application. Here, we use the MGA to guide the synthesis of triazolium-based ionic liquids (ILs). Our approach involves an IL property-mapping tool, which merges combinatorial structure enumeration, descriptor-based structure representation and sampling, and property prediction using molecular simulations. The simulated properties such as density, diffusivity, and gas solubility obtained for a selected set of representative ILs were used to build neural network models and map properties for all enumerated species. Herein, a family of ILs based on ca. 200,000 triazolium-based cations paired with the bis(trifluoromethanesulfonyl)amide anion was investigated using our MGA. Fourteen representative ILs spanning the entire range of predicted properties were subsequently synthesized and then characterized, confirming the predicted density, diffusivity, and CO2 Henry's Law coefficient. Moreover, the property (CO2, CH4, and N2 solubility) trends associated with exchange of the bis(trifluoromethanesulfonyl)amide anion with one of 32 other anions were explored and quantified. PMID:25356930
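
    The neural-network property-mapping step can be sketched with a tiny fully connected network fitted by gradient descent to a stand-in one-dimensional descriptor-property relation. The architecture, data, and training schedule are illustrative, not the study's models.

```python
import numpy as np

# Tiny fully connected network (1-16-1, tanh hidden layer) trained by
# full-batch gradient descent on a stand-in descriptor -> property curve.
rng = np.random.default_rng(0)
x_train = np.linspace(-1, 1, 30).reshape(-1, 1)
y_train = np.sin(np.pi * x_train)          # stand-in "simulated property"

W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, pred = forward(x_train)
mse_init = float(((pred - y_train) ** 2).mean())

lr = 0.1
for _ in range(5000):
    h, pred = forward(x_train)
    err = pred - y_train
    gW2 = h.T @ err / len(x_train); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)             # backprop through tanh
    gW1 = x_train.T @ dh / len(x_train); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(x_train)
mse = float(((pred - y_train) ** 2).mean())
print(mse_init, mse)   # training reduces the fitting error
```

    Once fitted to the simulated "representative" structures, such a model is cheap to evaluate over all enumerated candidates, which is the role it plays in the property-mapping workflow.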